It’s about the beauty in the volleyball players’ faces. While this is admittedly about cameras, even if you don’t care about sports or cameras, you owe it to yourself to go check out this slide show, which is an astounding piece of reportage, damn the medium. For the camera geeks, the slideshow is illustrative material from Rob Galbraith’s write-up on Canon’s all-out assault on the state of the digicam art with their EOS-1D Mark III. But that’s not the magic. The magic is from The Online Photographer, a highly recommended photogeek blog, in particular the “Featured Comment” from Matthew Miller down at the bottom of So You Thought You Had Good Buffer Depth, and it asks a question: Why does a digital camera need a lens, anyhow? A lens is an expensive, inflexible analog computer that locks you into one focal-length setting. If you capture the photons hitting the front of the lens, shouldn’t you be able to figure out the best focus later? Or has Mr. Miller got something wrong in the basic physics?
Comment feed for ongoing:
From: Paul Brown (Apr 24 2007, at 23:50)
You can't construct an in-focus image based on where photons hit the front of the lens. It would be equivalent to knowing the trajectory of a ball by knowing its position. You also need to know where that photon is headed.
[link]
From: bob (Apr 25 2007, at 00:09)
re refocusing after the event, one word: plenoptics
http://en.wikipedia.org/wiki/Plenoptic_Camera
[link]
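The plenoptic idea bob points to really does allow refocusing after the fact: once you've recorded where light arrives *and* which direction it came from, a virtual focal plane is just a per-view shift before averaging. Here's a minimal numpy sketch of that "shift-and-add" rendering, assuming a toy 4-D light field laid out as `L[u, v, y, x]` (one sub-aperture image per lens position `(u, v)`); the function name and layout are mine, not from any particular camera's software.

```python
import numpy as np

def refocus(lightfield, alpha):
    """Synthetic refocusing by shift-and-add.

    lightfield: 4-D array L[u, v, y, x] -- one sub-aperture image per
    (u, v) position on the main lens.  alpha picks the virtual focal
    plane: each sub-aperture image is shifted in proportion to its
    offset from the lens centre, then all views are averaged.  Objects
    whose parallax matches the shift add up sharply; everything else
    smears into defocus blur.
    """
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift grows with distance from the aperture centre.
            dy = int(round(alpha * (u - U // 2)))
            dx = int(round(alpha * (v - V // 2)))
            out += np.roll(lightfield[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)
```

With a point source that drifts one pixel per aperture step, `refocus(lf, 1.0)` stacks all nine copies back onto one pixel, while `refocus(lf, 0.0)` leaves them scattered; that difference is the whole trick.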
From: Neil Dunn (Apr 25 2007, at 00:28)
Speaking of photos; have you bought a new camera yet?
[link]
From: Mark (Apr 25 2007, at 01:52)
Somehow, that no-lens prediction doesn't ring right to me.
"We don't have that yet, but it's a very safe prediction to say that we will."
Why not? If this is theoretically possible, it should be possible to prototype right now, on a bank of supercomputers, with a building sized solar collector panel, even with crappy quality, at least. Light/optics isn't exactly cutting edge, quantum physics type stuff. What sort of brilliant discoveries remain?
And without a lens, wouldn't every ray/particle/wave of light from every point in the scene radiate out and hit every pixel in the sensor? Pinhole "lenses" would solve it, perhaps. But the f-stop falls to nothing. Perhaps supersensitive sensors will be developed? Or will "directional" sensors be developed, where every pixel stores the angle of impact of each photon along with the value/color/saturation. Of course, that's a lens, in a way.
[link]
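Mark's "the f-stop falls to nothing" is easy to put numbers on. A sketch, using Lord Rayleigh's rule of thumb for the sharpest pinhole diameter, d ≈ 1.9·√(f·λ) (constants between roughly 1.5 and 2 appear in the literature; the 50 mm lens and f/2.8 baseline are my illustrative choices):

```python
import math

# Rough numbers behind "the f-stop falls to nothing" for a pinhole.
focal_length = 0.050   # a 50 mm "normal" lens, in metres
wavelength = 550e-9    # green light

d = 1.9 * math.sqrt(focal_length * wavelength)   # sharpest pinhole diameter
f_number = focal_length / d
stops_slower = 2 * math.log2(f_number / 2.8)     # vs. an f/2.8 prime

print(f"pinhole diameter ~{d * 1000:.2f} mm, about f/{f_number:.0f}")
print(f"roughly {stops_slower:.0f} stops slower than f/2.8")
```

That works out to a pinhole around a third of a millimetre, in the neighbourhood of f/160, something like a dozen stops slower than a fast prime, which is why pinholes never threatened lenses for hand-held work.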
From: Sidharth Kuruvila (Apr 25 2007, at 01:52)
There is a use for a capture-everything camera. More for fixing the loss of fidelity at higher f-stops you mentioned in an earlier post.
This should be doable with the current setup: have the lens iterate through all the possible focus points. I imagine there is software to detect the bad, unfocused bits to throw away.
The other option would be to build a sensor that can tell both the intensity and the direction of the light.
[link]
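Sidharth's focus-sweep idea is essentially focus stacking, and the "detect the bad, unfocused bits" step can be as crude as a per-pixel sharpness measure. A minimal numpy sketch, assuming a list of grayscale frames from a focus sweep (real stacking tools do much more careful blending; this simply keeps, at each pixel, the frame where a discrete Laplacian responds most strongly):

```python
import numpy as np

def focus_stack(frames):
    """All-in-focus composite from a focus sweep.

    frames: list of 2-D grayscale arrays, one per focus setting.
    For each pixel, keep the value from the frame whose local
    Laplacian response (a cheap sharpness measure) is strongest.
    """
    stack = np.stack(frames)                      # (n, H, W)
    # Discrete Laplacian: 4*centre - 4 neighbours, via rolled copies.
    lap = np.abs(4 * stack
                 - np.roll(stack, 1, axis=1) - np.roll(stack, -1, axis=1)
                 - np.roll(stack, 1, axis=2) - np.roll(stack, -1, axis=2))
    best = np.argmax(lap, axis=0)                 # sharpest frame per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```

Winner-take-all selection like this produces halos around edges, which is exactly the "bad bits" problem he's gesturing at; production tools blend weights smoothly instead.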
From: Jacek (Apr 25 2007, at 03:03)
It's easy to capture the intensity of all light coming to a single pixel sensor. The lens makes sure that the light on that pixel sensor comes from one point in the photographed space (assuming sharp focus). Without a lens, each sensor pixel would have to distinguish between rays coming from different directions. Hard to do; dare I say, currently (and for quite some time yet) impossible in a hand-held megapixel device, unless the physicists (quantum mechanics physicists, likely) come up with something surprising. Note also how holography doesn't need lenses because it uses a thick film and interference, but it's generally restricted to one color. You (or Matthew Miller) want full-spectrum holography, basically. 8-)
[link]
From: Ken Hirsch (Apr 25 2007, at 06:26)
As subsequent comments pointed out, the sensors (in a normal camera) only record intensity information, not phase information. That information is just lost and can't be recovered by computation.
However, there are some interesting concepts being worked on:
http://www.sciencenews.org/articles/20070407/bob8.asp
[link]
From: John Cowan (Apr 25 2007, at 10:15)
The phase information (currently unavailable) is exactly what is needed to reconstruct directionality; they are two different ways of describing the same thing.
[link]
From: Jeremy Dunck (Apr 25 2007, at 11:42)
So, uh, I dunno anything about photons and post-hoc focusing, but I do know those were some awesome shots of passionate athletes.
[link]
From: Ryan Cousineau (Apr 25 2007, at 12:50)
Matt Trent gave a talk on some future directions for digital imaging at last year's BarCamp Vancouver. The talk (summarized here) discussed plenoptic cameras and several other ideas that are likely to cause us to rethink photography.
The key theme was that right now we are still in an era of simulating analog cameras, but the researchers have some pretty good ideas about what the future, "pure-digital" cameras will do.
Pentax already incorporates one simple, digital-clever feature on its K-series dSLRs, and I think you've mentioned it: ISO-priority metering settings. This trick is possible because with digital, you pick your ISO on a frame-by-frame basis, no pushing necessary.
One more thing: Matt's day job is working for a company that builds HDR monitors, which I suppose may constitute the future of looking at digital images.
[link]
From: Jason Watkins (Apr 25 2007, at 23:37)
Mr. Miller has it right, as do the skeptical comments above about needing directional information. It is possible to build prototypes.
I've suspected for a while that the future of cameras will be a large sensor with some sort of static micro-lens grid or a coded aperture. Nothing like a traditional lens outside of specialized telephoto needs.
Something more the form factor of a panel or book than what we traditionally associate with a camera. It'll produce a 4-D video stream at a fixed maximum field of view, from which we can extract 2-D images of narrower field of view and arbitrary focus. We will also be able to pull some depth information out of the camera, enabling re-lighting and other complex processing. It'll likely be built using technology similar to LCD panel manufacturing.
These are my predictions... we'll see if history makes me a fool, but I think the basic bet on plenoptic video cameras of some form is extremely likely.
Links of interest:
http://graphics.stanford.edu/papers/lfcamera/
http://www.paulcarlisle.net/old/codedaperture.html
[link]