Is anyone out there an optics guy? The goal I’ve wanted to reach for years is to take a vanilla camera and get HDR images out of it without alignment or merging problems. Most of that time has been spent twiddling my thumbs. I’ve said this before and I’ll say it again: I’ll happily trade half of my resolution for a high and a low exposure in the same frame. But after thinking about it, I think this might be possible on a single camera with a few mirrors and a beam splitter.
It’s well known that HDR is possible using a typical 3D stereo beam splitter setup with your interocular distance set to zero. Here’s an example of a DIY beam splitter setup:
The theory is pretty simple. Here is a quick diagram made in Windows Paint, which I have always used and always will use for my diagrams.
That blue object is a beam splitter. When light hits the beam splitter, half the light goes straight through to one camera and the rest of the light reflects like a mirror to the other camera. By sliding the top camera left or right you can increase or decrease the stereo disparity (also known as your interocular distance). You can use this same technique for HDR quite easily.
The purple addition is a Neutral Density filter. You can make one of the images 4–6 stops darker and then you have an HDR image. Of course, this workflow is a huge pain because you have to keep all your settings aligned, manage two cameras, etc.
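For the curious, the math behind that merge is simple: an N-stop ND filter cuts the light by a factor of 2^N, so a 4-stop filter gives you a frame that sees 16x less light, and its unclipped highlights can be scaled back up. Here’s a toy NumPy sketch of the idea (the linear sensor response and the 0.95 clip threshold are my own simplifying assumptions):

```python
import numpy as np

ND_STOPS = 4                    # assumed ND strength: 4 stops = 2**4 = 16x less light
SCALE = 2.0 ** ND_STOPS

# Toy linear "sensor" values; the normally exposed frame clips at 1.0.
scene = np.array([0.02, 0.3, 2.5, 12.0])   # true relative radiance
bright = np.clip(scene, 0, 1.0)            # normal exposure: highlights blow out
dark = np.clip(scene / SCALE, 0, 1.0)      # same scene seen through the ND filter

# Merge: trust the bright frame until it nears clipping, then switch to
# the dark frame scaled back up by the ND factor.
hdr = np.where(bright < 0.95, bright, dark * SCALE)
# hdr now recovers radiance well past the bright frame's clip point
```

Real merges weight the two frames more gracefully, but this is the core trick: the dark frame holds the highlight detail the bright frame threw away.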
But what got me thinking was side-by-side stereo solutions. In particular, I saw this article: www.instructables.com/id/Make-a-3-D-Stereoscope-Slide-and-Video-Shooter/. In it, he uses mirrors to split the camera into a left and right image.
The final result looks like this:
The cool thing about it is that you only need to have one camera. This is much more important than it sounds. You don’t have to worry about sync for video. You can also use zoom lenses (you can’t use most zoom lenses for stereo because zoom lenses have minor variance in the manufacturing process that makes alignment virtually impossible). And there is less hassle to deal with.
Here is a rough sketch of this setup if it wasn’t clear from the photo. The pink lines are mirrors.
So here’s my question: Could you combine these two?
What if we could take a beam splitter to get two copies of the same view, and then use mirrors to put those two copies side by side on our camera’s sensor? Just add a Neutral Density filter to one half and then you have an HDR image.
First off, given that the foundations of optics have been around for centuries, I would think that splitting to get two copies of the same image right next to each other on a plane has to have been done before. But since I’m not an optics guy, here’s my hypothetical approach:
Once again, the Beam Splitter is BLUE, the Neutral Density filter is PURPLE, and the mirrors are PINK. Light comes in and gets split. Then it takes two paths to the sensor, with one of those paths getting darker along the way. So, would this actually work?
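If the optics cooperated, the software side would be trivial: chop the frame down the middle, scale the darker half back up by the ND factor, and merge. A rough sketch, again assuming linear pixel values, a perfectly aligned 50/50 split, and a 4-stop ND filter (all my assumptions, not anything proven):

```python
import numpy as np

def merge_split_frame(frame, nd_stops=4, clip_thresh=0.95):
    """Merge a side-by-side frame: left half normal, right half ND-darkened.
    Assumes linear pixel values in [0, 1] and a perfectly aligned 50/50 split."""
    h, w = frame.shape
    bright = frame[:, : w // 2]
    dark = frame[:, w // 2 :]
    scale = 2.0 ** nd_stops
    # Use the bright half where it isn't clipped; otherwise use the
    # dark half scaled back up by the ND factor.
    return np.where(bright < clip_thresh, bright, dark * scale)

# Simulate what the sensor would see: the same scene on both halves,
# one at full exposure (highlights clip) and one 4 stops down.
scene = np.array([[0.1, 3.0], [0.5, 8.0]])
frame = np.hstack([np.clip(scene, 0, 1.0), np.clip(scene / 16.0, 0, 1.0)])
hdr = merge_split_frame(frame)   # same shape as one half, radiance restored
```

Half the sensor width in, full dynamic range out, which is exactly the trade I said I’d happily make.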
One issue you would have with a setup like this is constant alignment. You would need to make sure your camera stays dead center and untilted all the time; otherwise you would have problems with your HDR merge. You would also have to make sure your mirrors don’t wiggle around.
Then I was inspired by these stereo lenses.
These lenses have a dead-simple workflow: You just attach the lens and then you magically have a stereo camera. Everything is manufactured within a high-enough tolerance that you don’t have to re-align when you attach or detach the camera. But could you do this with an HDR lens? That would be really awesome.
But here’s the real, real question: Could you make a side-by-side HDR lens extender? Here’s an example of making a DIY Macro Lens Extender from a Pringles can.
If you could make a side-by-side HDR lens extender, that would be the most awesome invention in the history of HDR. If this were on the shelf for $500, I’d buy it in a heartbeat. You could take any SLR and any lens, even a wide angle, and get true HDR shots at the expense of half your sensor resolution. And if your two images aren’t perfectly aligned, you should be able to calibrate once, after which the merging process would be automatic every time. You would have to do some lens trickery to make sure you aren’t extending the focal length of the camera, but I think good optics people could figure that out.
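That calibrate-once idea isn’t far-fetched on the software side. If the misalignment between the two halves were a pure pixel shift (a big “if”: a real assembly could also introduce rotation and scale), you could estimate it once with phase correlation on a test chart and then apply the stored offset to every frame. A hypothetical NumPy sketch:

```python
import numpy as np

def calibrate_offset(ref, other):
    """Estimate the (rows, cols) shift that aligns `other` to `ref`, via
    phase correlation. Assumes the misalignment is a pure translation."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(other))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peaks to signed shifts.
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return int(dy), int(dx)

# One-time calibration on a test chart; afterwards just np.roll every frame.
rng = np.random.default_rng(0)
chart = rng.random((64, 64))
shifted = np.roll(chart, (3, -2), axis=(0, 1))   # simulate a 3px / -2px misalignment
offset = calibrate_offset(chart, shifted)
aligned = np.roll(shifted, offset, axis=(0, 1))  # matches `chart` again
```

For anything beyond pure translation you’d reach for a proper homography fit, but the point stands: calibrate once at the factory (or in your garage), bake the offset in, and the merge is automatic from then on.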
What do you guys think? Would that work? And if so, is anyone up for trying it? This whole idea is just a hypothetical of course. But if you’re in the camera optics accessories business and you want to get rich beyond your wildest dreams…