This entry is about photography and filters and camera sensors.
Most of you know that digital camera sensors can’t capture all the color and light a human eye can. It’s obvious when you take the picture, download it at home (or get it printed at a kiosk), and look at it. Plainly put, it sucks. It’s flat! Look at this one, taken last weekend when I was out and about.
This has none of the beauty I saw when I was looking at the scene, even after I adjusted levels and curves. HDR, or “high dynamic range,” is one way photographers process pictures to capture a wider range of lights and darks. It’s very popular right now, to the point where some cameras stack the exposures as you take the photos. I do it when I can, but it’s not always a good time to drag out the tripod and get things set up.
So I sometimes use filters to mimic an HDR effect. Sometimes this works really well; other times it’s not very good. Shooting three different exposures and combining them in a program like Photoshop or Photomatix still gives the best results.
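For the curious, the basic idea behind combining bracketed exposures can be sketched in a few lines of code. This is just a toy illustration (not what Photoshop or Photomatix actually do internally): each pixel gets a weight that peaks near mid-gray, so the dark frame contributes usable highlights and the bright frame contributes usable shadows. The function name and the Gaussian weighting are my own simplification of Mertens-style exposure fusion.

```python
import numpy as np

def fuse_exposures(exposures, sigma=0.2):
    """Blend bracketed exposures by favoring well-exposed pixels.

    A simplified, Mertens-style sketch: a pixel's weight peaks where
    its value is near mid-gray (0.5), so underexposed frames supply
    the highlights and overexposed frames supply the shadows.
    """
    stack = np.stack(exposures)            # shape: (n_frames, ...)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0)         # normalize weights per pixel
    return (weights * stack).sum(axis=0)   # weighted per-pixel blend

# Three fake "brackets" of the same gradient scene (values in 0..1):
scene = np.linspace(0.0, 1.0, 5)
under = scene * 0.5                  # underexposed: darks crushed
over = np.clip(scene * 1.5, 0, 1)    # overexposed: highlights clipped
fused = fuse_exposures([under, scene, over])
```

Real fusion tools work on multi-scale pyramids and also weight for saturation and contrast, but the core trick is the same: blend, per pixel, toward whichever frame exposed that region best.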
After running that photo through Topaz Labs filters, I got this:
Which do you prefer? The filters brought out what little color was there and then I bumped it even more because I’m a color whore. I need to see a rich palette when I gaze at photos. Monochromes do little for me.
Note: I did do some masking between the two before calling it finished. I enjoy tweaking photos; you may not. ;/