High Dynamic Range (HDR) photos are one of the major new features introduced in iOS 4.1.
At Apple’s September 1 special event, Steve Jobs explained that HDR photos are created from three photos taken in rapid succession: one normal, one under-exposed, and one over-exposed. The photos are then combined using some pretty sophisticated algorithms to create an enhanced composite photo.
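Apple hasn’t published how its merge actually works, but the general idea behind combining bracketed exposures can be sketched with a simple exposure-fusion approach: weight each pixel in each frame by how well exposed it is (how close it is to mid-gray), then blend. The function name, weighting scheme, and toy data below are all illustrative assumptions, not Apple’s implementation.

```python
import numpy as np

def fuse_exposures(under, normal, over):
    """Naive exposure fusion: weight each pixel by how close it is to
    mid-gray (0.5), so well-exposed pixels dominate the composite.
    A simplified stand-in for Apple's unpublished HDR merge."""
    stack = np.stack([under, normal, over]).astype(float)  # shape (3, H, W)
    # Gaussian-style "well-exposedness" weight, peaking at 0.5
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy 2x2 "photos" with pixel values in [0, 1]
under  = np.array([[0.1, 0.0], [0.2, 0.1]])
normal = np.array([[0.5, 0.3], [0.9, 0.6]])
over   = np.array([[0.9, 0.7], [1.0, 0.95]])
fused = fuse_exposures(under, normal, over)
```

Each output pixel is pulled toward whichever frame exposed that region best, which is roughly why HDR rescues blown-out skies and murky shadows in the same shot.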
Folks at Gizmodo tested the new feature in iOS 4.1 to find out how the HDR photos look. Here are some of their observations:
While we’ve only played around with the feature for a while, we noticed that it seems to favor—or even privilege—highlights. This makes some of the HDR photos seem washed out next to their non-HDR counterparts while making the colors look more natural. It also means that the iPhone 4’s tricky white balance troubles are practically an issue of the past.
Gizmodo also found that the HDR feature doesn’t work too well in low-light conditions:
It’s worth noting, however, that the HDR feature does seem to struggle a bit in environments with low-level lighting or situations where there isn’t a dynamic range to exploit. My bathroom appears to cause one of those situations because my poor rubber duckies confused the HDR feature and resulted in a funky, messed up image despite the iPhone being held perfectly still during the photo.
That little quirk aside, the HDR feature seems to lead to far more natural looking photos in general. But since it’s not always easy to predict when this will be the case, I recommend toggling the photo settings to save both the HDR version of a photo as well as the normally exposed version.
You can check out some of the HDR photos taken using an iPhone 4 running iOS 4.1:
I’m no expert when it comes to photography, but the photos look much better with the HDR feature enabled, except for the one taken in low-level lighting.
Let us know what you think about the new HDR feature in iOS 4.1 in the comments section below.