iPhone 11 and iPhone 11 Pro Likely to Get Deep Fusion Camera in the Next iOS 13 Beta

The much-touted Deep Fusion feature was not available on the iPhone 11 and iPhone 11 Pro at launch. Deep Fusion is Apple's key image-processing technology for medium-to-low light shots on the new iPhones. Unlike Night Mode, Deep Fusion runs in the background and is designed to deliver better image quality. Apple is expected to add Deep Fusion in an upcoming iOS 13 beta.

Deep Fusion is expected to improve the iPhone 11 and iPhone 11 Pro's image quality in medium-to-low light. With the feature enabled, the camera will automatically apply one of three processing modes depending on the lighting conditions.

Night Mode vs. Deep Fusion

Night Mode is positioned as a feature that can be toggled on or off. Deep Fusion, by contrast, simply runs in the background, with no on-screen indication that it is active. Apple says it has intentionally hidden the feature so that you don't have to worry about how to take the best shot.

The Verge has outlined how Deep Fusion works.

  1. By the time you press the shutter button, the camera has already grabbed three frames at a fast shutter speed to freeze motion in the shot. When you press the shutter, it takes three additional shots and then one longer-exposure shot to capture detail.
  2. Those three regular shots and the long-exposure shot are merged into what Apple calls a "synthetic long." This is a major difference from Smart HDR.
  3. Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long exposure. Unlike Smart HDR, Deep Fusion only merges these two frames, not more. These two images are also processed for noise differently than Smart HDR, in a way that’s better for Deep Fusion.
  4. The images are run through four detail processing steps, pixel by pixel, each tailored to increasing amounts of detail — the sky and walls are in the lowest band, while skin, hair, fabrics, and so on are the highest level. This generates a series of weightings for how to blend the two images — taking detail from one and tone, color, and luminance from the other. The final image is generated.

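The blend described in step 4 can be pictured as a per-pixel weighted merge: detail comes from one frame, tone and color from the other, with the weight set by how much fine detail each region contains. The sketch below is a deliberate simplification in Python; Apple's actual pipeline is proprietary, and the function and variable names here are illustrative assumptions, not Apple's API.

```python
# Hypothetical sketch of the per-pixel weighted blend described above.
# Apple's real Deep Fusion processing is proprietary; this only illustrates
# the idea of taking detail from one frame and tone from another.

def blend_pixel(detail_px, tone_px, weight):
    """Mix one pixel: `weight` favors the high-detail frame,
    (1 - weight) favors the tone/color frame."""
    return weight * detail_px + (1 - weight) * tone_px

def deep_fusion_blend(short_exposure, synthetic_long, weights):
    """Merge two grayscale 'images' (lists of rows) pixel by pixel.
    `weights` holds a 0..1 value per pixel: higher where fine detail
    (hair, fabric) was detected, lower for flat areas (sky, walls)."""
    return [
        [blend_pixel(d, t, w) for d, t, w in zip(drow, trow, wrow)]
        for drow, trow, wrow in zip(short_exposure, synthetic_long, weights)
    ]

# Tiny 2x2 example: weight 1.0 keeps the detail frame, 0.0 keeps the tone frame.
detail = [[10, 20], [30, 40]]
tone   = [[100, 100], [100, 100]]
w      = [[1.0, 0.0], [0.5, 0.5]]
print(deep_fusion_blend(detail, tone, w))  # → [[10.0, 100.0], [65.0, 70.0]]
```

In a real implementation the weight map would itself be produced by the detail-analysis passes mentioned in step 4, one value per pixel per frequency band.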
It is worth noting that Deep Fusion processing takes more time than Smart HDR. In other words, if you take a shot and immediately open the Camera Roll, you will see a proxy image while Deep Fusion does its work in the background; once it finishes, the final version of the image replaces the proxy. Apple has said that Deep Fusion will arrive later this fall. Meanwhile, you can take a look at Apple's sample shots taken with Deep Fusion.
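That proxy-then-final behavior is a classic asynchronous pattern: show a cheap preview immediately, then swap in the expensive result when background work completes. A loose Python analogy (all names here are illustrative; this is not how iOS implements it):

```python
# Rough analogy to the Camera Roll behavior described above: a proxy image
# appears instantly, and the fused result replaces it once background
# processing finishes. Names and timings are invented for illustration.
import threading
import time

def show_proxy_then_fuse(on_update):
    """Call `on_update` with a quick proxy right away, then again with
    the final fused image from a background thread."""
    on_update("proxy")            # instant, low-effort preview

    def worker():
        time.sleep(0.05)          # stand-in for roughly a second of fusion work
        on_update("fused")        # final image replaces the proxy

    t = threading.Thread(target=worker)
    t.start()
    return t                      # caller can join() or just let it finish

shown = []
t = show_proxy_then_fuse(on_update=shown.append)
t.join()
print(shown)  # → ['proxy', 'fused']
```

The key point is that the UI never blocks on the fusion step, which is why the feature can stay invisible to the user.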

Do you notice any difference between pictures processed with Deep Fusion and normal mode? Let us know in the comments below.

[via The Verge]