What is Deep Fusion? How it works, and what photos look like without it

Apple’s Deep Fusion camera feature generated a lot of buzz before its official release in iOS 13.2. Now that it’s live, we’re taking a deeper look at what’s being fused, and how.

It’s not that con-Fusing

Much like Apple’s Smart HDR, Deep Fusion relies on object and scene recognition, as well as a series of eight images captured before you click the shutter button.

Of the eight images, four are taken with standard exposure and four with short exposure. A ninth picture is then taken with a long exposure when the shutter button is pressed. The short-exposure shots are meant to freeze motion and preserve high-frequency details like blades of grass or stubble on a person’s face. The sharpest image from this series is chosen to move on to the next step.

Three of the standard-exposure shots that display the best color, tones, and other low-frequency data are then fused with the long-exposure frame to compose a single image. This image and the sharpest short-exposure frame are then sent through neural networks, which choose, pixel by pixel, the better of the two for the final photo. This pixel-by-pixel analysis minimizes noise, sharpens details, and renders color more accurately, all on a very granular and intelligent level.
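The pipeline described above can be sketched in code. This is a rough illustrative sketch only: the frame lists, the gradient-based sharpness measure, and the per-pixel detail mask are our own stand-ins for Apple's proprietary, neural-network-driven fusion.

```python
import numpy as np

def deep_fusion_sketch(short_frames, standard_frames, long_frame):
    """Illustrative sketch of the Deep Fusion steps described above.
    The sharpness and selection heuristics here are assumptions, not
    Apple's actual (neural network) implementation."""

    # Crude sharpness proxy: variance of the image gradient.
    def sharpness(img):
        gy, gx = np.gradient(img.astype(float))
        return np.var(gx) + np.var(gy)

    # Step 1: pick the sharpest of the four short-exposure frames.
    sharpest = max(short_frames, key=sharpness)

    # Step 2: fuse the three best standard-exposure frames with the
    # long-exposure frame into one reference image (a simple mean
    # stands in for Apple's tone/color fusion).
    best_standard = sorted(standard_frames, key=sharpness, reverse=True)[:3]
    reference = np.mean(np.stack(best_standard + [long_frame]), axis=0)

    # Step 3: per-pixel choice between the two candidates — keep the
    # sharp frame's pixel where local detail is high, otherwise take
    # the smoother reference pixel.
    gy, gx = np.gradient(sharpest.astype(float))
    detail = np.abs(gx) + np.abs(gy)
    mask = detail > np.median(detail)
    return np.where(mask, sharpest, reference)
```

Even this toy version shows why the result looks natural: high-frequency regions inherit the crisp short-exposure pixels, while flat regions get the lower-noise fused pixels.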

[Image: How Deep Fusion works. Genevieve Poblano/Digital Trends]

All of the post-shutter processing is done behind the scenes, so it won’t impact your photo capture time. In other words, you can still snap back-to-back photos just as quickly as you ever could on the iPhone, and if they’re all using Deep Fusion, they’ll simply be queued up in the camera roll to be processed in order. Apple says you could go into your camera roll and potentially see an image still processing for a half-second, but I’ve yet to encounter this. Deep Fusion won’t work on burst mode, however, and it will only be available on the iPhone 11 and iPhone 11 Pro models.

It just works

Apple’s iconic mantra is the guiding principle for Deep Fusion’s nonintrusiveness. There’s no toggle to flip it on; it enables itself when possible, in lighting situations that vary by the lens you’re shooting with.

The ultrawide-angle lens, for instance, cannot take advantage of Deep Fusion at all. On the main lens, Deep Fusion kicks in for what Apple describes as “indoor lighting” or anything below twilight in outdoor settings — that is, if the iPhone doesn’t explicitly offer night mode. The telephoto lens uses Deep Fusion for anything that isn’t very dark or exceedingly bright, but keep in mind that darker situations usually disable the telephoto camera and kick over the responsibilities to the main sensor, which will then determine what to do — be it Deep Fusion, night mode, or Smart HDR.
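The per-lens behavior above amounts to a small decision tree, sketched below. The lens names and the numeric light thresholds (0 = pitch dark, 1 = very bright) are illustrative assumptions of ours; Apple doesn't publish the actual cutoffs.

```python
def capture_mode(lens, light_level):
    """Hedged sketch of which mode the camera picks, per the article.
    'light_level' runs from 0.0 (dark) to 1.0 (very bright); the
    threshold values are illustrative, not Apple's."""
    if lens == "ultrawide":
        return "Smart HDR"           # Deep Fusion never runs here
    if lens == "main":
        if light_level < 0.1:
            return "night mode"      # very dark scenes
        if light_level < 0.6:
            return "Deep Fusion"     # indoor / below-twilight light
        return "Smart HDR"           # bright outdoor light
    if lens == "telephoto":
        if light_level < 0.1:
            # Dark scenes hand off to the main sensor, which decides again.
            return capture_mode("main", light_level)
        if light_level > 0.9:
            return "Smart HDR"       # exceedingly bright
        return "Deep Fusion"         # everything in between
    raise ValueError(f"unknown lens: {lens}")
```

For example, under these assumed thresholds a dim telephoto shot falls through to the main sensor and comes back as night mode, which matches the handoff behavior described above.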

The results

So far, Deep Fusion’s impact has been mostly subtle in our testing. We tested an iPhone 11 Pro Max running iOS 13.2 against an iPhone 11 Pro running iOS 13.1, which lacks Deep Fusion, and at first it’s hard to see Deep Fusion’s influence. Zooming in, however, revealed areas where finer details were more defined and less smoothed-over. This isn’t something that will jump out at you, especially when viewed on a phone. That said, Deep Fusion never produced a worse image than we got without it.

We did have a couple of instances where, if you zoomed in a little, you could appreciate the difference Deep Fusion makes, particularly in the finer details of an image. So, while it may not be as magical as night mode, it’s still better to have than to not.

Corey Gaskin
Associate Editor, Mobile