Inside the iPhone 11 Camera, Part 1: A Completely New Camera

Last month, we took a look at what's new in the iPhone 11 and 11 Pro's camera hardware. You might've noticed two things from Apple's iPhone announcement event and our blog post: the hardware changes are fairly modest, and most of the attention this generation goes to software-based processing.

It's true: the great advances in camera quality in these new iPhones come mostly from advanced (and improved) software processing.

I've taken some time to analyze the iPhone 11's new image capture pipeline, and it looks like one of the biggest changes to the iPhone camera yet.

What is a photo?

That sounds like we're off to a rather philosophical start, or delivering the punchline of an iPad photography commercial, but to highlight what makes the iPhone 11 camera unique, we have to understand our expectations of photography.

For a while now, you haven't been the one taking your photos. That's not a slight against you, dear reader: to reduce perceived latency, when your finger touches the shutter button, the iPhone grabs a photo it had already taken before you even touched the screen.

This is done by keeping a sort of rolling buffer of shots from the moment you open the Camera app. Once you tap the shutter, the iPhone picks the sharpest shot from that buffer. It saves a shot that you, the user, unaware of this skullduggery, assume you took yourself. Nope. You merely provided a hint to help the camera pick from the many shots it had taken on its own.
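As a rough sketch, the mechanism might look something like the Swift below. The names (Frame, FrameRingBuffer) and the sharpness metric are invented for illustration; Apple's actual zero-shutter-lag pipeline is private.

```swift
import Foundation

// A hypothetical captured frame with a precomputed sharpness score.
// These names are illustrative, not Apple's internal API.
struct Frame {
    let timestamp: TimeInterval
    let sharpness: Double // e.g. variance of a Laplacian over the image
}

// A fixed-size ring buffer that keeps only the most recent frames,
// the way a zero-shutter-lag pipeline might.
struct FrameRingBuffer {
    private var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int) { self.capacity = capacity }

    mutating func append(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity { frames.removeFirst() }
    }

    // When the shutter is tapped, pick the sharpest recent frame
    // instead of waiting to capture a new one.
    func bestFrame() -> Frame? {
        frames.max(by: { $0.sharpness < $1.sharpness })
    }
}
```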

We can argue this is still deliberate photography. Without your action, there would be no photo.

The resulting image is not a single exposure, either. The iPhone takes a multitude of exposures (going by Apple's presentation, a whole lot) and intelligently merges them.

Last year’s iPhone XS introduced Smart HDR, which combined over- and underexposed shots to ensure your image has more detail in the shadows and highlights. HDR is shorthand for ‘High Dynamic Range’, and for decades it’s been something of a white whale in photographic technology.
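Apple doesn't document how Smart HDR schedules its exposures, but AVFoundation's public bracketed-capture API gives a feel for the underlying primitive: one shutter press, several frames at different exposure biases. A minimal sketch, assuming a standard capture session (the session and delegate wiring are omitted):

```swift
import AVFoundation

// Build settings for a three-frame exposure bracket: underexposed,
// normal, and overexposed, with biases measured in stops (EV).
func makeBracketSettings() -> AVCapturePhotoBracketSettings {
    let biases: [Float] = [-2.0, 0.0, +2.0]
    let bracketedSettings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    return AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // 0 = no RAW, processed output only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketedSettings
    )
}
```

The biases here are arbitrary example values; what Smart HDR actually captures per press, and how it merges the frames, is Apple's secret sauce.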

Smart HDR Gets Smarter

Ever since humans put silver halide (hey, that's our app's name) on a plate to capture a still image, we have been frustrated by a photograph's limited ability to retain detail in both light and dark areas.

It’s hard to properly expose the building in both the foreground and background. The image on the right resembles what our eyes see.

Our eyes are actually really good at this sort of thing: it's estimated they have around 20 'stops' of dynamic range, where each stop represents a doubling of the amount of light. For comparison, one of the latest and greatest cameras, the Sony A7R4, can capture 15 stops. The iPhone 11 manages around 10.
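Because stops are logarithmic, those gaps are bigger than they look. A quick back-of-the-envelope conversion to contrast ratios:

```swift
import Foundation

// Each photographic "stop" doubles the light, so a device with
// n stops of dynamic range spans a contrast ratio of 2^n : 1.
func contrastRatio(stops: Double) -> Double {
    pow(2.0, stops)
}

print(contrastRatio(stops: 20)) // human eye (est.): ~1,048,576 : 1
print(contrastRatio(stops: 15)) // Sony A7R4:           ~32,768 : 1
print(contrastRatio(stops: 10)) // iPhone 11:            ~1,024 : 1
```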

Long before digital, film photographers ran into the same problem when they went from film negative (13 stops) to paper (8 stops). They would dodge and burn the image in the darkroom to bring everything within a printable range.

"Tetons and the Snake River, Grand Teton National Park," 1942, Ansel Adams. This shot required old-school post-processing to get that HDR look.

Today, a skilled photographer can recover dynamic range from a single RAW photo through editing. It's just a manual process, and in extreme situations you'll run up against the limits of the sensor.
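To make "editing" concrete, here's a minimal Core Image sketch that lifts shadows and tames highlights on a single image. CIHighlightShadowAdjust and its keys are real Core Image API, but the amounts are arbitrary example values; a serious RAW workflow would start from the sensor data, with far more latitude.

```swift
import CoreImage

// A rough sketch of manual dynamic-range recovery: lift the shadows
// and pull down the highlights of a single image using Core Image's
// built-in CIHighlightShadowAdjust filter.
func recoverDynamicRange(from image: CIImage) -> CIImage? {
    guard let filter = CIFilter(name: "CIHighlightShadowAdjust") else {
        return nil
    }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(0.7, forKey: "inputHighlightAmount") // tame highlights
    filter.setValue(0.5, forKey: "inputShadowAmount")    // lift shadows
    return filter.outputImage
}
```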