Apple executives share the design philosophy of the iPhone 12 cameras

The iPhone 12 camera system has received major updates this year. Across the new flagship series, Apple has raised the bar: every model is powered by the A14 Bionic, the fastest smartphone chip, and offers 5G connectivity, an OLED display, and a new camera system. Apple’s iPhone Product Line Manager, Francesca Sweet, and Vice President of Camera Software Engineering, Jon McCormack, shared the design philosophy behind the new iPhone 12 camera system in an interview with PetaPixel.

The Wide and Ultra Wide cameras on the iPhone 12 now support Night mode, as does the TrueDepth front camera, and the phones add 4K HDR video recording, slo-mo at up to 240 fps, improved image stabilization, and much more. The Pro models also include a LiDAR scanner, and the iPhone 12 Pro Max ships with an even better camera system: a 47 percent larger main sensor, a 2.5x telephoto camera, and a 5x optical zoom range. Here are the behind-the-scenes details of how Apple developed the iPhone 12 camera.

“We don’t tend to think of a single axis like ‘if we go and do this kind of thing to hardware’ then a magical thing will happen. Since we design everything from the lens to the GPU and CPU, we actually get to have many more places that we can do innovation.”


Apple Shares Design Philosophy of the iPhone 12 Camera

The main goal behind the development process was to take distractions away from the moment users want to capture. Jon McCormack added that it was not only about the sensor and lenses but also about optimizing the A14 Bionic chip and the software behind computational photography. He said,

“As photographers, we tend to have to think a lot about things like ISO, subject motion, et cetera. And Apple wants to take that away to allow people to stay in the moment, take a great photo, and get back to what they’re doing.

“We replicate as much as we can to what the photographer will do in post. There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there.”


McCormack explained that Apple uses machine learning and Smart HDR 3 to capture detail not only in ideal lighting but also in difficult conditions, treating elements such as the foreground, clothing, skies, and low-light areas individually.

“The background, foreground, eyes, lips, hair, skin, clothing, skies. We process all these independently like you would in Lightroom with a bunch of local adjustments. We adjust everything from exposure, contrast, and saturation, and combine them all together.

Skies are notoriously hard to really get right, and Smart HDR 3 allows us to segment out the sky and treat it completely independently and then blend it back in to more faithfully recreate what it was like to actually be there.”
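McCormack’s description of segmenting the sky and adjusting it separately maps roughly onto a familiar image-processing pattern: isolate a region with a mask, tone-map it on its own, and blend it back into the frame. Below is a minimal Python/NumPy sketch of that idea; the sky mask, the adjustment values, and the helper names are hypothetical illustrations, not Apple’s actual Smart HDR 3 pipeline.

```python
import numpy as np

def adjust_region(image, mask, exposure=1.0, saturation=1.0):
    """Apply simple exposure and saturation tweaks to a masked region.

    image: float32 RGB array in [0, 1], shape (H, W, 3)
    mask:  float32 array in [0, 1], shape (H, W), 1.0 inside the region
           (e.g. a sky mask produced by a segmentation model)
    """
    # Exposure: scale brightness, then clamp back into valid range.
    adjusted = np.clip(image * exposure, 0.0, 1.0)

    # Saturation: push each pixel away from (or toward) its grayscale value.
    gray = adjusted.mean(axis=2, keepdims=True)
    adjusted = np.clip(gray + (adjusted - gray) * saturation, 0.0, 1.0)

    # Blend the adjusted region back into the original frame.
    m = mask[..., None]
    return image * (1.0 - m) + adjusted * m

# Hypothetical usage: darken and saturate the sky, then lift the foreground.
if __name__ == "__main__":
    frame = np.random.rand(1080, 1920, 3).astype(np.float32)   # stand-in photo
    sky_mask = np.zeros((1080, 1920), dtype=np.float32)
    sky_mask[:400, :] = 1.0                                     # crude "sky" region

    result = adjust_region(frame, sky_mask, exposure=0.85, saturation=1.2)
    result = adjust_region(result, 1.0 - sky_mask, exposure=1.1, saturation=1.05)
```

In practice the segmentation, the per-region adjustments, and the final blend would be far more sophisticated, but the structure is the same: process each segment independently, then recombine them into a single, natural-looking photo.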


He also talked about the upcoming Apple ProRAW format, coming to the cameras of both Pro models. Built on computational photography, the new format is designed to give photographers complete control over their captured photos “in-camera, in real-time.”


