Apple is reportedly close to replacing the Sony-made camera sensors found across today's iPhone lineup with a fully custom, in-house design. According to new information shared by Fixed Focus Digital, the Cupertino company has already built working prototypes that deliver up to 20 stops of dynamic range – a figure that rivals what the human eye can perceive, and is far beyond the 13 to 14 stops of dynamic range that current smartphone cameras can capture.
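That gap is larger than the numbers suggest, because each additional stop doubles the contrast a sensor can record. A quick back-of-the-envelope calculation (a Swift sketch, for illustration only):

```swift
import Foundation

// Each stop of dynamic range doubles the ratio between the brightest
// and darkest values a sensor can record in one frame: ratio = 2^stops.
func contrastRatio(stops: Double) -> Double {
    pow(2.0, stops)
}

print(contrastRatio(stops: 14))  // ~16,384:1  (today's smartphone sensors)
print(contrastRatio(stops: 20))  // ~1,048,576:1 (the rumored Apple sensor)
// Six extra stops means roughly 64x more recordable scene contrast.
```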

The move continues Apple's broader strategy of vertical integration. Designing its own Mac and iPhone processors, wireless chips, and soon modems has given Apple tighter control over performance, security, and supply chains. By bringing the camera sensor under the same roof, Apple can tune every step of the imaging pipeline, from photon capture to final photo processing, in tandem with its Photonic Engine and A-series neural cores.
A recent patent filing titled “Image Sensor With Stacked Pixels Having High Dynamic Range And Low Noise” outlines a two-layer architecture. The top sensor die is responsible for collecting light, while a separate logic die underneath handles exposure control and on-chip noise reduction. This stacked approach increases light efficiency without enlarging the camera bump.
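As a rough mental model of that division of labor (hypothetical Swift types, not Apple's actual design), the top die only gathers charge while the bottom die conditions the signal before it ever leaves the chip:

```swift
// Illustrative sketch of the stacked-die split described in the patent.
// Type names, functions, and values are assumptions, not Apple's design.
struct SensorDie {
    // Top die: converts incoming photons into raw per-pixel charge.
    func capture(photons: [Double]) -> [Double] {
        photons.map { $0 * 0.8 }  // assumed 80% quantum efficiency
    }
}

struct LogicDie {
    // Bottom die: applies exposure control and noise reduction on-chip,
    // before the signal reaches the image signal processor.
    func process(charge: [Double], noiseFloor: Double) -> [Double] {
        charge.map { max($0 - noiseFloor, 0) }
    }
}

let raw = SensorDie().capture(photons: [100, 5_000, 120_000])
let clean = LogicDie().process(charge: raw, noiseFloor: 2.0)
```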
Within each pixel, Apple’s engineers use a lateral overflow integration capacitor (LOFIC) that automatically diverts excess electrons when a scene is too bright. Highlights and shadows can therefore be recorded simultaneously, removing the need for multiple exposures. The result is cleaner HDR photos where skies retain texture and faces remain properly exposed.
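A toy simulation makes the overflow idea concrete (illustrative Swift; the well and capacitor capacities are made-up numbers, not real silicon parameters):

```swift
// Toy model of a LOFIC-style pixel: when the photodiode's well fills,
// surplus electrons spill into an overflow capacitor instead of clipping.
struct LOFICPixel {
    let wellCapacity: Double = 10_000        // electrons the photodiode holds
    let capacitorCapacity: Double = 500_000  // electrons the capacitor holds

    // Returns the charge in each reservoir after a single exposure.
    func expose(incomingElectrons e: Double) -> (well: Double, overflow: Double) {
        let well = min(e, wellCapacity)
        let overflow = min(max(e - wellCapacity, 0), capacitorCapacity)
        return (well, overflow)
    }

    // Recombining both readouts reconstructs highlights that a
    // conventional pixel would have clipped at wellCapacity.
    func readout(_ e: Double) -> Double {
        let (well, overflow) = expose(incomingElectrons: e)
        return well + overflow
    }
}

let pixel = LOFICPixel()
print(pixel.readout(3_000))    // shadow region: 3000, unchanged
print(pixel.readout(250_000))  // bright sky: 250000 instead of clipping at 10000
```

Because one exposure captures both reservoirs, the sensor never has to bracket multiple frames and merge them, which is where today's HDR ghosting artifacts come from.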
Noise handling has been rethought as well. Every pixel includes dedicated memory that records and subtracts thermal noise in real time, cutting grain in low-light shots before software processing even begins. Apple says this hardware-level correction can preserve delicate details that are often lost when traditional algorithms step in later.
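Conceptually this resembles a per-pixel dark-frame subtraction performed in hardware rather than in software. A minimal sketch of the idea (hypothetical names and values):

```swift
// Toy model of per-pixel noise memory: each pixel stores a recent
// thermal-noise sample and subtracts it at capture time. Values are
// illustrative; the real mechanism would live on the sensor's logic die.
struct NoiseCorrectedPixel {
    private(set) var storedNoise: Double = 0

    // Sample this pixel's dark signal (e.g. while shuttered).
    mutating func calibrate(darkSample: Double) {
        storedNoise = darkSample
    }

    // Subtract the remembered noise from the signal, clamping at zero.
    func read(signal: Double) -> Double {
        max(signal - storedNoise, 0)
    }
}

var pixel = NoiseCorrectedPixel()
pixel.calibrate(darkSample: 12)  // thermal noise measured for this pixel
print(pixel.read(signal: 40))    // 28: grain removed before any software runs
```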
Early prototypes are allegedly being tested in development hardware, hinting that an Apple image sensor could debut as early as the iPhone 18 Pro cycle. If the timeline slips, the technology is still expected to arrive well before the end of the decade and should integrate seamlessly with Apple’s custom lenses, Photonic Engine, and ProRAW workflows.
For users, the promise is straightforward: brighter highlights, cleaner shadows, and faster, more power-efficient image capture. For Apple, ditching Sony would reduce component costs and lock in another critical technology advantage. Taken together, the Apple image sensor project signals a major leap forward for iPhone photography and a reinforcement of the company’s self-reliant hardware roadmap.
via MacRumors