
Apple research reveals how AI could transform extreme low-light iPhone photography

Apple researchers have demonstrated how integrating an AI diffusion model directly into the camera pipeline could dramatically improve extreme low-light photos by recovering detail from raw sensor data.

December 21, 2025 / 14:33 IST

Apple researchers have outlined a new approach to low-light photography that could significantly improve how iPhones capture images in near-darkness. The study explores how artificial intelligence can be embedded directly into the camera's image signal processor, allowing the system to recover detail from raw sensor data that would otherwise be lost in extreme low-light conditions.

Low-light photography has long been one of the hardest problems in mobile imaging. When a camera sensor does not receive enough light, images often end up filled with digital noise, muddy colours and smeared textures. To compensate, smartphone makers rely heavily on computational photography techniques that brighten scenes and suppress noise. While effective in many cases, these methods are frequently criticised for producing overly smooth results, where fine textures disappear and complex details turn into flat, oil-painting-like surfaces.


What the research focuses on

The new research tackles this limitation by rethinking where AI should be applied in the imaging pipeline. Instead of using artificial intelligence only after the image has already been processed, Apple’s researchers propose integrating an AI model directly into the camera’s core processing workflow. The model they developed, known as DarkDiff, is designed to enhance extremely dark raw images before critical detail is lost.
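To make the distinction concrete, here is a minimal, hypothetical sketch in Python of the difference between the two approaches: cleaning up a photo after the image signal processor (ISP) has finished, versus enhancing the raw sensor data before the ISP runs. This is not Apple's code and not DarkDiff itself; the functions capture_dark_raw, diffusion_enhance and simple_isp are illustrative placeholders assumed purely for this example.

```python
# Sketch of where an AI enhancement step could sit in a camera pipeline.
# All functions here are simplified stand-ins, not a real implementation.

import numpy as np

def capture_dark_raw(height=8, width=8, photons_per_pixel=2.0, seed=0):
    """Simulate a very dark raw sensor frame dominated by shot noise."""
    rng = np.random.default_rng(seed)
    return rng.poisson(photons_per_pixel, size=(height, width)).astype(np.float32)

def diffusion_enhance(raw):
    """Hypothetical stand-in for a model like DarkDiff operating on raw data.

    A real diffusion model would iteratively denoise conditioned on the dark
    raw frame; here a crude 3x3 local average simply marks where in the
    pipeline that step would happen.
    """
    padded = np.pad(raw, 1, mode="edge")
    out = np.zeros_like(raw)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + raw.shape[0], dx:dx + raw.shape[1]]
    return out / 9.0

def simple_isp(raw, gain=16.0):
    """Toy ISP: digital gain plus a gamma curve, clipped to an 8-bit range."""
    img = np.clip(raw * gain, 0, 255)
    return (255.0 * (img / 255.0) ** (1 / 2.2)).astype(np.uint8)

# Conventional pipeline: the ISP runs first; any AI cleanup comes afterwards,
# once noise has already been baked into the processed image.
raw = capture_dark_raw()
conventional = simple_isp(raw)

# Approach described in the research: enhance the raw data first, then run the ISP.
raw_enhanced = diffusion_enhance(raw)
proposed = simple_isp(raw_enhanced)

print("raw noise (std) before ISP:", raw.std().round(2))
print("enhanced raw noise (std) before ISP:", raw_enhanced.std().round(2))
```

The sketch only illustrates ordering: by intervening before the ISP's gain and tone-mapping stages, an AI model works on data where detail has not yet been amplified into noise or smoothed away, which is the core idea the researchers describe.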