The iPhone 11's Deep Fusion camera ranks among the finest smartphone cameras and marks a major step forward for the Apple product line. Deep Fusion relies on the iPhone's cutting-edge A13 Bionic processor, which lets the phone capture a burst of shots and merge them into a single image with less noise. In news that is bound to excite the wider tech-loving public, this article breaks down how the technology actually works.
To put the Deep Fusion camera into layman's terms, this update is essentially Apple's take on neural imaging. Apple VP Phil Schiller explained the feature well: “It shoots nine images, before you press the shutter button it’s already shot four short images, four secondary images. When you press the shutter button it takes one long exposure, and then in just one second, the Neural Engine analyzes the fused combination of long and short images picking the best among them, selecting all the pixels, and pixel by pixel, going through 24 million pixels to optimize for detail and low noise.” Combining this variety of exposures produces a crisper image, making the camera especially well suited to mid- and low-light situations, where it delivers a brighter, clearer result.
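The pixel-by-pixel fusion Schiller describes can be illustrated with a toy sketch. The code below is not Apple's actual Deep Fusion pipeline (which runs on the Neural Engine and is proprietary); it simply shows the general idea of multi-frame fusion, assuming nine pre-aligned grayscale frames: each frame is weighted per pixel by a crude local-detail measure, and the frames are blended so that pixels with more detail contribute more.

```python
import numpy as np

def fuse_frames(frames, eps=1e-8):
    """Toy multi-frame fusion: weight each frame per pixel by local
    contrast (a rough stand-in for 'detail'), then blend.

    Illustrative only -- Apple's real pipeline uses a trained neural
    network, not a hand-written heuristic like this one.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])  # (N, H, W)
    # Crude per-pixel detail measure: gradient magnitude within each frame.
    gy, gx = np.gradient(stack, axis=(1, 2))
    detail = np.abs(gx) + np.abs(gy) + eps
    # Normalize so the weights across the N frames sum to 1 at each pixel.
    weights = detail / detail.sum(axis=0, keepdims=True)
    # Weighted blend: detail-rich frames dominate where they are sharpest.
    return (weights * stack).sum(axis=0)

# Example: fuse nine simulated 8x8 frames, mirroring the nine-shot burst.
rng = np.random.default_rng(0)
frames = [rng.random((8, 8)) for _ in range(9)]
fused = fuse_frames(frames)
```

In the real feature, the short exposures freeze motion and preserve detail while the long exposure gathers light, and the network picks the best of both per pixel; this sketch captures only the "weighted, per-pixel merge" shape of that idea.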
The Deep Fusion camera is not yet available on iPhone 11s. Surprisingly, it is purely a software update, which means that even if you bought an iPhone with the A13 Bionic processor today, you would not have access to the feature until that update ships. It is striking that the technology hinges on software alone: the necessary hardware was already in place, simply waiting for the software to catch up. This may indicate that Apple planned to add Deep Fusion all along.