Apple has finally given the camera a higher-resolution image sensor than in previous iPhones. The main camera now uses a 48-megapixel sensor with an f/1.76 aperture and optical image stabilization. The ultra-wide camera stays at 12 megapixels, but its aperture narrows from f/1.8 to f/2.2, which on paper is a downgrade. Both cameras also get larger sensors, which should noticeably improve light capture. The telephoto camera likewise remains at 12 megapixels and also features optical image stabilization.
Despite the high-resolution sensor, the iPhone 14 Pro Max still outputs 12-megapixel photos by default. The iPhone uses pixel binning, a technique that has been common among Android smartphone manufacturers for years: four neighboring pixels are combined into one, so more light is captured and dark areas should be correspondingly better exposed. On top of that, the iPhone applies the “Deep Fusion” optimization introduced with the iPhone 11, in which the image processor computes the best result from nine frames: four captured before the shutter button is even pressed, the main exposure itself, and four more taken afterwards. Apple gives this combination a new name: “Photonic Engine”.
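The binning idea itself is simple to illustrate. In reality the combining happens on the sensor at readout, but a minimal sketch in Python with NumPy (averaging each 2x2 block, purely as a stand-in for the hardware step) shows how 48 megapixels become 12:

```python
import numpy as np

def bin_2x2(raw):
    """Combine each 2x2 block of sensor pixels into one output pixel
    by averaging. Illustrative only: real binning merges photodiode
    signals on the sensor itself, not in software."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0
    # Split the image into 2x2 tiles, then average within each tile.
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A toy 4x4 "sensor" readout becomes a 2x2 binned image:
# 16 pixels in, 4 pixels out, each carrying the light of four.
raw = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_2x2(raw)
print(binned.shape)  # (2, 2)
```

Because each output pixel pools the signal of four physical pixels, noise averages out and shadow detail improves, which is exactly the trade-off the article describes.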
By the way: thanks to the new 48-megapixel sensor, the camera app of the iPhone 14 Pro Max adds a 2x option to the preset zoom levels in standard photo mode. Here the camera crops into the center of the frame, but because it starts from 48 megapixels and ultimately outputs only 12, the quality holds up well.
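The arithmetic behind that 2x crop is worth spelling out: halving the width and height of the frame leaves a quarter of the pixels, so a 48-megapixel capture still yields roughly 12 megapixels. A short sketch (the 8064x6048 resolution is an assumed example figure):

```python
def crop_2x(width, height):
    """Central crop for a 2x zoom: half the width and half the height,
    leaving a quarter of the original pixel count."""
    return width // 2, height // 2

# Assumed 48 MP readout of 8064x6048 pixels (illustrative numbers).
w, h = crop_2x(8064, 6048)
print(w, h)      # 4032 3024
print(w * h)     # 12192768, about 12 MP
```

That is why the 2x mode needs no upscaling: the cropped region already contains a full 12-megapixel image.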
The front camera also gets a small but welcome upgrade: Apple now adds autofocus here, which makes for noticeably better selfies.