The megapixel frenzy has picked up again in 2022. After the first smartphones with 108 MP sensors launched in late 2019, the Motorola Edge 30 Ultra and the Mi 12T Pro hit the 200 MP mark this month. Even this year's iPhone 14 Pro has been upgraded to a 48 MP primary sensor, a first for Apple. But the real appeal of higher-resolution sensors lies primarily in better zoom quality.
The iPhone 14 Pro (Max) is the first iPhone with a 48 MP camera. But does that really make sense? / © kwgeek
What about the 2x zoom on the iPhone 14 Pro?
Unlike traditional digital cameras, the vast majority of smartphones lack zoom lenses that change focal length by moving lens elements; there simply is not enough space. Instead, they use dedicated cameras for different zoom levels. On the iPhone 14 Pro, these are the ultra-wide (0.5x), wide (1x), and telephoto (3x) lenses.
When you zoom in the camera app, the smartphone digitally crops into the image from one lens until it reaches the zoom level of the next lens, then switches over. As the digital zoom factor increases, quality naturally decreases. How much it decreases depends on the camera in question.
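To make this mechanism concrete, here is a minimal Python sketch. It is not Apple's actual logic, just an illustration with a hypothetical three-camera setup: pick the longest lens that does not exceed the requested zoom, then apply the remaining digital crop.

```python
# Illustrative sketch: which camera handles a given zoom factor, and how much
# digital cropping is left to do. The camera list below is hypothetical.
CAMERAS = [
    ("ultra-wide", 0.5),  # native zoom factor relative to the main camera
    ("wide",       1.0),
    ("telephoto",  3.0),
]

def pick_camera(zoom: float):
    """Return the longest camera that does not exceed the requested zoom,
    plus the digital crop factor that still has to be applied."""
    name, native = max((c for c in CAMERAS if c[1] <= zoom), key=lambda c: c[1])
    digital_crop = zoom / native
    return name, digital_crop

for z in (0.5, 1.0, 2.0, 2.9, 3.0, 5.0):
    cam, crop = pick_camera(z)
    # Quality drops roughly with crop**2, since the usable sensor area shrinks.
    print(f"{z:>4}x zoom -> {cam:<10} with {crop:.2f}x digital crop")
```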
Here’s an effective sensor size comparison for the iPhone 14 Pro. / © kwgeek
The primary camera of the iPhone 14 Pro uses a 1/1.28-inch image sensor measuring 9.8 x 7.3 mm, with a resolution of 48 MP. At 2x zoom, Apple simply crops a 12 MP section from the middle of the sensor measuring 4.9 x 3.7 mm, which roughly corresponds to a 1/3-inch format. On paper, that is enough for a decent photo in good light.
At 3x zoom, Apple then switches to the 12 MP telephoto camera (which actually offers 3.2x zoom, or a 77 mm equivalent focal length). At 1/3.5 inch, or 4.0 x 3.0 mm, this sensor is again slightly smaller than the 2x crop in the center of the main sensor.
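Plugging in the numbers quoted above gives a quick back-of-the-envelope view of how sensor area and pixel count shrink with zoom on the iPhone 14 Pro. The simple center-crop model and the switchover at exactly 3x are my simplifications, not Apple's actual processing.

```python
# Back-of-the-envelope calculation with the numbers quoted above:
# main camera: 48 MP on 9.8 x 7.3 mm; telephoto: 12 MP on 4.0 x 3.0 mm (3.2x).
MAIN_MP, MAIN_W, MAIN_H = 48.0, 9.8, 7.3
TELE_MP, TELE_W, TELE_H, TELE_ZOOM = 12.0, 4.0, 3.0, 3.2

def effective(zoom: float):
    """Effective sensor area (mm^2) and megapixels at a given zoom factor.
    Simplified: below 3x we crop the main sensor, from 3x on we use the
    telephoto (really 3.2x, so up to 3.2x there is nothing left to crop)."""
    if zoom < 3.0:
        crop = zoom
        return (MAIN_W * MAIN_H) / crop**2, MAIN_MP / crop**2
    crop = max(1.0, zoom / TELE_ZOOM)
    return (TELE_W * TELE_H) / crop**2, TELE_MP / crop**2

for z in (1.0, 2.0, 3.0, 3.2, 5.0):
    area, mp = effective(z)
    print(f"{z:>3}x: ~{area:5.1f} mm^2 of sensor, ~{mp:4.1f} MP")
```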
The figure below shows the sensor area the camera module uses at different focal lengths on the iPhone 14 Pro and the iPhone 13 Pro. On the far left, we start with the ultra-wide angle (13 mm equivalent). With the primary camera (24 mm equivalent), there is a noticeable upward jump. Up to the telephoto lens (77 mm equivalent), the iPhone 14 Pro has more sensor area available than the iPhone 13 Pro. From the telephoto lens onward, the two are identical.
Here you can see how much sensor area the iPhone 14 Pro and iPhone 13 Pro use at each focal length. / © kwgeek
The next chart is also interesting, but plots megapixels instead of sensor area on the vertical axis. While the ultra-wide and telephoto lenses stay at 12 MP on both phones, the jump to 48 MP is clearly visible. The iPhone 14 Pro simply has a much larger resolution reserve for digital zoom between 1x and 3.2x.
The iPhone 14 Pro also has significantly more resolution reserves. / © kwgeek
What about size and resolution?
The iPhone 14 Pro we have discussed so far is not even the smartphone with the highest resolution or the largest sensor. Last week, Xiaomi launched the Mi 12T Pro with a 200 MP sensor and ditched the telephoto lens entirely. But how much extra headroom do all those megapixels provide for digital zoom? Let's compare it with the iPhone 14 Pro:
The iPhone 14 Pro also has plenty of reserves in terms of resolution. / © kwgeek
Aside from resolution, however, the most important factor is the sensor area available to the camera at each focal length. At 1/1.22 inches, the ISOCELL HP1 in the Xiaomi Mi 12T Pro is indeed noticeably larger than the iPhone 14 Pro's main sensor, but at longer zoom levels it still loses out in terms of sensor area.
The main lens of the Mi 12T Pro has a larger sensor area. However, the iPhone 14 Pro surpasses it again with its telephoto lens. / © kwgeek
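For a rough sense of where the trade-off flips, here is a small calculation comparing the two from the iPhone's telephoto range onward. The iPhone figures come from this article; the HP1's physical dimensions (around 10.5 x 7.9 mm) are my own estimate based on its commonly cited 0.64 µm pixel pitch, so treat them as an assumption.

```python
# From the iPhone's telephoto range (3.2x) onward, compare the usable sensor
# area: the iPhone's dedicated telephoto vs. a digital crop of the Mi 12T Pro's
# 200 MP main sensor. iPhone numbers are from the article; the HP1 dimensions
# are an assumption derived from its 0.64 µm pixel pitch.
HP1_W, HP1_H = 10.5, 7.9                 # mm (assumed)
TELE_W, TELE_H, TELE_ZOOM = 4.0, 3.0, 3.2  # mm, iPhone 14 Pro telephoto

for zoom in (3.2, 5.0, 10.0):
    xiaomi_area = (HP1_W * HP1_H) / zoom**2
    iphone_area = (TELE_W * TELE_H) / (zoom / TELE_ZOOM)**2
    print(f"{zoom:>4}x: Mi 12T Pro ~{xiaomi_area:4.1f} mm^2, "
          f"iPhone 14 Pro ~{iphone_area:4.1f} mm^2")
```

At 3.2x, the cropped HP1 is down to roughly 8 mm², while the iPhone's telephoto still uses its full 12 mm², which is exactly the crossover the chart above illustrates.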
What about Quad-Bayer?
We're still missing one factor: the color mask on top of the sensor. To explain Quad-Bayer and the like, we first need to understand how image sensors work. Image sensors consist of tiny light sensors that only measure the amount of incident light without distinguishing between colors. A 12 MP sensor thus contains 12 million such light sensors.
To turn this black-and-white sensor into a color sensor, a color mask is placed over the sensor that only lets green, red, or blue light reach each pixel. The Bayer mask used on most image sensors always groups four pixels into two green, one red, and one blue pixel. A 12 MP sensor thus has 6 million green pixels and 3 million each of blue and red pixels.
A Bayer sensor (left) has twice as many green pixels as red and blue pixels. During demosaicing, the resulting gaps are filled and RGB pixels are interpolated / © Sony.
During so-called demosaicing (or debayering), the image processing algorithm derives the RGB value of each pixel from the brightness values of the surrounding pixels of different colors. A brightly exposed green pixel surrounded by "dark" blue and red pixels thus becomes pure green. A green pixel next to fully exposed blue and red pixels becomes white. And so on, until an image of 12 million RGB pixels is obtained.
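For the curious, here is a minimal NumPy sketch of the idea: sample an image through an RGGB Bayer mask and reconstruct the missing color values by simple neighborhood averaging. Real demosaicing algorithms are far more sophisticated; this is only meant to show the principle.

```python
import numpy as np

def make_bayer_mosaic(rgb):
    """Sample an RGB image (H x W x 3, even H/W) through an RGGB Bayer mask."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mask = np.zeros((h, w, 3), dtype=bool)
    mask[0::2, 0::2, 0] = True   # red
    mask[0::2, 1::2, 1] = True   # green
    mask[1::2, 0::2, 1] = True   # green (twice as many green pixels)
    mask[1::2, 1::2, 2] = True   # blue
    for c in range(3):
        mosaic[mask[..., c]] = rgb[..., c][mask[..., c]]
    return mosaic, mask

def demosaic_naive(mosaic, mask):
    """Fill each color channel by averaging the sampled pixels in a 3x3 window."""
    h, w = mosaic.shape
    out = np.zeros((h, w, 3))
    for c in range(3):
        sampled = np.where(mask[..., c], mosaic, 0.0)
        counts = mask[..., c].astype(float)
        num = np.zeros((h, w))
        den = np.zeros((h, w))
        pad_s = np.pad(sampled, 1)
        pad_c = np.pad(counts, 1)
        for dy in range(3):
            for dx in range(3):
                num += pad_s[dy:dy + h, dx:dx + w]
                den += pad_c[dy:dy + h, dx:dx + w]
        out[..., c] = num / np.maximum(den, 1)
    return out

# Tiny synthetic test image: left half green, right half red.
img = np.zeros((8, 8, 3))
img[:, :4, 1] = 1.0
img[:, 4:, 0] = 1.0
mosaic, mask = make_bayer_mosaic(img)
print(demosaic_naive(mosaic, mask)[4, 2])  # [0, 1, 0] inside the green area
```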
For higher-resolution sensors, however, the color mask is different. On so-called Quad-Bayer sensors, typically with around 50 MP, there are four light-sensitive pixels under each red, green, or blue color filter. 108 MP sensors even group 9 (3x3) pixels under one color surface, and 200 MP sensors group 16 (4x4). Sony calls this arrangement Quad-Bayer, while Samsung uses names such as Tetracell and Nonacell, depending on the grouping.
The first sensor with a Quad-Bayer color mask was the Sony IMX586, found in the OnePlus 7, for example. / © kwgeek
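A quick calculation along these lines shows why all of these sensors end up in roughly the same ballpark for color information, regardless of their headline megapixel count:

```python
# Luminance vs. color resolution for grouped color masks, as described above:
# the color mask resolution is simply the pixel count divided by the group size.
sensors = {
    "48 MP Quad-Bayer (2x2)": (48, 2 * 2),
    "108 MP Nonacell (3x3)":  (108, 3 * 3),
    "200 MP (4x4 groups)":    (200, 4 * 4),
}
for name, (mp, group) in sensors.items():
    print(f"{name}: {mp} MP luminance, ~{mp / group:.1f} MP color information")
```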
So the image sensor actually offers a resolution of up to 200 MP in terms of brightness, while the color mask stops at around 12 MP. That is usually not a problem, because our perception relies more on luminance resolution than on color resolution. With extreme digital zoom, however, the color resolution eventually becomes so low that image errors occur.
As an example, we processed a photo of a small robot. On the left (1) you can see a grayscale image, and on the right (3) an RGB image at a quarter of the resolution. We pieced the middle image (2) together from the left and right images, and at first glance it looks fine. On closer inspection, however, the transition between green and blue at the top of the robot is not clean. It is exactly at such transitions that artifacts appear when you zoom in too far on a sensor whose color mask has a lower resolution than the sensor itself. This may also be why Samsung relies on a 64 MP sensor with an atypical RGB matrix to achieve this resolution on the much-maligned 1.1x telephoto camera of the Galaxy S20 (Plus) and S21 (Plus).
Luminance resolution is more important than color resolution: this central image (2) is generated from a high-resolution black and white image (1) and a low-resolution color image (3) / © kwgeek
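The same trick can be sketched in a few lines of NumPy: keep the full-resolution brightness, sample the color at a quarter of the resolution on each axis, and recombine. The synthetic test image and the simple luminance-scaling recombination are my own simplification, not the processing actually used for the robot photo.

```python
import numpy as np

def combine_luma_chroma(rgb, factor=4):
    """Keep full-resolution luminance, but sample color at 1/factor resolution."""
    # Simple luma approximation (Rec. 601 weights).
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Low-resolution color: take every factor-th pixel, then blow it back up.
    low = rgb[::factor, ::factor]
    chroma = low.repeat(factor, axis=0).repeat(factor, axis=1)
    chroma = chroma[:rgb.shape[0], :rgb.shape[1]]
    # Re-attach the full-resolution brightness to the coarse color.
    chroma_luma = 0.299 * chroma[..., 0] + 0.587 * chroma[..., 1] + 0.114 * chroma[..., 2]
    scale = luma / np.maximum(chroma_luma, 1e-6)
    return np.clip(chroma * scale[..., None], 0, 1)

# Synthetic test: a green-to-blue gradient with a fine bright diagonal line.
h, w = 64, 64
img = np.zeros((h, w, 3))
img[..., 1] = np.linspace(1, 0, w)      # green fades from left to right
img[..., 2] = np.linspace(0, 1, w)      # blue fades in
img[np.arange(h), np.arange(w)] = 1.0   # fine white diagonal detail
out = combine_luma_chroma(img)
print("max error:", np.abs(out - img).max())  # errors cluster at color edges
```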
Ultimately, it is hard to tell from the spec sheet alone how good a camera's image quality will be, especially since the manufacturer's algorithms play a decisive role. The so-called "re-mosaicing" is also a big challenge: for sensors with 2x2, 3x3, or 4x4 color masks, unlike normal demosaicing, the color values have to be interpolated over a larger sensor area and with greater complexity.
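To see why, it helps to compare the two mask layouts directly. The following sketch builds a classic RGGB pattern and a Quad-Bayer pattern and counts how many pixel positions carry the "wrong" color filter for a standard Bayer output; every one of those values has to be estimated during re-mosaicing.

```python
import numpy as np

def bayer_pattern(h, w):
    """Classic RGGB Bayer layout as a character grid."""
    p = np.empty((h, w), dtype="<U1")
    p[0::2, 0::2], p[0::2, 1::2] = "R", "G"
    p[1::2, 0::2], p[1::2, 1::2] = "G", "B"
    return p

def quad_bayer_pattern(h, w):
    """Same layout, but each color covers a 2x2 block of pixels."""
    return bayer_pattern(h // 2, w // 2).repeat(2, axis=0).repeat(2, axis=1)

h = w = 8
mismatch = (bayer_pattern(h, w) != quad_bayer_pattern(h, w)).mean()
print(f"{mismatch:.1%} of pixel positions need re-estimating for a Bayer output")
```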
On the other hand, larger sensors can also cause problems. To keep the camera modules compact, manufacturers have to use lenses that refract light more strongly, which causes chromatic aberration and other artifacts toward the edges of the image. The shallow depth of field can also be a problem for close-ups.
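As a rough illustration of the depth-of-field point, here is a sketch using the standard close-range approximation (DoF ≈ 2 u² N c / f², with the circle of confusion c taken as the sensor diagonal divided by 1500). The larger sensor dimensions are the iPhone 14 Pro values from this article; the smaller comparison sensor, the f/1.8 aperture, and the 30 cm shooting distance are hypothetical.

```python
import math

FULL_FRAME_DIAG = 43.3  # mm, reference for the crop factor

def dof_mm(sensor_w, sensor_h, f_number, distance_mm, equiv_focal=24.0):
    """Approximate total depth of field at close range for a given sensor size,
    assuming the same 24 mm equivalent field of view."""
    diag = math.hypot(sensor_w, sensor_h)
    crop = FULL_FRAME_DIAG / diag
    focal = equiv_focal / crop   # real focal length for the same field of view
    coc = diag / 1500            # circle of confusion
    return 2 * distance_mm**2 * f_number * coc / focal**2

for name, (w, h) in {"9.8 x 7.3 mm (large)": (9.8, 7.3),
                     "6.4 x 4.8 mm (smaller, hypothetical)": (6.4, 4.8)}.items():
    print(f"{name}: ~{dof_mm(w, h, 1.8, 300):.0f} mm depth of field at 30 cm")
```

With the same field of view and f-number, the larger sensor ends up with roughly two thirds of the smaller sensor's depth of field, which is exactly why close-ups get trickier.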
So it remains exciting, and I hope you found this journey into the world of very large and very high-resolution image sensors interesting. What does your dream camera module in a smartphone look like?