[Technical Tear-down] Why Is the Honor View20's 48MP Camera Superior to Others?

shashank1320


Smartphone photography has made tremendous progress in recent years, especially in camera sensors -- the soul of any digital camera. Camera sensors determine the clarity of your photos and how large you can scale them up. From 300,000 pixels years ago to as many as 48 million today, sensors have advanced at an astonishing speed, and smartphone cameras now rival even high-performance SLR cameras.
The Sony IMX586 and the Samsung GM1 are the only two 48-megapixel smartphone sensors on the market today. Although both are technically 48MP sensors, their capabilities are not equal: image quality depends not only on the number of pixels but also on how those pixels are arranged. Taking this into account, the Sony IMX586 has the edge, as it can transform its pixel structure to achieve true 48MP photography. The Samsung GM1, on the other hand, behaves more like a 12MP sensor, because its pixel arrangement cannot be converted in the same way.
To understand the difference, we first need to look at how a sensor captures light and creates an image. There are millions of picture elements (pixels) on an image sensor. Pixels can detect the intensity of light, but they cannot distinguish colors. An image sensor without color filters can only produce black-and-white photos.
To produce color photos, the RGB color model is used – in which red, green and blue light are added together in various ways to reproduce a broad array of colors. One method is to put red, green, and blue filters in front of three image sensors respectively and combine the images to create one photo. But this is not practical for smartphones, as it makes the cameras very bulky and expensive.
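To make the additive model concrete, here is a minimal NumPy sketch of the three-sensor idea (the array names and frame size are illustrative, not from any real camera): three monochrome exposures, one per color filter, stacked into a single full-color image.

```python
import numpy as np

# Three hypothetical monochrome exposures, one per color filter, as the
# three-sensor method would capture them (same scene, same resolution).
red   = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
green = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
blue  = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

# Additive RGB: stacking the three planes yields one full-color image.
color = np.stack([red, green, blue], axis=-1)
print(color.shape)  # (480, 640, 3)
```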

In 1974, Bryce Bayer invented the Bayer filter, which employs what is called the "Bayer Pattern," a checkerboard-like arrangement of red, green, and blue pixels on a square grid of photosensors. This pattern of filters is placed on top of a single image sensor to make each pixel respond to either red, green, or blue light, allowing digital cameras to capture vivid color images. Then, a computing process known as demosaicing converts the raw mosaic into a single full-color image. Most digital image sensors use the Bayer filter today.
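As a rough illustration of how a Bayer filter and demosaicing work together, here is a small NumPy sketch. It is not any vendor's algorithm: `bayer_mosaic` samples a full-color image through an RGGB pattern, and `demosaic_naive` crudely collapses each 2x2 cell back into one RGB pixel; real demosaicing interpolates the missing samples instead, keeping full resolution.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-color image through an RGGB Bayer pattern:
    each photosite keeps only one of R, G, or B."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def demosaic_naive(mosaic):
    """Crude demosaic: collapse each 2x2 RGGB cell into one RGB pixel,
    averaging the two green samples."""
    m = mosaic.astype(np.float32)
    r = m[0::2, 0::2]
    g = (m[0::2, 1::2] + m[1::2, 0::2]) / 2
    b = m[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

rgb = np.random.randint(0, 256, (4, 6, 3), dtype=np.uint8)
print(demosaic_naive(bayer_mosaic(rgb)).shape)  # (2, 3, 3)
```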

[Diagram: the Bayer filter pattern]

Both Sony and Samsung sensors use Bayer filter arrays to capture light and produce photos.
A traditional Bayer filter repeats a 2x2 pattern (2 green, 1 blue, and 1 red). The Sony IMX586 instead uses the Quad Bayer color filter array, in which adjacent 2x2 pixels share the same color, making high-sensitivity shooting possible. During low-light shooting, the signals from the four adjacent pixels are added together, raising the sensitivity to a level equivalent to that of 1.6 μm pixels (12-megapixel output) and resulting in bright, low-noise images.
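The 2x2 binning step is simple enough to sketch. Assuming a raw frame laid out in the Quad Bayer pattern described above (the function name and toy frame size are illustrative, not Sony's implementation), the four same-color neighbors are simply summed:

```python
import numpy as np

def quad_bayer_bin(raw):
    """Sum each 2x2 same-color group of a Quad Bayer raw frame.

    In a Quad Bayer array the four neighbors under one filter tile share
    a color, so their signals can be added directly, trading resolution
    (48MP -> 12MP) for roughly four times the collected light.
    """
    r = raw.astype(np.float32)  # avoid integer overflow when summing
    return r[0::2, 0::2] + r[0::2, 1::2] + r[1::2, 0::2] + r[1::2, 1::2]

# Toy frame standing in for the real 8000 x 6000 (48MP) sensor readout;
# binning halves each dimension, e.g. 8000 x 6000 -> 4000 x 3000 (12MP).
raw = np.random.randint(0, 1024, (80, 60), dtype=np.uint16)
print(quad_bayer_bin(raw).shape)  # (40, 30)
```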
When shooting bright scenes, such as an outdoor environment during the daytime, the sensor's built-in signal processing performs array conversion (see diagram below), making it possible to obtain high-definition 48-effective-megapixel images in real time. Thus, the Sony IMX586 can produce true 48MP photos.

[Diagram: Quad Bayer array conversion to a standard Bayer layout]
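The conversion itself is proprietary on-sensor processing, but the basic idea, turning a Quad Bayer layout into a conventional Bayer layout while keeping all 48 million samples, can be sketched as a pixel rearrangement. The mapping below is a simplified nearest-sample shuffle for illustration only; Sony's actual remosaicing interpolates rather than merely swapping pixels.

```python
import numpy as np

# Inside each 4x4 Quad Bayer tile (a 2x2 red block, two 2x2 green blocks,
# and a 2x2 blue block), pick a nearby source photosite of the color that
# a standard RGGB Bayer pattern expects at every position.
# Keys are target (row, col) in the tile; values are source (row, col).
REMAP = {
    (0, 0): (0, 0), (0, 2): (0, 1), (2, 0): (1, 0), (2, 2): (1, 1),  # red
    (1, 1): (2, 2), (1, 3): (2, 3), (3, 1): (3, 2), (3, 3): (3, 3),  # blue
    (0, 1): (0, 2), (0, 3): (0, 3), (1, 0): (2, 0), (1, 2): (1, 2),  # green
    (2, 1): (2, 1), (2, 3): (1, 3), (3, 0): (3, 0), (3, 2): (3, 1),  # green
}

def remosaic(quad):
    """Rearrange a Quad Bayer raw frame into a standard Bayer mosaic,
    preserving the full pixel count (e.g. all 48 million samples)."""
    out = np.empty_like(quad)
    for (tr, tc), (sr, sc) in REMAP.items():
        out[tr::4, tc::4] = quad[sr::4, sc::4]
    return out

# Any frame whose dimensions are multiples of 4 keeps its size:
quad = np.random.randint(0, 1024, (8, 8), dtype=np.uint16)
print(remosaic(quad).shape)  # (8, 8)
```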

Based on the data Samsung has released, the GM1 sensor adopts a similar Quad Bayer color filter but lacks this conversion mechanism, so each 2x2 same-color group can only be read out together as a single value. As a result, while the Samsung GM1 physically has 48 million pixels, it behaves like a 12-effective-megapixel sensor. According to its specifications, the GM1 supports photo resolutions only up to 4000×3000, i.e. 12 megapixels (4000 × 3000 = 12,000,000 pixels).


The main difference between the Sony IMX586 and the Samsung GM1 is that the former can produce true 48-megapixel photos by transforming its pixel structure, while the latter cannot. As a result, photos produced by the Sony IMX586 are superior: higher resolution, greater clarity, and more detail retained even after zooming.