With its Galaxy S20 Ultra smartphone, Samsung broke the 108-megapixel barrier on a phone camera for the first time, but the company doesn’t plan to stop there.
In a blog post on the company’s website, the head of Samsung’s sensor business team, Yongin Park, outlines — in broad terms — the company’s ambitious plans for image sensors, and some of the obstacles it needs to overcome to achieve them.
“To fit millions of pixels in today’s smartphones that feature other cutting-edge specs like high screen-to-body ratios and slim designs, pixels inevitably have to shrink so that sensors can be as compact as possible. On the flip side, smaller pixels can result in fuzzy or dull pictures, due to the smaller area through which each pixel receives light information. The impasse between the number of pixels a sensor has and pixels’ sizes has become a balancing act that requires solid technological prowess,” he explains.
Samsung has been able to balance these two demands with its Isocell tech, which isolates pixels with a unique material to prevent light from leaking into neighbouring pixels. The company later introduced its Tetracell and Nonacell technologies, which merge 2×2 and 3×3 arrays of pixels so that each combined pixel absorbs more light. Finally, Samsung also shrank the pixel size to 0.7μm; something Park claims was widely believed to be impossible.
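The idea behind Tetracell and Nonacell is pixel binning: combining a block of small pixels into one larger effective pixel that gathers more light, at the cost of resolution. A minimal sketch of that trade-off, with an illustrative grid of made-up light values (the function name and data are hypothetical, not from Samsung):

```python
def bin_pixels(sensor, n):
    """Merge n x n blocks of pixels by summing their light values,
    trading resolution for per-pixel light sensitivity."""
    h, w = len(sensor), len(sensor[0])
    return [
        [sum(sensor[y + dy][x + dx] for dy in range(n) for dx in range(n))
         for x in range(0, w, n)]
        for y in range(0, h, n)
    ]

# A dim 4x4 capture: each tiny pixel caught only a little light.
raw = [
    [1, 2, 1, 2],
    [2, 1, 2, 1],
    [1, 2, 1, 2],
    [2, 1, 2, 1],
]

# Tetracell-style 2x2 binning: a 4x4 grid becomes a brighter 2x2 image.
print(bin_pixels(raw, 2))  # [[6, 6], [6, 6]]
```

Nonacell works the same way with `n = 3`, so a 108-megapixel sensor can output a much brighter 12-megapixel image in low light.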
So, how far will Samsung go? Park mentions a 600-megapixel sensor, though he doesn’t offer many details. It’s not just about smartphones, he says; major applications for image sensors are “expected to expand soon into other rapidly-emerging fields such as autonomous vehicles, IoT and drones.”
Interestingly, this means that in the future, these sensors will be in some ways more powerful than our eyes, which Park says match a resolution of “around 500 megapixels.” In the future, “we might even have sensors that can see microbes not visible to the naked eye,” he says.