What is the LiDAR scanner present on the iPhone 12 Pro and Pro Max?
With the new iPhone camera, Apple promises three things: photo and video effects, augmented reality, and the scanning of objects and spaces. The last two are aimed mainly at developers and businesses looking to create visual maps.
But a general user will be satisfied with the photo and video effects. Beyond its applications in augmented reality, the LiDAR sensor works by emitting laser pulses and measuring how long they take to bounce back, which gives the distance to each point. On the iPhone 12 Pro and Pro Max, Apple uses it to improve autofocus: the company says autofocus is six times faster in low light than with a conventional camera. The LiDAR sensor also enables better-quality portrait photos in low-light conditions. The concept of using a dedicated depth sensor is not new.
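The distance measurement behind this is the time-of-flight principle: a pulse travels to a surface and back, so the distance is half the round trip at the speed of light. A minimal sketch of that arithmetic (illustrative only, not Apple's implementation):

```python
C = 299_792_458  # speed of light in m/s

def distance_from_round_trip(seconds: float) -> float:
    """Distance to a surface given a laser pulse's round-trip time."""
    # The pulse travels out and back, so halve the total path length.
    return C * seconds / 2

# A surface about 5 m away returns the pulse after roughly 33 nanoseconds.
print(distance_from_round_trip(33.36e-9))  # ≈ 5.0 m
```

The tiny timescales involved (tens of nanoseconds for room-sized distances) are why such sensors need very fast timing electronics.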
Many phones use depth sensors to improve the quality of portrait photos, the OnePlus Nord among them, but these sensors work mainly in well-lit environments. Some models, such as the Huawei P30 Pro and the Samsung Galaxy S20 Ultra, have a time-of-flight (ToF) sensor that uses infrared light to measure scene depth.
Both ToF and LiDAR sensors scan the environment several meters away, but some versions of the latter measure distances up to 100 meters; such long-range sensors are usually found in cars. The advantage of the LiDAR scanner is that it sweeps many small laser pulses across the scene, measuring each point individually, while a ToF sensor sends a single light pulse to measure the whole space, which compromises accuracy.
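The point-by-point scanning described above can be pictured as building a small depth map: each pulse gets its own round-trip time and therefore its own distance. A minimal sketch with made-up timings (a real sensor measures many thousands of points):

```python
C = 299_792_458  # speed of light in m/s

def depth_map(round_trip_times):
    """Convert a grid of per-pulse round-trip times (s) into distances (m)."""
    return [[C * t / 2 for t in row] for row in round_trip_times]

# Hypothetical 2x3 grid of timings: shorter round trips on the left
# indicate a nearer object, longer ones on the right a farther wall.
times = [
    [20e-9, 20e-9, 33e-9],
    [20e-9, 33e-9, 33e-9],
]
for row in depth_map(times):
    print([round(d, 1) for d in row])
```

A single-flash ToF reading, by contrast, yields far less spatial detail, which is the accuracy trade-off mentioned above.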
With a LiDAR sensor on the iPhone, developers will most likely be encouraged to find new ways to use it, primarily in augmented reality. Snapchat became the first app to launch iOS lenses that use LiDAR.