Apple Brings LiDAR to the iPhone 12 Pro for Augmented Reality Applications
Apple's inclusion of a LiDAR-capable sensor in the iPhone 12 Pro opened doors for augmented reality applications. But using LiDAR with AR is not without its design challenges.
Light detection and ranging, or LiDAR for short, is a popular method of creating a 3D map of a surrounding environment.
LiDAR works on the time-of-flight principle: a signal is sent out from the sensor toward an object, the sensor measures how long the signal takes to return, and the speed of light is used to calculate how far away that object is. LiDAR applies this principle at scale, firing laser pulses at a surface, sometimes up to 150,000 per second.
A LiDAR-generated depth map. Image used courtesy of the U.S. Geological Survey and GCN
Using the time-of-flight information to calculate distance and repeating this in quick succession allows the sensor to build up a complex map of the surface it is measuring.
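As a rough sketch of the underlying math, the one-way distance is simply the speed of light multiplied by the measured round-trip time, divided by two. The example below is illustrative only; the constant and function names are not taken from any particular LiDAR implementation.

```swift
import Foundation

/// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

/// One-way distance to a target given the measured round-trip time of a pulse.
/// The pulse travels to the object and back, so the distance is half the total path.
func distance(roundTripTime t: Double) -> Double {
    return speedOfLight * t / 2.0
}

// Example: a pulse that returns after roughly 33 nanoseconds
// corresponds to a target about 5 meters away.
let d = distance(roundTripTime: 33e-9)
print(String(format: "Target distance: %.2f m", d))  // ~4.95 m
```

Notably, a return arriving after about 33 ns already corresponds to roughly 5 m, which is the order of range Apple's sensor is reported to cover.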
This technology has proven to be particularly useful in augmented reality (AR) applications, so much so that Apple's newest product, the iPhone 12 Pro, integrates a LiDAR-capable sensor for AR applications.
LiDAR and Augmented Reality
In AR applications, the ability to create a highly accurate and detailed depth map of an environment is extremely valuable. Less advanced AR systems often rely on a single camera view, so their depth estimates are easily thrown off by poor lighting, an unavoidable variable in the real world.
LiDAR, on the other hand, can generate an accurate depth map almost instantly, regardless of ambient lighting. This allows AR models to be built within any given environment, with more accurate depth perception and placement of virtual objects.
How Apple Has Merged LiDAR Into the iPhone 12 Pro
Apple has introduced LiDAR in its newest iPhone, a move the company claims allows the device to create extremely realistic AR experiences. For example, a more accurate depth map allows the iPhone 12 Pro to better understand which objects sit in front of others, meaning AR characters and objects will be accurately occluded and oriented in their environment.
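For developers, this depth information is surfaced through ARKit on LiDAR-equipped iPhones. The sketch below shows one way the per-frame depth map and mesh-based scene reconstruction might be requested; it assumes a recent iOS version and a LiDAR-capable device, and the helper function name is purely illustrative.

```swift
import ARKit

// A minimal sketch of opting in to the LiDAR-backed depth and
// scene-reconstruction features ARKit exposes on devices like the iPhone 12 Pro.
// Actual capabilities depend on the device and iOS version.
func configureLiDARSession(for session: ARSession) {
    let config = ARWorldTrackingConfiguration()

    // Per-frame depth maps (ARFrame.sceneDepth) are only offered on
    // LiDAR-equipped devices, so check support before enabling them.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    // Scene reconstruction builds a triangle mesh of the surroundings,
    // which is what lets virtual objects be occluded by real ones.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    session.run(config)
}
```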
Apple's LiDAR scanner also enables Night mode portraits and, combined with the company's A14 Bionic chip, is said to deliver up to six times faster autofocus in low-light environments.
Image captured with the iPhone 12 Pro's LiDAR scanner. Image used courtesy of Apple
Forbes contributor Sabbir Rangwala sheds more light on the details of Apple's LiDAR technology, explaining that it operates at the 8XX nm wavelength. In addition, it implements photon-counting detectors—namely, single-photon avalanche photodiodes or "SPADs"—and vertical-cavity surface-emitting lasers (VCSELs).
The iPhone 12 Pro LiDAR uses an architecture (flash illumination and VCSELs) that reflects a focus on size, space constraints, and cost. Rangwala claims the LiDAR "has a range of ~ 5m and a limited FoV—perfectly adequate for the types of consumer applications that it is meant to promote."
The Design Challenges of LiDAR for AR
While LiDAR promises huge benefits for AR, it does not come without serious design challenges. For starters, the power required and the heat generated by LiDAR systems are serious concerns for LiDAR system designers.
This is because the avalanche photodiodes and lasers used in LiDAR systems can, in some cases, require hundreds of volts to operate. Besides the large power consumption that comes with these high operating voltages, hundreds of volts is not a realistic operating point inside an embedded device like a smartphone.
The electrical architecture of LiDAR. Image used courtesy of Analog Devices
Beyond this, the system's SNR limits the range of a LiDAR, typically to somewhere between 100 m and 300 m. A key contributor is the noise floor of the ADC that converts the received pulses into the digital domain, where the time-of-flight calculations occur. While this range is acceptable in many applications, it can be limiting in others.
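A back-of-the-envelope calculation shows why the receiver's timing and noise performance are so demanding. The numbers below are illustrative rather than specific to any product.

```swift
import Foundation

let speedOfLight = 299_792_458.0  // m/s

/// Round-trip time for a pulse to reach a target at `range` meters and return.
func roundTripTime(range: Double) -> Double {
    return 2.0 * range / speedOfLight
}

/// Timing resolution needed to distinguish two returns
/// separated by `rangeResolution` meters.
func requiredTimingResolution(rangeResolution: Double) -> Double {
    return 2.0 * rangeResolution / speedOfLight
}

// A 300 m return arrives after only about 2 microseconds...
print(roundTripTime(range: 300))                        // ~2.0e-6 s
// ...and resolving range to 1 cm demands roughly 67 ps of timing precision,
// which is why the receive chain's noise floor matters so much.
print(requiredTimingResolution(rangeResolution: 0.01))  // ~6.7e-11 s
```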
LiDAR in Everyday Devices?
LiDAR is an older technology finding a new purpose today thanks to its benefits to AR systems. Apple's move to add LiDAR capabilities to its newest hardware is a step forward in adding AR capabilities to everyday devices. Many believe that this advance may be a precursor to an AR headset from Apple in the near future.
The design challenges listed here are far from exhaustive; many hurdles must still be overcome before LiDAR can be universally accepted in the world of AR. Still, its benefits outweigh its challenges, and it may become a more ubiquitous AR technology as time goes on.