Researchers Blind and Deceive LiDAR in Autonomous Vehicles

November 12, 2022 by Ikimi .O

In recent years, researchers worldwide have found that LiDAR isn't a foolproof vision technology in autonomous vehicles.

LiDAR has become the go-to sensing option for object detection in autonomous vehicles (AVs), proving effective even in bad weather and low-light conditions. However, researchers worldwide have identified vulnerabilities that may leave AVs susceptible to spoofing attacks. These attacks can blind LiDAR to certain objects or cause it to make incorrect predictions about those objects' locations.

Researchers have discovered new laser attacks that can delete pedestrians and other obstacles from an AV's view.

This article highlights some of these recent findings and explores how OEMs can guard against both physical and cyber LiDAR attacks.  

Physical Removal Attack "Tricks" LiDAR Object Detection

AV LiDAR systems are particularly susceptible to attacks that target object detection models. Researchers from the University of Florida recently identified a new attack that can delete certain data about obstacles, alter obstacle distance measurements, and create adversarial objects to confuse the systems. These attacks may be physical or virtual, posing a significant threat to LiDAR object detection systems and, by extension, AV users and pedestrians.

PRA schematic. Image used courtesy of arXiv

Because LiDAR uses laser reflections to measure the distance to objects, the researchers hypothesized that attackers could confuse LiDAR systems by creating fake reflections. The team carefully timed an attack in which they shone lasers directly at a LiDAR sensor to "scramble" its object detection. The attack can cause the sensor to discard reflections from genuine obstacles and substitute fake ones.

The team demonstrated this physical removal attack (PRA) against three conventional AV obstacle detectors and showed it remained effective across a 45-degree attack angle.
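
To make the mechanism concrete, the sketch below simulates a PRA's effect on a single scan in Python with NumPy: genuine returns inside the attacked angular window are dropped and replaced with close-range fakes. The function name, parameters, and post-hoc array editing are all illustrative assumptions; the real attack manipulates the sensor itself, not the point cloud after the fact.

```python
import numpy as np

def simulate_pra(points, attack_azimuth_deg=0.0, attack_fov_deg=45.0,
                 spoof_range_m=2.0):
    """Toy post-hoc simulation of a physical removal attack (PRA).

    points: (N, 3) array of x, y, z LiDAR returns, sensor at the origin.
    Genuine returns inside the attacker's ~45-degree horizontal window are
    dropped and replaced with spoofed returns at a close, attacker-chosen
    range. Illustrative only; real attacks exploit sensor timing.
    """
    azimuth = np.degrees(np.arctan2(points[:, 1], points[:, 0]))
    half_fov = attack_fov_deg / 2.0
    in_window = np.abs(azimuth - attack_azimuth_deg) <= half_fov

    kept = points[~in_window]          # returns outside the window survive

    # Fabricate spoofed returns scattered across the attacked window.
    n_spoof = int(in_window.sum())
    spoof_az = np.radians(np.random.uniform(attack_azimuth_deg - half_fov,
                                            attack_azimuth_deg + half_fov,
                                            n_spoof))
    spoofed = np.column_stack([spoof_range_m * np.cos(spoof_az),
                               spoof_range_m * np.sin(spoof_az),
                               np.zeros(n_spoof)])
    return np.vstack([kept, spoofed])
```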

Frustum Attack Alters AV Perception of Object Location

In a similar finding, researchers from Duke University and the University of Michigan identified another LiDAR vulnerability through an attack strategy called a frustum attack. Using a laser gun, an attacker can alter an AV's perception of an obstacle's location.

A frustum attack undermines 3D point cloud LiDAR, which under normal circumstances collects numerous data points from the environment and represents them as dots throughout a 3D space. The attack injects false data points into the object detection pipeline, tricking the system into making hazardous decisions.
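
To illustrate the injection step, here is a minimal Python/NumPy sketch with the sensor assumed at the origin. The helper name, parameter values, and Gaussian scatter are hypothetical; the point is only that spoofed returns placed just behind a genuine obstacle fall inside the same viewing frustum.

```python
import numpy as np

def inject_frustum_points(scan, obstacle_center, n_points=60,
                          offset_m=5.0, spread_m=0.5):
    """Toy frustum attack: append false returns behind a genuine obstacle.

    scan: (N, 3) point cloud with the sensor at the origin.
    obstacle_center: (3,) center of a detected obstacle.
    Spoofed points sit on the sensor-to-obstacle ray just past the
    obstacle, so they share its viewing frustum and can shift where the
    detector believes the obstacle is. Names and values are illustrative.
    """
    direction = obstacle_center / np.linalg.norm(obstacle_center)
    anchor = obstacle_center + direction * offset_m   # behind the obstacle
    fake = anchor + np.random.normal(scale=spread_m, size=(n_points, 3))
    return np.vstack([scan, fake])
```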

Depiction of a frustum attack. Image used courtesy of USENIX

While the researchers noted that attackers are unlikely to physically plant laser guns along roads or on other vehicles to target highway commuters, they warned that military-level attackers could carry out the technique virtually to compromise several AVs at once.

Black Box Algorithm Causes LiDAR to Overlook 3D-printed Obstacles

In 2019, a group of researchers from Baidu Research, the University of Michigan, and the University of Illinois at Urbana-Champaign tested whether they could plant 3D-printed adversarial objects that would jeopardize LiDAR object detection systems and cause collisions.

LiDAR's inability to detect an adversarial object. Image used courtesy of LiDAR-Adv

This attack ran an "evolution-based black box attack algorithm" against the researchers' target AI, Baidu's home-grown autonomous driving platform Apollo, from the outside, without access to the model's internals. The researchers also tested their attack in a white box setting in which they slightly perturbed inputs using gradients to move the output in a predicted direction.

The team hid specific targets and changed labels of samples before testing their attacks in both simulated and real-world settings. They found that their attack algorithm not only deceived LiDAR in simulated environments but also caused Apollo not to detect their 3D-printed adversarial samples in the real world.
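
The general shape of such an evolutionary loop is easy to sketch in Python. The `detector_score` callable below is an assumed stand-in for querying the victim pipeline end to end (for example, rendering the object into a LiDAR scan and reading back the detector's confidence); the authors' actual algorithm differs in its details.

```python
import numpy as np

def evolve_adversarial_shape(vertices, detector_score, generations=200,
                             population=16, sigma=0.01):
    """Minimal sketch of an evolution-based black box attack loop.

    detector_score: callable that scores a candidate shape by querying the
    victim pipeline; lower means the detector is less confident. Each
    generation keeps the random mutation that lowers the score most, so no
    gradients from the model are needed.
    """
    best = vertices
    best_score = detector_score(vertices)
    for _ in range(generations):
        candidates = [best + np.random.normal(scale=sigma, size=best.shape)
                      for _ in range(population)]
        scores = [detector_score(c) for c in candidates]
        winner = int(np.argmin(scores))
        if scores[winner] < best_score:   # keep only strictly better mutants
            best, best_score = candidates[winner], scores[winner]
    return best
```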

Precautions Against LiDAR Vulnerabilities

OEMs can take certain design-level precautions to fortify LiDAR security and efficacy. For instance, in light of the PRA, the University of Florida researchers recommended that developers update both LiDAR sensors and the software that interprets their raw data. OEMs must train their LiDAR software to identify the telltale signatures of spoofed reflections. Larger optical setups (e.g., magnifying lenses) and advanced tracking systems can widen LiDAR's receptive field and accurately trail obstacles, significantly mitigating a PRA.
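
As a rough illustration of what spotting spoofing signatures might involve, the hypothetical filter below flags returns whose measured range disagrees sharply with their angular neighbors, one pattern consistent with injected reflections. It is a sketch under assumed inputs, not a production countermeasure; real defenses would also weigh timing and intensity signals.

```python
import numpy as np

def flag_suspect_returns(ranges, window=5, z_thresh=4.0):
    """Hypothetical spoof-detection heuristic.

    ranges: 1D array of per-beam range measurements, ordered by azimuth.
    A return is flagged when it deviates from the local median by many
    times the scan's typical (robust) spread.
    """
    pad = window // 2
    padded = np.pad(ranges, pad, mode="edge")
    local_med = np.array([np.median(padded[i:i + window])
                          for i in range(len(ranges))])
    resid = np.abs(ranges - local_med)
    mad = np.median(resid) + 1e-6      # robust estimate of typical spread
    return resid / mad > z_thresh      # True = suspicious return
```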

OEMs can also incorporate extra stereo cameras with overlapping fields of view at multiple locations on an AV to counter frustum attacks, independently measure object distance, and ultimately maximize user and pedestrian safety.
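
A consistency check of that kind could look something like the sketch below, with an assumed per-object interface; any real perception stack would fuse many more signals before acting.

```python
def cross_check_depth(lidar_depth_m, stereo_depth_m, rel_tol=0.15):
    """Sketch of a LiDAR/stereo consistency check for one detected object.

    If the two independent depth estimates disagree by more than rel_tol,
    the perception stack can distrust the LiDAR return and fall back to
    the cameras. The interface and tolerance are illustrative assumptions.
    """
    rel_err = abs(lidar_depth_m - stereo_depth_m) / max(stereo_depth_m, 1e-6)
    return rel_err > rel_tol           # True = treat the object as suspect
```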