Hongchao Zhang (Washington University in St. Louis), Zhouchi Li (Worcester Polytechnic Institute), Shiyu Cheng (Washington University in St. Louis), Andrew Clark (Washington University in St. Louis)

GM AutoDriving Security Award Winner ($1,000 cash prize)!

Autonomous vehicles rely on LiDAR sensors to detect obstacles such as pedestrians, other vehicles, and fixed infrastructure. LiDAR spoofing attacks have been demonstrated that either create erroneous obstacles or prevent detection of real obstacles, resulting in unsafe driving behaviors. In this paper, we propose an approach to detect and mitigate LiDAR spoofing attacks by leveraging LiDAR scan data from neighboring vehicles. This approach exploits the fact that spoofing attacks can typically be mounted on only one vehicle at a time and introduce additional points into the victim's scan that can be readily detected by comparison with other, unmodified scans. We develop a Fault Detection, Identification, and Isolation (FDII) procedure that identifies non-existent obstacle, physical removal, and adversarial object attacks, while also estimating the actual locations of obstacles. We propose a control algorithm that guarantees that these estimated obstacle locations are avoided. We validate our framework through a CARLA simulation study, in which we verify that our FDII algorithm correctly detects each attack pattern.
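To illustrate the cross-vehicle consistency check described above, the sketch below flags points in a victim's scan that a neighboring vehicle's scan does not corroborate. This is not the paper's FDII implementation; the KD-tree lookup, the 0.5 m corroboration radius, and the assumption that both scans are already in a shared frame are illustrative choices.

```python
# Minimal sketch (assumed, not the authors' FDII procedure): mark points in a
# victim's LiDAR scan that have no nearby counterpart in a neighboring
# vehicle's scan, after both scans are expressed in a common world frame.
import numpy as np
from scipy.spatial import cKDTree

def flag_suspicious_points(victim_scan, neighbor_scan, radius=0.5):
    """Return a boolean mask over victim_scan marking points that the
    neighbor's scan does not corroborate within `radius` meters."""
    tree = cKDTree(neighbor_scan)            # index the unmodified neighbor scan
    dists, _ = tree.query(victim_scan, k=1)  # nearest-neighbor distance per point
    return dists > radius                    # True = point lacks corroboration

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    genuine = rng.uniform(-10, 10, size=(500, 3))            # real obstacle returns
    spoofed = rng.uniform(20, 25, size=(30, 3))               # injected fake-obstacle points
    victim = np.vstack([genuine, spoofed])
    neighbor = genuine + rng.normal(0, 0.05, genuine.shape)   # neighbor sees the same scene
    mask = flag_suspicious_points(victim, neighbor)
    print(f"{mask.sum()} of {len(victim)} points lack cross-vehicle corroboration")
```

In this toy setting, only the injected points fall outside the corroboration radius, which mirrors the paper's observation that spoofed points stand out when compared against an unmodified scan of the same scene.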
