Christopher DiPalma, Ningfei Wang, Takami Sato, and Qi Alfred Chen (UC Irvine)

Robust perception is crucial for autonomous vehicle security. In this work, we design a practical adversarial patch attack against camera-based obstacle detection. We identify the back of a box truck as an effective attack vector. We also improve attack robustness by optimizing over a variety of input frames associated with the attack scenario. This demo includes videos showing that our attack can cause end-to-end consequences on a representative autonomous driving system in a simulator.
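The robustness idea above, averaging the attack objective over many frames of the scenario rather than optimizing for a single image, can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: the "detector" here is a hypothetical sigmoid-of-dot-product surrogate standing in for a real obstacle-detection network, and the patch is optimized by gradient descent on the mean detection confidence across frames (in the spirit of Expectation over Transformation).

```python
import numpy as np

rng = np.random.default_rng(0)

H = W = 16          # toy image size
PH = PW = 6         # patch size
py, px = 5, 5       # patch placement (top-left corner), e.g. the truck's rear

# Stand-in "detector": confidence = sigmoid(<weights, image>).
# A real attack would backpropagate through a CNN obstacle detector;
# this linear surrogate only keeps the sketch self-contained.
weights = rng.normal(size=(H, W))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def apply_patch(frame, patch):
    out = frame.copy()
    out[py:py + PH, px:px + PW] = patch
    return out

def confidence(img):
    return sigmoid(np.sum(weights * img))

# Frames approximating the attack scenario (different backgrounds /
# viewpoints); the gradient is averaged over all of them so the patch
# works across the scenario, not just on one image.
frames = [rng.uniform(0, 1, size=(H, W)) for _ in range(8)]

patch = rng.uniform(0, 1, size=(PH, PW))
lr = 0.5
for step in range(200):
    grad = np.zeros_like(patch)
    for f in frames:
        c = confidence(apply_patch(f, patch))
        # d(confidence)/d(patch) for the sigmoid-of-dot-product surrogate
        grad += c * (1 - c) * weights[py:py + PH, px:px + PW]
    # Descend to suppress detection confidence; clip to valid pixel range
    patch = np.clip(patch - lr * grad / len(frames), 0.0, 1.0)

gray = np.full((PH, PW), 0.5)  # benign reference patch
before = np.mean([confidence(apply_patch(f, gray)) for f in frames])
after = np.mean([confidence(apply_patch(f, patch)) for f in frames])
print(f"mean confidence before: {before:.3f}, after: {after:.3f}")
```

Averaging the gradient over the frame set is what distinguishes this from a single-image attack: a patch tuned to one frame can fail as the vehicle approaches and the viewpoint changes, while the averaged objective rewards patches that suppress detection across the whole approach.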
