Zhongyuan Hau, Kenneth Co, Soteris Demetriou, and Emil Lupu (Imperial College London)

Best Short Paper Award Runner-up!

LiDARs play a critical role in Autonomous Vehicles' (AVs) perception and their safe operation. Recent works have demonstrated that it is possible to spoof LiDAR return signals to elicit fake objects. In this work, we demonstrate how the same physical capabilities can be used to mount a new, even more dangerous class of attacks, namely Object Removal Attacks (ORAs). ORAs aim to force 3D object detectors to fail. We leverage the default setting of LiDARs that record a single return signal per direction to perturb point clouds in the region of interest (RoI) of 3D objects. By injecting illegitimate points behind the target object, we effectively shift points away from the target object's RoI. Our initial results using a simple random point selection strategy show that the attack is effective in degrading the performance of commonly used 3D object detection models.
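
The sketch below illustrates the random point selection strategy described above, under the assumptions stated in the abstract: points inside the target object's RoI are relocated farther along their LiDAR rays, so a single-return sensor records only the spoofed farther return and the object loses points in its RoI. This is not the authors' implementation; the function and parameter names (ora_random, budget, shift_distance) are illustrative assumptions.

```python
# Minimal sketch of an Object Removal Attack (ORA) with random point
# selection. Names and parameters are illustrative assumptions, not the
# authors' code.
import numpy as np

def ora_random(points, roi_mask, budget=200, shift_distance=10.0, rng=None):
    """Relocate up to `budget` randomly chosen points inside the target
    object's region of interest (RoI) to positions farther along their
    LiDAR rays, emulating spoofed single-return signals behind the object.

    points:         (N, 3) array of x, y, z coordinates, sensor at origin.
    roi_mask:       (N,) boolean mask marking points inside the target RoI.
    budget:         maximum number of points the attacker can perturb.
    shift_distance: extra range (metres) added along each chosen ray.
    """
    rng = np.random.default_rng() if rng is None else rng
    attacked = points.copy()

    roi_indices = np.flatnonzero(roi_mask)
    if roi_indices.size == 0:
        return attacked

    chosen = rng.choice(roi_indices,
                        size=min(budget, roi_indices.size),
                        replace=False)

    # Each return lies on a ray from the sensor origin; pushing a point
    # along its unit ray direction places the (spoofed) single return
    # behind the target object, so the legitimate return is lost.
    rays = attacked[chosen]
    ranges = np.linalg.norm(rays, axis=1, keepdims=True)
    unit_dirs = rays / np.clip(ranges, 1e-6, None)
    attacked[chosen] = rays + shift_distance * unit_dirs
    return attacked


if __name__ == "__main__":
    # Toy example: a cluster of points roughly 10 m ahead of the sensor.
    rng = np.random.default_rng(0)
    cloud = rng.normal(loc=[10.0, 0.0, 0.0], scale=0.3, size=(500, 3))
    in_roi = np.linalg.norm(cloud - [10.0, 0.0, 0.0], axis=1) < 1.0
    perturbed = ora_random(cloud, in_roi, budget=100, rng=rng)
    moved_out = (np.sum(np.linalg.norm(perturbed - [10.0, 0.0, 0.0], axis=1) >= 1.0)
                 - np.sum(~in_roi))
    print("points moved out of the RoI:", moved_out)
```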
