Bo Yang (Zhejiang University), Yushi Cheng (Tsinghua University), Zizhi Jin (Zhejiang University), Xiaoyu Ji (Zhejiang University) and Wenyuan Xu (Zhejiang University)

With the rapid growth of autonomous driving, in which LiDAR plays a critical role in environment perception, the reliability of LiDAR has recently drawn much attention. LiDARs usually rely on deep neural models for 3D point cloud perception, and such models have been demonstrated to be vulnerable to imperceptible adversarial examples. However, prior work typically manipulates point clouds in the digital domain without considering the physical working principle of an actual LiDAR. As a result, the generated adversarial point clouds may be realizable and effective in simulation, yet they cannot be perceived by physical LiDARs. In this work, we introduce the physical principle of LiDARs and propose a new method for generating 3D adversarial point clouds that conforms to it and can achieve two types of spoofing attacks: object hiding and object creation. We also evaluate the effectiveness of the proposed method against two 3D object detectors on the KITTI vision benchmark.
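The abstract does not specify how the physical constraint is enforced, but one plausible way to respect LiDAR geometry is to restrict perturbations to the range measured along each existing scan ray, since a LiDAR cannot report a return off its fixed azimuth/elevation directions. The Python sketch below is an illustrative assumption, not the authors' method; the function name, parameters, and clipping bound are hypothetical.

```python
import numpy as np

def ray_constrained_perturbation(points, delta_range, max_shift=0.5):
    """
    Illustrative sketch (not the paper's algorithm): perturb a LiDAR point
    cloud only along each point's ray direction, reflecting the physical
    constraint that a LiDAR measures range along fixed scan rays and cannot
    place a return off those rays.

    points      : (N, 3) array of x, y, z coordinates in the LiDAR frame
    delta_range : (N,) array of proposed range perturbations (e.g., produced
                  by an adversarial optimizer -- hypothetical here)
    max_shift   : clip bound on the per-point range change, in meters
    """
    ranges = np.linalg.norm(points, axis=1, keepdims=True)   # (N, 1) measured ranges
    directions = points / np.clip(ranges, 1e-6, None)        # unit ray directions
    shift = np.clip(delta_range, -max_shift, max_shift).reshape(-1, 1)
    new_ranges = np.clip(ranges + shift, 0.0, None)           # ranges stay non-negative
    return directions * new_ranges                            # perturbed points on the same rays


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(-20, 20, size=(1000, 3))
    adv = ray_constrained_perturbation(pts, rng.normal(0, 0.2, size=1000))
    print(adv.shape)  # (1000, 3)
```

Under this assumption, an object-hiding attack would push points belonging to a target object along their rays until the detector misses it, while an object-creating attack would inject returns only on rays the sensor actually scans; the actual constraints used in the paper may differ.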
