Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD's decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
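To illustrate the failed-state-update pattern described above, the following minimal Python sketch (entirely hypothetical; all names such as BuggyPlanner, Perception, and the CREEP mode are illustrative assumptions, not Tesla code) shows a decision loop that latches a perception snapshot when it enters a maneuver mode and never re-reads it, so a light that turns red after mode entry is ignored.

    # Hypothetical sketch of the "failed state update" pattern, assuming a
    # simple planner with two modes. Not derived from any vendor's code.
    from dataclasses import dataclass
    from enum import Enum

    class Light(Enum):
        GREEN = "green"
        RED = "red"

    @dataclass
    class Perception:
        light: Light

    class BuggyPlanner:
        """Once the planner enters CREEP mode, it keeps using the perception
        snapshot captured at mode entry instead of the current input."""

        def __init__(self):
            self.mode = "NORMAL"
            self._snapshot = None

        def step(self, perception: Perception) -> str:
            if self.mode == "NORMAL" and perception.light is Light.GREEN:
                self.mode = "CREEP"
                self._snapshot = perception      # state latched at mode entry
            if self.mode == "CREEP":
                # Bug: the decision reads the stale snapshot, not the
                # latest perception signal passed into this step.
                return "GO" if self._snapshot.light is Light.GREEN else "STOP"
            return "GO" if perception.light is Light.GREEN else "STOP"

    if __name__ == "__main__":
        planner = BuggyPlanner()
        print(planner.step(Perception(Light.GREEN)))  # GO (enters CREEP mode)
        print(planner.step(Perception(Light.RED)))    # GO (stale state ignores red)

The fix in this toy setting is simply to decide on the freshest perception input every cycle; the sketch only demonstrates why a missed state refresh can deterministically produce an unsafe action.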
