Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that the behavior arises because Tesla FSD's decision system fails to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
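To make the "failed state update" pattern concrete, the sketch below shows a minimal, hypothetical planner whose decision logic caches a perception snapshot when it enters a special mode and never refreshes it. All names (PerceptionFrame, Planner, enter_intersection_mode) are invented for illustration only and are not taken from the Tesla FSD software or any other vendor's codebase.

```python
# Hypothetical illustration of the "failed state update" (Pringles Syndrome)
# failure pattern. This is a sketch under assumed names, not real FSD code.

from dataclasses import dataclass


@dataclass
class PerceptionFrame:
    light_state: str  # "RED", "YELLOW", or "GREEN"


class Planner:
    def __init__(self):
        self.mode = "CRUISE"
        self.cached_frame = None  # snapshot taken on mode entry

    def enter_intersection_mode(self, frame: PerceptionFrame):
        # BUG PATTERN: the perception snapshot is cached once when the
        # special mode is entered and is never refreshed afterwards.
        self.mode = "INTERSECTION"
        self.cached_frame = frame

    def decide(self, latest_frame: PerceptionFrame) -> str:
        if self.mode == "INTERSECTION":
            # Decision uses the stale snapshot, so a light that turned
            # RED after mode entry is ignored.
            frame = self.cached_frame
        else:
            frame = latest_frame
        return "STOP" if frame.light_state == "RED" else "PROCEED"


# Example: the light was GREEN at mode entry but is RED now; the buggy
# planner still proceeds because it reasons over the cached snapshot.
planner = Planner()
planner.enter_intersection_mode(PerceptionFrame(light_state="GREEN"))
print(planner.decide(PerceptionFrame(light_state="RED")))  # -> "PROCEED"
```

A correct implementation would re-read the latest perception frame (or re-validate the cached snapshot) on every planning cycle, regardless of the current mode.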
