Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensor, and without physically accessing the vehicle. We infer that this behavior occurs because Tesla FSD's decision system fails to take in the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
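The abstract does not include implementation details, but the failure pattern it describes (a decision system that stops consuming fresh perception signals after entering a specific mode) can be illustrated with a minimal sketch. The Python below is purely hypothetical: the class and mode names are invented for illustration and do not reflect Tesla's actual software. It shows a planner that snapshots perception state on mode entry and never refreshes it, so a traffic light that turns red after entry is ignored.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    NORMAL = auto()
    CREEP = auto()  # hypothetical "specific mode" entered at an intersection


@dataclass
class Perception:
    light_is_red: bool


class BuggyPlanner:
    """Illustrates the failed-state-update pattern: perception is cached
    on mode entry and never refreshed while the mode is active."""

    def __init__(self) -> None:
        self.mode = Mode.NORMAL
        self._cached: Perception | None = None

    def enter_creep(self, perception: Perception) -> None:
        self.mode = Mode.CREEP
        self._cached = perception  # snapshot taken once; later updates are ignored

    def should_proceed(self, latest: Perception) -> bool:
        if self.mode is Mode.CREEP:
            # BUG: decision is based on the stale snapshot, not `latest`
            return not self._cached.light_is_red
        return not latest.light_is_red


if __name__ == "__main__":
    planner = BuggyPlanner()
    # Light is green when the planner enters the mode...
    planner.enter_creep(Perception(light_is_red=False))
    # ...then turns red, but the cached state still says green:
    print(planner.should_proceed(Perception(light_is_red=True)))  # True -> runs the red light
```

Under this sketch's assumptions, the fix is simply to decide on `latest` in every mode rather than on the snapshot, which is why the abstract frames the bug as a failed state update rather than a perception failure.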
