Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD's decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that needs particular attention in autonomous driving software testing and development.
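The failure pattern described above, where a decision layer latches onto perception state sampled at mode entry and never refreshes it, can be illustrated with a minimal, purely hypothetical sketch. The class, method, and field names below are our own and do not reflect any vendor's actual implementation; the sketch only shows the general stale-state bug and its straightforward fix.

```python
from dataclasses import dataclass


@dataclass
class Perception:
    """Latest perception output; here reduced to a single traffic-light flag."""
    light_is_red: bool


class DecisionModule:
    """Hypothetical planner illustrating the failed-state-update pattern."""

    def __init__(self) -> None:
        self.mode = "NORMAL"
        self.cached_light_is_red = False  # snapshot taken on mode entry

    def enter_intersection_mode(self, perception: Perception) -> None:
        self.mode = "INTERSECTION"
        # Bug pattern: perception is sampled once when the mode is entered and
        # never refreshed, so a light that later turns red is never seen.
        self.cached_light_is_red = perception.light_is_red

    def should_proceed(self, perception: Perception) -> bool:
        if self.mode == "INTERSECTION":
            # Buggy: the decision uses the stale snapshot.
            return not self.cached_light_is_red
            # A corrected variant would re-read the live signal instead:
            # return not perception.light_is_red
        return not perception.light_is_red


if __name__ == "__main__":
    planner = DecisionModule()
    planner.enter_intersection_mode(Perception(light_is_red=False))  # green at entry
    # The light turns red afterwards, but the planner still decides to proceed.
    print(planner.should_proceed(Perception(light_is_red=True)))     # prints True
```

In this sketch the fix is a one-line change: base the decision on the perception signal passed in at decision time rather than on the snapshot cached at mode entry.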
