Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physical access to the vehicle. We infer that the behavior arises because Tesla FSD’s decision system stops taking the latest perception signals into account once it enters a specific mode. We call such problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
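As a rough illustration of this failure pattern, the sketch below shows a hypothetical decision loop that snapshots the perception state when it enters a maneuver mode and keeps acting on that stale snapshot. The class and signal names are ours for illustration only and do not reflect Tesla FSD internals.

```python
from dataclasses import dataclass
from enum import Enum


class Light(Enum):
    GREEN = "green"
    RED = "red"


@dataclass
class Perception:
    """Minimal stand-in for a perception message (hypothetical)."""
    light: Light


class BuggyDecisionSystem:
    """Illustrates the failed-state-update pattern: perception is cached
    on entering a maneuver mode and never refreshed afterwards."""

    def __init__(self) -> None:
        self.in_maneuver = False
        self._cached: Perception | None = None

    def update(self, perception: Perception) -> str:
        if not self.in_maneuver and perception.light is Light.GREEN:
            # Enter the maneuver mode and snapshot the perception state.
            self.in_maneuver = True
            self._cached = perception
        if self.in_maneuver:
            # Bug: decisions use the stale snapshot, so a light that turns
            # red after mode entry is ignored.
            return "proceed" if self._cached.light is Light.GREEN else "stop"
        return "proceed" if perception.light is Light.GREEN else "stop"


if __name__ == "__main__":
    ds = BuggyDecisionSystem()
    print(ds.update(Perception(Light.GREEN)))  # proceed (enters maneuver mode)
    print(ds.update(Perception(Light.RED)))    # proceed -- stale state runs the red light
```

In this toy version the fix is simply to re-read the current perception on every decision cycle instead of the cached snapshot; the point of the sketch is only to show how a missed state update can make a decision module deterministically ignore a changed signal.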
