Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave in this way without tampering with or interfering with any sensors and without physically accessing the vehicle. We infer that the behavior is caused by Tesla FSD’s decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that such a failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
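To make the described failure pattern concrete, the sketch below models a planner that latches the perception snapshot it saw when committing to proceed through an intersection and then stops consuming fresh signals. This is only an illustration of the "failed state update" pattern under assumed names; the classes, fields, and modes are hypothetical and are not taken from Tesla FSD or any of the other implementations studied.

```python
# Hypothetical sketch of a "failed state update": once the planner enters a
# committed mode, it stops consuming fresh perception signals.
# All names here are illustrative, not actual FSD internals.
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    TRACKING = auto()   # re-evaluates the traffic light every tick
    COMMITTED = auto()  # has decided to proceed through the intersection


@dataclass
class Perception:
    light_is_green: bool


class BuggyPlanner:
    """Latches the perception snapshot seen at commit time (the bug)."""

    def __init__(self) -> None:
        self.mode = Mode.TRACKING
        self.cached_light_is_green = False

    def step(self, perception: Perception) -> str:
        if self.mode == Mode.TRACKING:
            self.cached_light_is_green = perception.light_is_green
            if self.cached_light_is_green:
                self.mode = Mode.COMMITTED  # commits and stops updating
        # Bug: in COMMITTED mode the decision relies on the stale cached
        # value, so a light that turns red after commitment is ignored.
        return "proceed" if self.cached_light_is_green else "stop"


class FixedPlanner(BuggyPlanner):
    """Re-reads the latest perception on every tick, even when committed."""

    def step(self, perception: Perception) -> str:
        self.cached_light_is_green = perception.light_is_green
        if self.mode == Mode.TRACKING and self.cached_light_is_green:
            self.mode = Mode.COMMITTED
        return "proceed" if self.cached_light_is_green else "stop"


if __name__ == "__main__":
    # The light is green when the planner commits, then turns red.
    ticks = [Perception(True), Perception(False), Perception(False)]
    for name, planner in [("buggy", BuggyPlanner()), ("fixed", FixedPlanner())]:
        print(name, [planner.step(p) for p in ticks])
    # buggy ['proceed', 'proceed', 'proceed']  <- keeps going through the red light
    # fixed ['proceed', 'stop', 'stop']
```

In this toy setting, the fix is simply to re-read the latest perception on every decision tick rather than trusting a value cached at mode entry, which is the kind of state-update behavior the demo argues autonomous driving software tests should specifically target.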
