Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensor, and without physical access to the vehicle. We infer that the behavior is caused by Tesla FSD’s decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.

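The failure pattern the abstract names can be stated concretely. Below is a minimal, hypothetical Python sketch of a "failed state update": a planner that latches its perception snapshot when it enters a special mode and never refreshes it, so a light that turns red after mode entry is ignored. All names here (`Planner`, `Perception`, `CREEP_FORWARD`) are illustrative assumptions for exposition, not Tesla's actual implementation.

```python
# Hypothetical sketch of the "failed state update" pattern; this is NOT
# Tesla's code. Once the planner enters a special mode, it keeps acting
# on the perception snapshot captured at mode entry instead of reading
# the fresh signal passed in on later ticks.

from dataclasses import dataclass


@dataclass
class Perception:
    light_state: str  # "RED" or "GREEN", as reported by the camera stack


class Planner:
    def __init__(self):
        self.mode = "NORMAL"
        self.cached = None  # perception snapshot frozen at mode entry

    def update(self, perception: Perception) -> str:
        if self.mode == "NORMAL" and perception.light_state == "GREEN":
            # Bug pattern: entering the special mode latches the current
            # perception, and later ticks never refresh it.
            self.mode = "CREEP_FORWARD"
            self.cached = perception
        if self.mode == "CREEP_FORWARD":
            # Decision consults the stale snapshot, not `perception`.
            return "GO" if self.cached.light_state == "GREEN" else "STOP"
        return "GO" if perception.light_state == "GREEN" else "STOP"


# The light flips to RED after mode entry, but the planner still says GO.
planner = Planner()
print(planner.update(Perception("GREEN")))  # enters special mode -> "GO"
print(planner.update(Perception("RED")))    # stale state -> "GO" (should be "STOP")
```

In this sketch the fix is to consult `perception` rather than `self.cached` inside the special mode; the abstract's broader point is that this class of staleness bug recurs across autonomous driving stacks and is worth targeting explicitly in tests.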