Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that the behavior is caused by Tesla FSD's decision system failing to take in the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
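
The failed state update pattern can be illustrated with a minimal, hypothetical Python sketch. The Planner class, its mode names, and the traffic-light values below are assumptions made purely for illustration and do not reflect Tesla FSD internals; the sketch only captures the inferred bug of a decision module acting on a perception snapshot cached at mode entry rather than on the latest signal.

    # Hypothetical sketch of the inferred failure pattern: once the
    # planner enters a special mode, it keeps acting on a perception
    # snapshot frozen at mode entry instead of the latest sensor input.
    # All names are illustrative, not actual Tesla FSD internals.

    class Planner:
        def __init__(self):
            self.mode = "NORMAL"
            self.cached_light = None  # snapshot frozen at mode entry (the bug)

        def update(self, perceived_light: str) -> str:
            if self.mode == "NORMAL" and perceived_light == "GREEN":
                # Entering the special mode caches the perception result ...
                self.mode = "SPECIAL"
                self.cached_light = perceived_light
            if self.mode == "SPECIAL":
                # ... and later decisions read the stale cache, so a light
                # that turns red after mode entry is never seen.
                return "PROCEED" if self.cached_light == "GREEN" else "STOP"
            return "PROCEED" if perceived_light == "GREEN" else "STOP"

    planner = Planner()
    print(planner.update("GREEN"))  # PROCEED: enters the special mode on green
    print(planner.update("RED"))    # PROCEED: stale snapshot ignores the red light

In this sketch the remedy is straightforward: every decision step should re-read the current perception output instead of the cached snapshot, which is the state-update discipline the abstract argues autonomous driving software tests should check for.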
