Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD’s decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that such a failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
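The failure pattern is easy to state in code. The following is a minimal, hypothetical sketch of it, assuming a planner that latches the perception state when it enters a special mode and never refreshes it afterwards; all names, modes, and logic here are invented for illustration and are not taken from Tesla FSD or any other real system.

```python
# Hypothetical illustration of the "failed state update" pattern described
# above: once the planner enters a specific mode, it keeps acting on the
# perception snapshot taken at mode entry instead of the live signal.

from enum import Enum, auto


class Light(Enum):
    RED = auto()
    GREEN = auto()


class Planner:
    def __init__(self) -> None:
        self.mode = "NORMAL"
        self.latched_light = None  # stale snapshot: the root of the bug

    def decide(self, live_light: Light) -> str:
        if self.mode == "NORMAL":
            if live_light is Light.GREEN:
                # Entering the special mode latches the perception state.
                self.mode = "PROCEED_INTERSECTION"
                self.latched_light = live_light
                return "go"
            return "stop"
        # BUG: in PROCEED_INTERSECTION the planner consults the latched
        # snapshot, so a light that turns red mid-maneuver is ignored.
        return "go" if self.latched_light is Light.GREEN else "stop"


planner = Planner()
print(planner.decide(Light.GREEN))  # "go"  -- light is green, mode latches
print(planner.decide(Light.RED))    # "go"  -- stale state: red light is run
```

In this sketch the fix is simply to consult `live_light` on every decision tick rather than the latched copy; the point of the example is that the buggy and correct versions differ by a single state read, which is why this class of bug is easy to introduce and worth targeting in tests.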
