Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD’s decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that needs particular attention in autonomous driving software testing and development.
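The failed-state-update pattern described above can be illustrated with a minimal hypothetical sketch (this is not Tesla's actual code; the planner, modes, and signal names are invented for illustration): a decision loop that snapshots perception state when it commits to a maneuver and never re-reads it afterward.

```python
from dataclasses import dataclass

@dataclass
class Perception:
    light: str  # "green" or "red"

class BuggyPlanner:
    """Hypothetical planner illustrating the failed-state-update pattern:
    once it enters a committed mode, it keeps acting on the perception
    snapshot captured at mode entry and ignores all later updates."""

    def __init__(self):
        self.mode = "NORMAL"
        self.snapshot = None

    def step(self, perception: Perception) -> str:
        if self.mode == "NORMAL":
            if perception.light == "green":
                # Bug: commit to crossing and freeze the perception state.
                self.mode = "COMMITTED"
                self.snapshot = perception
                return "proceed"
            return "stop"
        # COMMITTED mode: the fresh `perception` argument is never consulted.
        return "proceed" if self.snapshot.light == "green" else "stop"

planner = BuggyPlanner()
planner.step(Perception("green"))        # enters COMMITTED mode on green
print(planner.step(Perception("red")))   # prints "proceed": stale snapshot
```

A correct planner would consult the current perception input on every step regardless of mode, so the transition to red would trigger a stop.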
