Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensor, and without physically accessing the vehicle. We infer that the behavior is caused by Tesla FSD's decision system failing to incorporate the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
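To make the failed-state-update pattern concrete, below is a minimal illustrative sketch in Python. It is not Tesla's code and does not reflect any actual FSD implementation; the mode name, class names, and the decision logic are all hypothetical assumptions chosen only to show the pattern the abstract describes: a planner that caches a perception snapshot when it enters a special mode and then keeps deciding on that stale snapshot instead of the latest signals.

from dataclasses import dataclass

@dataclass
class Perception:
    light_state: str  # "RED", "YELLOW", or "GREEN" (hypothetical signal)

class BuggyPlanner:
    """Illustrative planner exhibiting the failed-state-update pattern:
    once it enters the (hypothetical) SPECIAL mode, it keeps acting on
    the perception snapshot captured at mode entry and ignores newer
    perception signals."""

    def __init__(self) -> None:
        self.mode = "NORMAL"
        self._snapshot = None  # stale copy captured at mode entry

    def step(self, perception: Perception) -> str:
        if self.mode == "NORMAL" and perception.light_state == "GREEN":
            self.mode = "SPECIAL"
            self._snapshot = perception  # bug: cached once, never refreshed
        if self.mode == "SPECIAL":
            # Bug: the decision reads the stale snapshot, not `perception`.
            return "GO" if self._snapshot.light_state == "GREEN" else "STOP"
        return "GO" if perception.light_state == "GREEN" else "STOP"

planner = BuggyPlanner()
print(planner.step(Perception("GREEN")))  # enters SPECIAL mode -> "GO"
print(planner.step(Perception("RED")))    # stale snapshot says GREEN -> "GO" (runs the red light)

In this sketch the obvious remedy is to consult the latest perception input on every decision cycle regardless of mode, which is the state-update discipline the abstract argues autonomous driving software tests should specifically check for.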
