Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD’s decision system failing to incorporate the latest perception signals once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that such a failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
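To make the failure pattern concrete, the following is a minimal Python sketch of such a failed state update, assuming a simplified decision module; all names here (DecisionSystem, enter_special_mode, plan) are hypothetical illustrations, not taken from Tesla FSD or any of the studied implementations.

```python
# Hypothetical sketch (not Tesla's code) of the "failed state update"
# pattern described above: a decision module snapshots perception when it
# enters a special mode and keeps planning on that stale snapshot, so a
# light that later turns red is never seen.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Perception:
    traffic_light: str  # "green", "yellow", or "red"


class DecisionSystem:
    def __init__(self) -> None:
        self.mode = "normal"
        self._snapshot: Optional[Perception] = None

    def enter_special_mode(self, perception: Perception) -> None:
        # The perception snapshot is captured once, at the mode transition.
        self.mode = "special"
        self._snapshot = perception

    def plan(self, latest: Perception) -> str:
        if self.mode == "special":
            # BUG: the stale snapshot shadows the latest perception input,
            # so state updates after the transition are silently dropped.
            perception = self._snapshot
        else:
            perception = latest
        return "stop" if perception.traffic_light == "red" else "proceed"


ds = DecisionSystem()
ds.enter_special_mode(Perception(traffic_light="green"))
# The light has since turned red, but planning still sees "green":
assert ds.plan(Perception(traffic_light="red")) == "proceed"  # unsafe
```

In this sketch the fix is simply to always plan against the `latest` argument; the point of the example is that the bug is a state-management error in the decision logic, reachable without touching any sensor input.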
