Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in Tesla's Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensor, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD's decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves particular attention in autonomous driving software testing and development.
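To make the failure pattern concrete, the following is a minimal, hypothetical Python sketch of a "failed state update" of the kind the abstract describes: a planner that latches perception state when it enters a mode and then ignores newer perception signals. All names and the mode logic here are illustrative assumptions for exposition, not code or behavior taken from Tesla FSD.

```python
# Hypothetical sketch of the "Pringles Syndrome" failure pattern: perception
# state is snapshotted on mode entry and never refreshed afterwards.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Perception:
    light: str  # "GREEN", "YELLOW", or "RED"


class Planner:
    def __init__(self) -> None:
        self.mode = "CRUISE"
        self._cached: Optional[Perception] = None

    def enter_proceed_mode(self, perception: Perception) -> None:
        # Bug: the traffic-light state is latched once, at mode entry.
        self.mode = "PROCEED"
        self._cached = perception

    def decide(self, latest: Perception) -> str:
        if self.mode == "PROCEED":
            # Failed state update: the stale snapshot is consulted instead
            # of `latest`, so a light that turned RED after mode entry is
            # never seen by the decision logic.
            light = self._cached.light
        else:
            light = latest.light
        return "GO" if light == "GREEN" else "STOP"


planner = Planner()
planner.enter_proceed_mode(Perception(light="GREEN"))
# The light has since turned RED, but the planner still commands GO.
print(planner.decide(Perception(light="RED")))  # -> "GO"
```

The fix for this class of bug is straightforward once identified: the decision step should always read from the live perception input rather than a mode-entry snapshot, which is why the abstract frames this as a pattern that testing should specifically target.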
