Christopher Lentzsch (Ruhr-Universität Bochum), Anupam Das (North Carolina State University)

Background. Amazon’s voice-based assistant, Alexa, enables users to directly interact with various web services through natural language dialogues. It provides developers with the option to create third-party applications (known as 'skills') to run on top of Alexa. While such applications ease users’ interaction with smart devices and bolster a number of additional services, they also raise security and privacy concerns due to the personal setting they operate in. Moreover, the voice interface presents its own set of challenges to analyze/vet skills potentially accessing sensitive user data.

Aim. Our aim is to perform a systematic analysis of the Alexa skill ecosystem and shed light on several limitations that exist in the current skill vetting process.

Data. We perform the first large-scale analysis of Alexa skills, obtained from seven different skill stores (US, CA, DE, FR, JP, UK, AU) totaling 90,194 unique skills.

Method. We adopt various methodologies to analyze the different aspects of the Alexa skill ecosystem. This includes publishing our own skills to highlight gaps in the existing vetting process, activating skills through a semi-automated approach to understand how skills with similar invocation names are selected, and using state-of-the-art NLP techniques to determine privacy policy compliance.
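To give a flavor of how invocation-name collisions can arise, the following minimal Python sketch (an illustration, not the authors' actual pipeline) generates candidate "skill-squatting" variants of an invocation name using a small, hand-picked homophone map plus simple spacing and word-duplication transformations. The HOMOPHONES table and the squatting_variants function are assumptions introduced here for illustration only.

```python
from itertools import product

# Hypothetical, hand-picked homophone map for illustration only.
HOMOPHONES = {
    "flour": ["flower"],
    "four": ["for", "fore"],
    "night": ["knight"],
    "quick": ["kwik"],
}

def squatting_variants(invocation_name: str) -> set[str]:
    """Generate candidate sound-alike / look-alike invocation names."""
    words = invocation_name.lower().split()
    variants = set()

    # 1. Homophone substitution: swap each word for known sound-alikes.
    options = [[w] + HOMOPHONES.get(w, []) for w in words]
    for combo in product(*options):
        variants.add(" ".join(combo))

    # 2. Spacing changes: join adjacent words ("cat facts" -> "catfacts").
    for i in range(len(words) - 1):
        joined = words[:i] + [words[i] + words[i + 1]] + words[i + 2:]
        variants.add(" ".join(joined))

    # 3. Word duplication: repeat a word ("cat facts" -> "cat cat facts").
    for i in range(len(words)):
        variants.add(" ".join(words[:i] + [words[i]] + words[i:]))

    variants.discard(invocation_name.lower())
    return variants

if __name__ == "__main__":
    for v in sorted(squatting_variants("quick flour facts")):
        print(v)
```

Such variants could then be fed to a semi-automated activation harness to observe which skill Alexa actually selects when several share confusable names.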

Results. We show that not only can a malicious user publish a skill under any arbitrary developer/company name, but she can also make backend code changes after approval to coax users into revealing unwanted information. Next, we evaluate the efficacy of different skill-squatting techniques and find that while certain approaches are more favorable than others, there is no substantial abuse of skill squatting in the real world. Lastly, we find that 23.3% of skills do not fully disclose the data types associated with the permissions they request.
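The permission-disclosure finding can be illustrated with a minimal sketch that uses plain keyword matching as a stand-in for the NLP-based compliance analysis described above. The PERMISSION_KEYWORDS map and the undisclosed_permissions helper are hypothetical, and the Alexa permission scope strings are shown only as examples.

```python
# Illustrative mapping from permission scopes to data-type keywords
# a privacy policy would be expected to mention (assumed, not the
# paper's taxonomy).
PERMISSION_KEYWORDS = {
    "alexa::devices:all:address:full:read": ["address", "location"],
    "alexa::profile:email:read": ["email", "e-mail"],
    "alexa::profile:mobile_number:read": ["phone", "mobile number"],
}

def undisclosed_permissions(requested: list[str], policy_text: str) -> list[str]:
    """Return requested permissions whose data types are never mentioned
    in the privacy policy text (unknown permissions are flagged too)."""
    text = policy_text.lower()
    missing = []
    for perm in requested:
        keywords = PERMISSION_KEYWORDS.get(perm, [])
        if not any(k in text for k in keywords):
            missing.append(perm)
    return missing

# Example: a skill requesting email access whose policy only talks about cookies.
print(undisclosed_permissions(
    ["alexa::profile:email:read"],
    "We use cookies to improve your experience.",
))  # -> ['alexa::profile:email:read']
```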

Conclusions. Through our investigations we see that while skills expand Alexa’s capabilities and functionalities, they also create new security and privacy risks, especially as there are limitations in the current vetting process. Moving forward, more research effort is required to develop fully automated approaches to analyzing skills.

Speakers' biographies

Christopher Lentzsch is a PhD student at the Ruhr University Bochum. His research interests are privacy and data protection for the IoT and the web. His current focus is on intervention user interfaces (IUI) in smart home and IoT environments.

Anupam Das is an Assistant Professor in the Computer Science Department at North Carolina State University (NCSU). Prior to joining NCSU, he obtained his Ph.D. from the University of Illinois at Urbana-Champaign (UIUC), where he was a recipient of a Fulbright Science and Technology fellowship, and worked as a postdoctoral fellow in the School of Computer Science at Carnegie Mellon University (CMU). His research interests lie in security and privacy, with a special focus on designing secure and privacy-preserving technologies. His current research explores the security and privacy challenges of the Internet of Things (IoT), where he focuses on designing systems that enhance transparency and control for consumers. He is a recipient of an NSF CRII award (2019) and has received ACM Distinguished Paper Awards (ASIACCS 2014, MMSys 2017). His projects have been covered by media outlets such as Wired, ZDnet, MotherBoard, and FastCompany.
