ABSTRACT
In this paper we investigate the leakage of sensitive information to third-party voice applications in voice assistant ecosystems, focusing specifically on disclosure through the conversational interface. Using a bespoke testing infrastructure, we examine the conversational interfaces of Google Actions and Alexa Skills. Our work extends prior research in this area by considering not only specific categories of personal data, but also other types of potentially sensitive information that may be disclosed in voice-based interactions with third-party voice applications. Our findings indicate that current privacy and security measures for third-party voice applications are not sufficient to prevent the leakage of all types of sensitive information through the conversational interface. We make key recommendations for redesigning voice assistant architectures to better prevent such leakage in the future.