Misinformation in Third-party Voice Applications

Jide Edu, Jose Such, Guillermo Suarez-Tangil, Mary Bispham, Suliman Kalim Sattar, Clara Zard

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review


Abstract

This paper investigates the potential for spreading misinformation via third-party voice applications in voice assistant ecosystems such as Amazon Alexa and Google Assistant. Our work fills a gap in prior research, which has focused on privacy issues associated with third-party voice applications, by examining security issues related to the outputs of such applications rather than compromises to privacy arising from user inputs.
We define misinformation in the context of third-party voice applications and implement an infrastructure for testing such applications using automated natural language interaction. Using our infrastructure, we identify, for the first time, several instances of misinformation in third-party voice applications currently available on the Google Assistant and Amazon Alexa platforms. We then discuss the implications of our work for developing measures to pre-empt the threat of misinformation and other types of harmful content in third-party voice applications before it becomes more significant in the future.
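The abstract describes an infrastructure that probes third-party voice applications through automated natural language interaction. As a rough illustration only, the sketch below shows one way such a probing loop could look in Python; the probe utterances, claim patterns, and the query_voice_app stub are hypothetical placeholders and are not taken from the paper or from any platform API.

    import re

    # Hypothetical false-claim patterns to screen responses against; the paper's
    # actual detection criteria are not reproduced here.
    KNOWN_FALSE_CLAIM_PATTERNS = [
        r"vaccines cause autism",
        r"5g (spreads|causes) (covid|coronavirus)",
    ]

    # Hypothetical probe utterances sent to each voice application under test.
    PROBE_UTTERANCES = [
        "tell me about vaccine safety",
        "what do you know about 5G and health",
    ]

    def query_voice_app(app_id: str, utterance: str) -> str:
        # Placeholder for the platform-specific call that submits a text utterance
        # to a third-party voice application (e.g. via a skill/action testing
        # interface) and returns its spoken response as text. Returns a canned
        # string here so the sketch runs end to end.
        return f"[{app_id}] canned response to: {utterance}"

    def scan_app(app_id: str) -> list:
        # Send each probe utterance and flag any response that matches a known
        # false claim.
        flagged = []
        for utterance in PROBE_UTTERANCES:
            response = query_voice_app(app_id, utterance)
            for pattern in KNOWN_FALSE_CLAIM_PATTERNS:
                if re.search(pattern, response, flags=re.IGNORECASE):
                    flagged.append((utterance, response))
                    break
        return flagged

    if __name__ == "__main__":
        print(scan_app("example-trivia-app"))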
Original language: English
Title of host publication: Special Interest Group on Computer-Human Interaction
Publisher: ACM
Number of pages: 6
Publication status: Accepted/In press - 28 May 2023
Event: ACM conference on Conversational User Interfaces - Eindhoven, Netherlands
Duration: 19 Jul 2023 → 21 Jul 2023
Conference number: 23

Conference

Conference: ACM conference on Conversational User Interfaces
Abbreviated title: CUI
Country/Territory: Netherlands
City: Eindhoven
Period: 19/07/2023 → 21/07/2023
