Why Are Conversational Assistants Still Black Boxes? The Case For Transparency

Research output: Chapter in Book/Report/Conference proceeding, Poster abstract, peer-reviewed


Abstract

Much has been written about privacy in the context of conversational and voice assistants. Yet, there have been remarkably few developments in terms of the actual privacy offered by these devices. But how much of this is due to the technical and design limitations of speech as an interaction modality? In this paper, we set out to reframe the discussion on why commercial conversational assistants do not offer meaningful privacy and transparency by demonstrating how they could. By instrumenting the open-source voice assistant Mycroft to capture audit trails for data access, we demonstrate how such functionality could be integrated into major commercial assistants such as Alexa and Google Assistant. We show that this problem can be solved with existing technology and open standards and is thus fundamentally a business decision rather than a technical limitation.
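As a rough illustration of the kind of audit trail the abstract argues for, the sketch below records each access to personal data as an append-only JSON line. The file name, skill names, and data categories are hypothetical, and this is not Mycroft's actual API; it is only a minimal outline of the idea, assuming a simple per-access provenance record.

    import json
    import time
    from pathlib import Path

    # Hypothetical location for the append-only audit trail
    AUDIT_LOG = Path("audit_trail.jsonl")

    def record_access(skill, data_category, purpose):
        """Append one provenance record each time a skill reads personal data."""
        entry = {
            "timestamp": time.time(),
            "skill": skill,                   # which component asked for the data
            "data_category": data_category,   # e.g. "location", "contacts"
            "purpose": purpose,               # why it was requested
        }
        with AUDIT_LOG.open("a") as log:
            log.write(json.dumps(entry) + "\n")

    # Example: a (hypothetical) weather skill noting that it read the user's location
    record_access("weather-skill", "location", "local forecast lookup")

Because each record is a single line of standard JSON, such a trail could be inspected or exported with existing tooling, which is in keeping with the paper's point that open standards already suffice.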
Original language: English
Title of host publication: ACM Conversational User Interfaces (CUI)
ISBN (Electronic): 9798400700149
DOIs
Publication status: Published - 19 Jul 2023

Publication series

Name: Proceedings of the 5th International Conference on Conversational User Interfaces, CUI 2023

Keywords

  • conversational assistants
  • voice assistants
  • provenance
  • audit trails
  • personal data
  • Mycroft
  • privacy
  • transparency

