Deceptive Storytelling in Artificial Dialogue Games

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Original language: English
Title of host publication: Proceedings of the AAAI 2019 Spring Symposium
Subtitle of host publication: Story-Enabled Intelligence
Number of pages: 8
Accepted/In press: 5 Feb 2019

Documents

  • AAAI_SpringSym_Stories

    AAAI_SpringSym_Stories.pdf, 222 KB, application/pdf

    Uploaded date: 13 Feb 2019

    Version: Accepted author manuscript


Abstract

The development of machines that can tell stories in order to interact with humans or other artificial agents has significant implications in the area of trust and AI, all the more so if we expect such machines to be transparent and to explain their reasoning when we interrogate them to determine whether they should be held accountable. One of these implications is the ability of machines to use stories in order to deceive others, thus undermining the relationship of trust between humans and machines. In this paper we explore, from the perspective of an argumentation-based dialogue game, what it means for a machine to deceive by telling stories.
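To give a concrete sense of the kind of formalism the abstract refers to, the sketch below is a minimal, illustrative argumentation-based dialogue exchange in Python. The move types (claim, why, argue), the agent names, and the simple deception test (publicly backing a claim with a premise the speaker does not itself believe) are assumptions borrowed from standard persuasion-dialogue protocols in the argumentation literature, not the specific game or deception account defined in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    beliefs: set                                     # what the agent privately holds true
    commitments: set = field(default_factory=set)    # what it has publicly asserted

@dataclass
class Move:
    speaker: str
    kind: str        # 'claim', 'why', or 'argue' (illustrative move set)
    content: tuple   # a claim, or (premises, claim)

def is_deceptive(agent: Agent, premises) -> bool:
    """Minimal (assumed) test: a supporting move is deceptive if the agent
    publicly asserts at least one premise it does not itself believe."""
    return any(p not in agent.beliefs for p in premises)

# A tiny example exchange: the machine tells a two-premise "story" to
# support its claim, although it does not believe one of the premises.
machine = Agent("machine", beliefs={"engine_ok"})    # does NOT believe 'road_clear'
human   = Agent("human",   beliefs=set())

dialogue = [Move("machine", "claim", ("trip_was_safe",))]
machine.commitments.add("trip_was_safe")

dialogue.append(Move("human", "why", ("trip_was_safe",)))

premises = ("engine_ok", "road_clear")
dialogue.append(Move("machine", "argue", (premises, "trip_was_safe")))
machine.commitments.update(premises)

print(is_deceptive(machine, premises))               # True: 'road_clear' is not believed
```

The point of the toy example is only that deception becomes visible when an agent's public commitments diverge from its private beliefs; how that divergence is defined and detected within a full dialogue-game protocol is what the paper itself investigates.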


