Analysing Eye Gaze Patterns during Confusion and Errors in Human–Agent Collaborations

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review


Abstract

As human–agent collaborations become more prevalent, it is increasingly important for an agent to be able to adapt to its collaborator and explain its own behavior. To do so, the agent needs to identify critical states during the interaction that call for proactive clarifications or behavioral adaptations. In this paper, we explore whether the agent could infer such states from the human's eye gaze, comparing gaze patterns across different situations in a collaborative task. Our findings show that the human's gaze patterns differ significantly between times at which the user is confused about the task, times at which the agent makes an error, and times of normal workflow: during errors, the amount of gaze towards the agent increases, while during confusion the amount of gaze towards the environment increases. We conclude that these signals could tell the agent what and when to explain.
Original language: English
Title of host publication: RO-MAN 2022 - 31st IEEE International Conference on Robot and Human Interactive Communication
Subtitle of host publication: Social, Asocial, and Antisocial Robots
Pages: 224-229
Number of pages: 6
ISBN (Electronic): 9781728188591
DOIs
Publication status: Published - 2022

Publication series

Name: RO-MAN 2022 - 31st IEEE International Conference on Robot and Human Interactive Communication: Social, Asocial, and Antisocial Robots
