Towards Autonomous Collaborative Robots that Adapt and Explain

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review


Abstract

As we increasingly encounter robots in our everyday lives, engage with them in social interactions, and collaborate with them on common tasks, it is important that we endow them with the capability of adapting to our abilities and preferences. Moreover, we want them to be able to explain the decisions they make in collaborative tasks, to maximise rapport and build trust and acceptance. Current explanations lack a focus on the individual user's needs, which is why we want to learn when and what to explain in a collaboration. This paper proposes the roadmap that we plan to implement with the goal of inferring the user's mental state from physiological and social signals in order to inform explanation generation and robot adaptation. We present preliminary results utilising eye gaze, and a planned framework that allows a collaborative robot to adapt to its human collaborator and tailor its explanations to them in order to minimise confusion in an interaction.
Original language: English
Title of host publication: IEEE ICRA 2022 Workshop on Prediction and Anticipation Reasoning in Human Robot Interaction
Publication status: Published - 9 Jun 2022
