Abstract
To be able to react to interaction ruptures such as errors, a robot needs a way of realizing that such a rupture has occurred. We test whether interaction ruptures can be detected from the user's anonymized speech, posture, and facial features. We show how to approach this task, presenting a time series classification pipeline that works well with various machine learning models. A sliding window is applied to the data, and the continuously updated predictions make the pipeline suitable for detecting ruptures in real time. Our best model, an ensemble of MiniRocket classifiers, is the winning approach to the ICMI ERR@HRI challenge. A feature importance analysis shows that the model relies heavily on speaker diarization data indicating who spoke when. Posture data, on the other hand, impedes performance. Our code is available online.
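The abstract describes the core recipe: cut the multimodal feature stream into overlapping sliding windows, featurize each window with MiniRocket, and classify each window as rupture or no rupture. The authors' released code is the authoritative reference; the snippet below is only a minimal sketch of how such a pipeline can be assembled, assuming sktime's `MiniRocketMultivariate` paired with scikit-learn's `RidgeClassifierCV` (the conventional Rocket-family setup). It trains a single classifier rather than the ensemble used in the winning submission, and the window length, step size, channel count, and data are illustrative placeholders, not values from the paper.

```python
"""Minimal sliding-window MiniRocket sketch (illustrative, not the paper's code).

Assumes a multivariate feature stream of shape (n_channels, n_timepoints),
e.g. concatenated speech, facial, and diarization features. Window length
and step size are placeholder values.
"""
import numpy as np
from sklearn.linear_model import RidgeClassifierCV
from sklearn.pipeline import make_pipeline
from sktime.transformations.panel.rocket import MiniRocketMultivariate


def sliding_windows(stream: np.ndarray, window_len: int, step: int) -> np.ndarray:
    """Cut a (n_channels, n_timepoints) stream into overlapping windows.

    Returns an array of shape (n_windows, n_channels, window_len), the 3D
    panel format accepted by sktime's Rocket-family transformers.
    """
    n_channels, n_timepoints = stream.shape
    starts = range(0, n_timepoints - window_len + 1, step)
    return np.stack([stream[:, s:s + window_len] for s in starts])


# Toy data standing in for anonymized multimodal features.
rng = np.random.default_rng(0)
train_stream = rng.normal(size=(8, 3000))   # 8 channels, 3000 frames
test_stream = rng.normal(size=(8, 1000))

WINDOW_LEN, STEP = 150, 30                  # placeholder sizes
X_train = sliding_windows(train_stream, WINDOW_LEN, STEP)
X_test = sliding_windows(test_stream, WINDOW_LEN, STEP)

# Toy per-window labels: 1 = interaction rupture, 0 = smooth interaction.
y_train = rng.integers(0, 2, size=len(X_train))

# MiniRocket maps each window to a fixed-length feature vector; a ridge
# classifier on top is the usual pairing for Rocket-family models.
clf = make_pipeline(
    MiniRocketMultivariate(random_state=0),
    RidgeClassifierCV(alphas=np.logspace(-3, 3, 10)),
)
clf.fit(X_train, y_train)

# At inference time each new window yields an updated rupture prediction,
# which is what makes the sliding-window setup usable in (near) real time.
window_predictions = clf.predict(X_test)
print(window_predictions[:10])
```

An ensemble in the spirit of the winning submission could be built by training several such pipelines (for example with different random seeds or window sizes) and aggregating their per-window predictions; the exact ensembling scheme used in the challenge entry is documented in the authors' released code.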
Original language | English |
---|---|
Title of host publication | ICMI 2024 - Proceedings of the 26th International Conference on Multimodal Interaction |
Publisher | ACM |
Pages | 657-665 |
Number of pages | 9 |
ISBN (Electronic) | 9798400704628 |
DOIs | |
Publication status | Published - 4 Nov 2024 |
Publication series
Name | ACM International Conference Proceeding Series |
---|---|
Keywords
- hri
- robotics
- machine learning
- time series
- multimodal
Fingerprint
Dive into the research topics of 'A Time Series Classification Pipeline for Detecting Interaction Ruptures in HRI Based on User Reactions'. Together they form a unique fingerprint.

Projects
- Plan and Goal Reasoning for Explainable Autonomous Robots
  Coles, A. (Primary Investigator) & Canal, G. (Co-Investigator)
  RAE Royal Academy of Engineering
  1/11/2021 → 31/10/2024
  Project: Research
- COHERENT - COllaborative HiErarchical Robotic ExplaNaTions
  Coles, A. (Primary Investigator)
  EPSRC Engineering and Physical Sciences Research Council
  1/04/2021 → 31/03/2025
  Project: Research