King's College London

Research portal

Group-level arousal and valence recognition in static images: Face, body and context

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

Wenxuan Mou, Oya Celiktutan, Hatice Gunes

Original language: English
Title of host publication: IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)
Publisher: IEEE
Publication status: Published - 2015

Abstract

Automatic analysis of affect has become a well-established research area over the last two decades. However, little attention has been paid to analysing the affect expressed by a group of people in a scene or an interaction setting, either in the form of each individual group member's affect or the overall affect expressed collectively. In this paper, we (i) introduce a framework for analysing an image that contains multiple people and recognising the arousal and valence expressed at the group level; (ii) present a dataset of images annotated along the arousal and valence dimensions; and (iii) extract and evaluate a multitude of face, body and context features. We conduct a set of experiments to classify the overall affect expressed at the group level along arousal (high, medium, low) and valence (positive, neutral, negative) using a k-Nearest Neighbour classifier, and integrate the information provided by the face, body and context features using decision-level fusion. Our experimental results show the viability of the proposed framework compared to other in-the-wild recognition works: we obtain 54% and 55% recognition accuracy for the arousal and valence dimensions, respectively.
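The two core components named in the abstract, k-Nearest Neighbour classification per cue and decision-level fusion across cues, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature vectors, labels, and majority-vote fusion rule shown here are assumptions (the paper does not specify its fusion rule in this abstract), and the toy data stands in for the actual face, body and context features.

```python
# Sketch of per-cue kNN classification followed by decision-level fusion.
# All data below is illustrative; the real features come from face, body
# and context descriptors extracted from each image.
from collections import Counter
import math


def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority label among its k nearest training points."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]


def fuse_decisions(per_cue_predictions):
    """Decision-level fusion: majority vote over the cue-wise predictions."""
    return Counter(per_cue_predictions).most_common(1)[0][0]


# Toy 2-D "arousal" features for one cue (hypothetical values).
train_X = [(0.0, 0.0), (0.0, 1.0), (5.0, 5.0), (5.0, 6.0)]
train_y = ["low", "low", "high", "high"]
cue_prediction = knn_predict(train_X, train_y, (0.2, 0.2), k=3)  # "low"

# Pretend the face, body and context classifiers produced these labels
# for one image; fusion takes the majority.
fused = fuse_decisions(["positive", "neutral", "positive"])  # "positive"
```

In practice each cue would have its own feature space and trained classifier, and the fused label is what gets compared against the group-level arousal (high/medium/low) or valence (positive/neutral/negative) annotation.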

