Automatic Analysis of Affective Postures and Body Motion to Detect Engagement with a Game Companion

Title: Automatic Analysis of Affective Postures and Body Motion to Detect Engagement with a Game Companion
Publication Type: Conference Paper
Year of Publication: 2011
Authors: Sanghvi J, Castellano G, Leite I, Pereira A, McOwan PW, Paiva A
Refereed Designation: Refereed
Conference Name: ACM/IEEE International Conference on Human-Robot Interaction
Conference Location: Lausanne, Switzerland
Keywords: affect recognition, affective body movement and posture, human-robot interaction, lirec, robot companions
Abstract: The design of an affect recognition system for socially perceptive robots relies on representative data: human-robot interaction in naturalistic settings requires an affect recognition system to be trained and validated with contextualised affective expressions, that is, expressions that emerge in the same interaction scenario as the target application. In this paper we propose an initial computational model to automatically analyse human postures and body motion to detect engagement of children playing chess with an iCat robot that acts as a game companion. Our approach is based on vision-based automatic extraction of expressive postural features from videos capturing the behaviour of the children from a lateral view. An initial evaluation, conducted by training several recognition models with contextualised affective postural expressions, suggests that patterns of postural behaviour can be used to accurately predict the engagement of the children with the robot, thus making our approach suitable for integration into an affect recognition system for a game companion in a real-world scenario.