The Scientific World Journal
Volume 2014 (2014), Article ID 484873, 14 pages
Research Article

Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches

aDeNu Research Group, Artificial Intelligence Department, UNED, Calle Juan del Rosal 16, 28040 Madrid, Spain

Received 31 August 2013; Accepted 11 March 2014; Published 22 April 2014

Academic Editors: J. Shu and F. Yu

Copyright © 2014 Mar Saneiro et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


We report current findings from an enriched multimodal emotion detection approach that considers video recordings of facial expressions and body movements to provide personalized affective support in an educational context. In particular, we describe an annotation methodology to tag the facial expressions and body movements that accompany changes in learners’ affective states while they deal with cognitive tasks during a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from other sources, such as qualitative, self-reported, physiological, and behavioral data. Together, these data are used to train data mining algorithms that automatically identify changes in learners’ affective states as they deal with cognitive tasks, which in turn supports the provision of personalized emotional support.
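
To make the intended pipeline concrete, the following minimal Python sketch shows one plausible way to merge annotated facial/body events with the other affective data sources and train a classifier on the combined feature set. It is not the authors’ implementation: the file names, column names, time-window alignment, and the choice of a random forest are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the multimodal pipeline the
# abstract describes: annotated facial/body events are merged with other
# affective data sources and used to train a classifier that flags changes
# in a learner's affective state. All CSV files and column names below are
# hypothetical placeholders.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-time-window feature tables, one per modality.
annotations = pd.read_csv("facial_body_annotations.csv")  # tagged expressions/movements
physio = pd.read_csv("physiological_features.csv")        # e.g., heart rate, skin conductance
behavior = pd.read_csv("behavioral_features.csv")         # e.g., keystroke/mouse activity
self_report = pd.read_csv("self_reported_affect.csv")     # questionnaire scores

# Align all sources on learner and time window to build one feature matrix.
data = (annotations
        .merge(physio, on=["learner_id", "window_id"])
        .merge(behavior, on=["learner_id", "window_id"])
        .merge(self_report, on=["learner_id", "window_id"]))

X = data.drop(columns=["learner_id", "window_id", "affective_change"])
y = data["affective_change"]  # label: did the affective state change in this window?

# Any off-the-shelf data mining algorithm could fill this role; a random
# forest is just one reasonable baseline for mixed tabular features.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("Mean 5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

Merging on a shared (learner, time window) key is the simplest fusion strategy; in practice, each modality would first need to be resampled or aggregated to that common temporal granularity.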