Emotion Expression from Different Angles: A Video Database for Facial Expressions of Actors Shot by a Camera Array

Seuss, D.; Dieckmann, A.; Hassan, T.; Garbas, J.-U.; Ellgring, J.H.; Mortillaro, M.; Scherer, K.


Institute of Electrical and Electronics Engineers (IEEE):
8th International Conference on Affective Computing and Intelligent Interaction, ACII 2019 : Cambridge, United Kingdom, 3-6 September 2019
Piscataway, NJ: IEEE, 2019
ISBN: 978-1-7281-3889-3
ISBN: 978-1-7281-3888-6
International Conference on Affective Computing and Intelligent Interaction (ACII) <8, 2019, Cambridge>
Fraunhofer IIS

Over the last few decades, there has been an increasing call in the field of computer vision to use machine-learning techniques for the detection, categorization, and indexing of facial behaviors, as well as for the recognition of emotional phenomena. Automated Facial Expression Analysis has become a highly competitive field for academic laboratories, startups, and large technology corporations. This paper introduces the new Actor Study Database to address the resulting need for reliable benchmark datasets. The focus of the database is to provide real multi-view data that is not synthesized through perspective distortion. The database contains 68 minutes of high-quality videos of facial expressions performed by 21 actors. The videos are synchronously recorded from five different angles. The actors' tasks ranged from displaying specific Action Units and their combinations at different intensities to enacting a variety of emotion scenarios. Over 1.5 million frames have been annotated and validated with the Facial Action Coding System by certified FACS coders. These attributes make the Actor Study Database particularly applicable in machine recognition studies as well as in psychological research into affective phenomena, whether prototypical basic emotions or subtle emotional responses. Two state-of-the-art systems were used to produce benchmark results for all five views that this new database encompasses. The database is publicly available for non-commercial research.