Gaussian Process based Dynamic Facial Emotion Tracking
Capturing human emotions is of paramount importance in human-machine interaction, where emotions are typically extracted from image sequences of the human face. In this paper, tracking emotions from images is formulated as a Bayesian state estimation problem in which the system state represents the valence-arousal space of emotions. Handcrafted image features are first mapped to the valence-arousal space by means of a Gaussian process. For dynamic emotion tracking, a Kalman filter is derived in which an inequality constraint on the emotional state prevents the estimate from drifting. Experiments on two well-known facial expression datasets demonstrate the performance of the proposed approach.
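The pipeline sketched in the abstract (a Gaussian process mapping image features to the valence-arousal space, followed by a Kalman filter whose state is constrained to the valid region) can be illustrated with a minimal sketch. The synthetic data, the RBF kernel, the fixed measurement noise `R`, and the clipping-based enforcement of the inequality constraint are all illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical training data: handcrafted image features -> valence-arousal labels
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 10))    # 50 frames, 10-dim feature vectors
Y_train = np.tanh(X_train[:, :2])      # surrogate valence-arousal labels in [-1, 1]

# Gaussian process regressor maps features to the valence-arousal space
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-3)
gp.fit(X_train, Y_train)

def kalman_step(x, P, z, R, Q, lo=-1.0, hi=1.0):
    """One Kalman filter step (identity dynamics) with a box constraint.

    x, P : prior mean (2,) and covariance (2, 2) of the valence-arousal state
    z, R : GP measurement (2,) and its covariance (2, 2)
    Q    : process noise covariance (2, 2)
    The inequality constraint is enforced here by clipping (projection onto
    the box), a simple stand-in for the paper's constrained estimation.
    """
    P = P + Q                        # predict: state assumed constant
    K = P @ np.linalg.inv(P + R)     # Kalman gain
    x = x + K @ (z - x)              # update with GP measurement
    P = (np.eye(2) - K) @ P
    x = np.clip(x, lo, hi)           # keep state in valid valence-arousal square
    return x, P

# Track over a short sequence of hypothetical test frames
x, P = np.zeros(2), np.eye(2)
Q = 0.01 * np.eye(2)
R = 0.05 * np.eye(2)
for feat in rng.normal(size=(5, 10)):
    z = gp.predict(feat[None, :])[0]
    x, P = kalman_step(x, P, z, R, Q)
```

Clipping is only one way to handle the state constraint; projection of the full posterior or truncated-density methods are common alternatives in constrained Kalman filtering.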