Real-time gesture recognition using a particle filtering approach
In this paper we present an approach for real-time gesture recognition using exclusively 1D sensor data, based on Particle Filters and DTW Barycenter Averaging (DBA). In a training phase, sensor records of users performing different gestures are acquired. For each gesture, the associated sensor records are processed by the DBA method to produce a single averaged record called a template gesture. Once trained, our system classifies a gesture performed in real time by computing, using particle filters, an estimate of its probability of belonging to each class, based on the comparison of the sensor values acquired in real time against those of the template gestures. Our method is tested on the accelerometer data of the Multimodal Human Activities Dataset (MHAD) using Leave-One-Out cross-validation, and compared with state-of-the-art approaches (SVM, Neural Networks) adapted for real-time gesture recognition. It achieves an 85.30% average accuracy and outperforms the other approaches, without requiring hyper-parameters whose choice could be constrained by real-time implementation considerations.
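To make the training step concrete, the following is a minimal NumPy sketch of one DBA iteration: each training record is DTW-aligned to the current template, and every template sample is replaced by the mean of the record samples aligned to it. The function names, the absolute-difference cost, and the single-iteration update are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def dtw_path(a, b):
    """DTW on two 1D sequences; returns the alignment path and total cost.
    Uses a simple absolute-difference local cost (an assumption here)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from the corner to recover the aligned index pairs.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        i, j = min([(i - 1, j - 1), (i - 1, j), (i, j - 1)],
                   key=lambda ij: D[ij])
    path.reverse()
    return path, D[n, m]

def dba_update(template, series_list):
    """One DBA refinement step: average, for each template index,
    all series samples that DTW aligns to that index."""
    sums = np.zeros(len(template))
    counts = np.zeros(len(template))
    for s in series_list:
        path, _ = dtw_path(template, s)
        for i, j in path:
            sums[i] += s[j]
            counts[i] += 1
    return sums / np.maximum(counts, 1)
```

In practice DBA repeats `dba_update` until the template stabilizes; the resulting per-class templates are what the particle filter compares incoming samples against at recognition time.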