Dynamic Gesture Recognition Using Neural Networks: A Foundation for Advanced Interaction Construction
Interaction in Virtual Reality environments is still a challenging task. Static hand posture recognition is currently the most common and widely used method for interaction with glove input devices. To improve the naturalness of interaction, and thereby decrease the user-interface learning time, there is a need to recognize dynamic gestures. Dynamic Gesture Recognition (DGR) is difficult for several reasons. One is the large variation in the speed at which the various phases of a gesture are executed. Another is the quality and positioning of the physical properties that describe a gesture. These problems are exaggerated by the differences that arise when different people attempt the same gesture, as well as when the same person repeats the same gesture. Other factors affecting the difficulty of DGR are the emotional state of the person performing the gesture and the accuracy of the input device used. Finally, a large amount of data has to be processed in real time, because the length of time needed to execute a gesture varies widely. In this paper we describe our approach to overcoming the difficulties of DGR using neural networks. Backpropagation neural networks have already proven to be appropriate and efficient for posture recognition. However, the extensive amount of data involved in DGR requires a different approach. Because of features such as topology preservation and automatic learning, Kohonen Feature Maps are particularly well suited to reducing the high-dimensional data space that results from a dynamic gesture, and are therefore employed for this task.
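To make the dimensionality-reduction idea concrete, the following is a minimal sketch of a Kohonen Feature Map (self-organizing map) in pure Python. The grid size, decay schedules, and the toy 6-dimensional "gesture frame" vectors are illustrative assumptions, not the configuration used in the paper; each high-dimensional input is reduced to the 2-D grid coordinate of its best-matching unit.

```python
import math
import random


def train_som(data, rows=4, cols=4, epochs=30, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small Kohonen self-organizing map on `data`, a list of
    equal-length feature vectors.  Returns the trained weight grid and
    a function mapping an input vector to its best-matching unit (BMU),
    i.e. a 2-D grid coordinate -- the reduced representation."""
    init = random.Random(seed)
    dim = len(data[0])
    # One weight vector per grid node, initialised randomly in [0, 1).
    weights = [[[init.random() for _ in range(dim)] for _ in range(cols)]
               for _ in range(rows)]

    def bmu(x):
        # Best-matching unit: the node whose weights are closest to x.
        best, best_d = (0, 0), float("inf")
        for r in range(rows):
            for c in range(cols):
                d = sum((xi - wi) ** 2 for xi, wi in zip(x, weights[r][c]))
                if d < best_d:
                    best, best_d = (r, c), d
        return best

    for epoch in range(epochs):
        # Learning rate and neighbourhood radius shrink over time.
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        for x in data:
            br, bc = bmu(x)
            for r in range(rows):
                for c in range(cols):
                    # Gaussian neighbourhood function around the BMU:
                    # nearby nodes are pulled strongly toward x.
                    g = math.exp(-((r - br) ** 2 + (c - bc) ** 2)
                                 / (2 * sigma ** 2))
                    w = weights[r][c]
                    for i in range(dim):
                        w[i] += lr * g * (x[i] - w[i])

    return weights, bmu


# Two well-separated clusters of toy 6-D "gesture frames" should land
# on different regions of the 4x4 grid after training.
rng = random.Random(1)
lows = [[rng.uniform(0.0, 0.2) for _ in range(6)] for _ in range(20)]
highs = [[rng.uniform(0.8, 1.0) for _ in range(6)] for _ in range(20)]
weights, bmu = train_som(lows + highs)
print("low  frame ->", bmu(lows[0]))
print("high frame ->", bmu(highs[0]))
```

The learned map replaces each variable-length stream of high-dimensional sensor frames with a short sequence of grid coordinates, which is the kind of compact representation a subsequent classifier can handle.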