2018
Conference Paper
Title
IMU based gesture recognition for mobile robot control using Online Lazy Neighborhood Graph search
Abstract
Robotic teleoperation in disaster areas or environments dangerous to humans is a growing research field. With recent advancements in robotics, the complexity of its applications has also increased. Despite this, currently used technology limits the majority of man-machine interfaces to text- or GUI-based interfaces and joysticks. Such types of control can become cumbersome for, for example, robots with a heavy control box or many degrees of freedom. Hence, alternative intuitive control paradigms need to be developed. Gesture-based control is particularly useful as it can be more intuitive. Vision-based gesture control is well researched, but its setup time and dependency on controlled environmental conditions, such as lighting, make it less suitable for teleoperation in disaster areas. Hoffmann et al. developed an IMU-based control for a robot manipulator [1]. They attached five IMUs to the sleeve of a wearable jacket and wirelessly transferred human arm motions into corresponding robotic manipulator motions. They showed that teleoperation performed in this way is very efficient and intuitive [2]. However, this direct control method cannot be used to trigger pre-defined manipulator motions or robot base motions. In this paper, we present and evaluate a framework for gesture recognition using IMUs to indirectly control a mobile robot. Six gestures, namely Sammeln (gather), Fahrzeug (drive), Verteilen (spread out), Runter (down), Beeilung (hurry) and Vorwärts (forwards), are defined. A novel algorithm based on OLNG (Online Lazy Neighborhood Graph) search is used to recognize the gestures. The Online Lazy Neighborhood Graph is a data structure built on kd-tree n-nearest-neighbor searches. OLNG was originally suggested and implemented for motion reconstruction from sparse accelerometer data in the field of computer graphics [3].
As it allows fast, real-time similarity searches for a given input signal in large motion databases, we use this algorithm to classify the gestures online and command the robot. To build up the database, we ask the operator to perform each gesture five times while wearing the jacket equipped with the IMUs. Acceleration data from the IMUs is stored in a database during this short training phase. When an external signal is applied, we calculate its n nearest neighbors and use the OLNG to find the best-matching sequence. The best-matched gesture, if one exists, is then returned. Experiments are conducted to find and validate the best parameters for our algorithm. In extensive experiments we show that, with our selected set of six gestures, the framework is able to identify gestures in real time with an average success rate of 84%.
Keywords
Gesture recognition, teleoperation, IMU, manipulator control, mobile robotics
References
[1] Jörg Hoffmann, Bernd Brüggemann, and Björn Krüger. Automatic calibration of a motion capture system based on inertial sensors for tele-manipulation. In 7th International Conference on Informatics in Control, Automation and Robotics (ICINCO), June 2010.
[2] B. Brüggemann, B. Gaspers, A. Ciossek, J. Pellenz, and N. Kroll. Comparison of different control methods for mobile manipulation using standardized tests. In 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pages 12, Oct 2013.
[3] Jochen Tautges. Reconstruction of Human Motions Based on Low-Dimensional Control Signals. Dissertation, Universität Bonn, August 2012.
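The nearest-neighbor lookup at the core of the pipeline described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions: the toy data, the scipy `cKDTree`, and the simple per-frame majority vote are placeholders, not the paper's method; the actual OLNG instead links the neighbor sets of consecutive frames into a graph and searches it for the best temporally consistent sequence.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Toy training database: six gestures, 100 feature frames each.
# Each frame is a 15-dim acceleration vector (e.g. 5 IMUs x 3 axes).
centers = rng.normal(scale=5.0, size=(6, 15))
frames = np.repeat(centers, 100, axis=0) + rng.normal(scale=0.5, size=(600, 15))
labels = np.repeat(np.arange(6), 100)

tree = cKDTree(frames)  # kd-tree over all stored frames

def classify_window(window, k=16):
    """Majority vote over the k nearest database frames of each
    input frame. The real OLNG instead builds a graph over these
    per-frame neighbor sets and extracts the best-matching sequence."""
    votes = np.zeros(6, dtype=int)
    for frame in window:
        _, idx = tree.query(frame, k=k)          # k nearest stored frames
        votes += np.bincount(labels[idx], minlength=6)
    return int(votes.argmax())

# Query with ten frames drawn from gesture 2 plus mild sensor noise.
query = frames[250:260] + rng.normal(scale=0.1, size=(10, 15))
print(classify_window(query))  # -> 2
```

The kd-tree makes each per-frame lookup logarithmic in the database size, which is what enables the online, real-time classification described above.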