Fusion of camera and inertial motion capturing unit for entire human-robot-interaction

Authors: Poshtiri, Azam Haghighi; Bdiwi, Mohamad

Chemnitz, 2017, 90 pp.
Chemnitz, TU, Master Thesis, 2017
Master Thesis
Fraunhofer IWU
human-robot interaction; posture; gesture; machine learning

The importance of robots in industrial applications is well established. Robots are usually placed away from workers on a production line to ensure worker safety, yet many industrial tasks can be completed neither by a robot nor by a worker alone. There is therefore a need for a reliable and safe interaction environment between robots and workers. This thesis focuses on industrial human-robot interaction. For safe and efficient interaction, a robot must detect a worker's position, posture and gesture. In this work, an Inertial Measurement Unit (IMU, e.g. Xsens) and a 3D camera as an image sensor (e.g. Intenta) are used. Only specific postures and gestures, selected on the basis of industrial experience and requirements, are considered; these were tested on 12 participants. For each posture and gesture, one or more distinguishing joint movements are identified, and their values serve as features to train a posture classifier model and a gesture classifier model. A Support Vector Machine (SVM) is used as the machine learning algorithm to train both classifiers. The two classifier models are integrated into an Interaction-tool developed during the thesis work, so that data from any new participant can be classified directly. The Interaction-tool obtains data from the image sensor to determine a person's exact position and data from the IMU to detect the person's postures and gestures. The posture and gesture classifier models reach accuracies of 94% and 99%, respectively, in predicting postures and gestures for new participants, with a detection time of 100 ms. A sliding-window technique is implemented in the Interaction-tool to monitor the output of both classifier predictors: when an undefined posture or gesture is performed, the sliding window ignores it and thereby increases accuracy.
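The sliding-window idea described above can be sketched as a majority-vote filter over the stream of per-frame classifier labels: a label is emitted only when it dominates the current window, so isolated spurious or undefined predictions are suppressed. This is a minimal illustrative sketch, not the thesis implementation; the window size and vote threshold are assumed values chosen for the example.

```python
from collections import deque, Counter

def sliding_window_filter(predictions, window_size=5, min_votes=4):
    """Majority-vote filter over a stream of classifier labels.

    Emits a label only when it has at least `min_votes` occurrences
    in the last `window_size` frames, otherwise emits None.
    (Parameters are illustrative, not taken from the thesis.)
    """
    window = deque(maxlen=window_size)
    smoothed = []
    for label in predictions:
        window.append(label)
        top_label, votes = Counter(window).most_common(1)[0]
        smoothed.append(top_label if votes >= min_votes else None)
    return smoothed

# A single spurious "unknown" frame inside a run of "wave" frames
# is absorbed by the window and never reaches the output.
raw = ["wave"] * 4 + ["unknown"] + ["wave"] * 3
print(sliding_window_filter(raw))
```

With these example parameters, the filter only commits to a gesture after it has been seen in most of the recent frames, which trades a short delay (a few frames) for robustness against single-frame misclassifications.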
In the future, the classifier models can be retrained and extended to detect additional gestures and postures. The Interaction-tool was successfully tested on a KUKA KR 180 R2900 prime industrial robot arm. This thesis work is a step towards establishing a reliable and safe interaction environment between robots and humans, so that even an unskilled worker can easily interact with a robot.