Improving stereo-vision-based SLAM by integrating inertial measurements for indoor person navigation
Precise indoor localization is of great importance; for instance, it can significantly improve the efficiency and safety of first responders. Stereo SLAM algorithms achieve remarkable results; nevertheless, there are limits to what a single-sensor system can achieve in the case of over- or underexposed images, highly dynamic camera movements, and homogeneous environments. This paper proposes a straightforward yet very efficient method of integrating Inertial Measurement Unit (IMU) data to overcome those problems. Estimating the camera movement between consecutive frames can be cast as an optimization problem. Whereas existing algorithms obtain the initial guess of the camera pose by assuming constant velocity, this work exploits IMU data to predict the movement. The proposed localization system is evaluated with a person-carried sensor setup in typical indoor scenarios. It is demonstrated that the accuracy of the initial guess can be enhanced considerably. Improved initial camera poses result in faster convergence and a higher success rate in finding the optimum, which in turn yields better overall localization quality. Scenarios that could not be handled by the stand-alone stereo SLAM are now feasible thanks to IMU data integration.
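To make the contrast concrete, the two initialization strategies mentioned above can be sketched as follows. This is an illustrative Python sketch, not the paper's implementation: the function names, the simple Euler integration of IMU samples, and the 4×4 homogeneous-matrix pose representation are all assumptions made here for clarity.

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues' formula: rotation matrix from an axis-angle vector phi [rad]."""
    theta = np.linalg.norm(phi)
    if theta < 1e-9:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def constant_velocity_guess(T_prev2, T_prev1):
    """Baseline initial guess: assume the last inter-frame motion repeats,
    i.e. T_k ≈ T_{k-1} · (T_{k-2}^{-1} · T_{k-1})."""
    delta = np.linalg.inv(T_prev2) @ T_prev1
    return T_prev1 @ delta

def imu_propagated_guess(T_prev1, v_prev, gyro, accel, dt,
                         g=np.array([0.0, 0.0, -9.81])):
    """IMU-based initial guess: integrate gyroscope [rad/s] and
    accelerometer [m/s^2] samples over the inter-frame interval.
    Plain Euler integration, neglecting sensor biases, for brevity."""
    R = T_prev1[:3, :3].copy()
    p = T_prev1[:3, 3].copy()
    v = v_prev.copy()
    for w, a in zip(gyro, accel):
        R = R @ so3_exp(w * dt)          # attitude update
        a_w = R @ a + g                  # gravity-compensated world acceleration
        p = p + v * dt + 0.5 * a_w * dt**2
        v = v + a_w * dt
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, p
    return T
```

Under highly dynamic movements the constant-velocity assumption breaks down, while the IMU-propagated guess tracks the actual motion between frames, which is why it lands closer to the optimum of the pose optimization.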