3D scene reconstruction from IR image sequences for image-based navigation update and target detection of an autonomous airborne system
The successful mission of an autonomous airborne system such as an unmanned aerial vehicle (UAV) strongly depends on an accurate target approach as well as on the real-time acquisition of detailed knowledge about the target area. Automatic 3D reconstruction of the overflown terrain by a structure-from-motion system makes it possible to interpret the scenario and to react to possible changes by optimizing the flight path or adjusting the mission objectives. In addition, detection of the target itself can be improved by analyzing the reconstructed 3D target scenario. In this work, a newly developed system for automatic 3D reconstruction of a scene from aerial infrared (IR) imagery is presented. To increase the accuracy of the reconstruction and to overcome the drawbacks of feature tracking in IR images, pose information provided by an inertial measurement unit (IMU) is used in the computation of the 3D structure. Detected 2D image features are tracked from image to image to compute the corresponding 3D information. Each estimated 3D point is assessed by means of its covariance matrix, which represents the respective uncertainty. Finally, a non-linear optimization (Gauss-Newton iteration) of the reconstruction result yields the complete 3D point cloud. As possible applications, approaches for target recognition in fused IR images and 3D point clouds, as well as registration of point clouds for image-based navigation updates, are presented.
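The core pipeline summarized above (triangulating tracked 2D features using known camera poses, assessing each 3D point via its covariance matrix, and refining the result with a Gauss-Newton iteration) can be sketched in a minimal form. The following is an illustrative sketch only, not the authors' implementation: it assumes a pinhole camera model with known projection matrices (as would be derived from IMU pose data), uses standard linear (DLT) triangulation for initialization, and approximates the point covariance from the Gauss-Newton normal equations; all function names are hypothetical.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices (pose assumed known, e.g. from IMU).
    x1, x2: corresponding 2D image observations (normalized pixel coordinates).
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                      # null-space solution (homogeneous)
    return X[:3] / X[3]

def refine_gauss_newton(X, cams, obs, iters=10):
    """Gauss-Newton refinement of a 3D point over all views it was tracked in.

    Minimizes the reprojection error; returns the refined point and a
    covariance estimate (inverse of J^T J, up to the pixel-noise scale).
    """
    X = X.astype(float).copy()
    for _ in range(iters):
        J_rows, r_rows = [], []
        for P, x in zip(cams, obs):
            p = P @ np.append(X, 1.0)
            proj = p[:2] / p[2]
            r_rows.append(proj - x)         # reprojection residual
            # Jacobian of the projection w.r.t. X (2x3), quotient rule
            J = (P[:2, :3] * p[2] - np.outer(p[:2], P[2, :3])) / p[2] ** 2
            J_rows.append(J)
        J = np.vstack(J_rows)
        r = np.concatenate(r_rows)
        dX = np.linalg.solve(J.T @ J, -J.T @ r)   # normal equations
        X += dX
        if np.linalg.norm(dX) < 1e-10:
            break
    cov = np.linalg.inv(J.T @ J)    # per-point uncertainty estimate
    return X, cov
```

In the full system, the refinement would run over every image of the track rather than two views, and the per-point covariance would feed the assessment step described in the abstract.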