Combination of time-of-flight depth and stereo using semiglobal optimization
urn:nbn:de:0011-n-1901013 (1.4 MByte PDF)
MD5 Fingerprint: 7df5f316ebe6ea772606524aac98aeca
© 2011 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Created on: 13.12.2011
IEEE Robotics and Automation Society; Shanghai Jiao Tong University:
IEEE International Conference on Robotics and Automation, ICRA 2011. DVD : Better Robots, Better Life. May 9-13, 2011, Shanghai, China; Workshop Semantic Perception, Mapping and Exploration (SPME)
New York, NY: IEEE, 2011
International Conference on Robotics and Automation (ICRA) <2011, Shanghai>
Workshop Semantic Perception, Mapping and Exploration <2011, Shanghai, China>
Conference Paper, Electronic Publication
Fraunhofer IPA
service robot; time-of-flight; machine vision; stereo vision; RGB-D; robot
A growing number of modern computer vision applications, such as object recognition, collision avoidance, and scene understanding, demand accurate and dense 3D representations of their environment. To improve existing procedures for 3D data acquisition, this paper proposes a novel method for sensor combination of a stereo and a Time-of-Flight camera system. By calibrating the two sensor systems to each other, valid measurements from the 2.5D Time-of-Flight sensor are converted to disparity guesses within the stereo system. These disparity guesses constrain the correspondence search of the stereo matching algorithm. It is empirically shown that the proposed method effectively enhances the results of stereo vision, especially in structureless areas where stereo correspondence search fails. The method is evaluated on the camera system of the service robot Care-O-bot®3.
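The core conversion described in the abstract can be sketched briefly. The following is a minimal illustration, not the authors' implementation: a valid Time-of-Flight depth reading is mapped to a disparity guess via the standard relation d = f·B/Z for a rectified stereo pair, and the stereo correspondence search is then restricted to a window around that guess. All parameter values (focal length, baseline, tolerance, disparity range) are hypothetical.

```python
# Sketch (assumed parameters, not from the paper): converting valid
# Time-of-Flight depth measurements into disparity guesses for a
# rectified stereo pair, then deriving a constrained search range.

def depth_to_disparity(depth_m, focal_px, baseline_m):
    """Convert a metric depth Z to stereo disparity d = f * B / Z."""
    if depth_m <= 0:
        return None  # invalid ToF measurement; no guess available
    return focal_px * baseline_m / depth_m

def constrained_search_range(depth_m, focal_px, baseline_m,
                             tol_px=3.0, d_min=0.0, d_max=128.0):
    """Restrict the stereo correspondence search to a window around
    the ToF-derived disparity guess; fall back to the full disparity
    range when no valid guess exists."""
    guess = depth_to_disparity(depth_m, focal_px, baseline_m)
    if guess is None:
        return (d_min, d_max)
    return (max(d_min, guess - tol_px), min(d_max, guess + tol_px))

# Example: a ToF reading of 2.0 m with f = 600 px and B = 0.12 m
# yields a disparity guess of 36 px, so the search window becomes
# [33, 39] instead of the full [0, 128] range.
lo, hi = constrained_search_range(2.0, 600.0, 0.12)
```

In structureless image regions, where the matching cost alone is ambiguous, such a narrowed search window is what lets the ToF guess disambiguate the stereo result.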