Fraunhofer-Gesellschaft

Publica

Here you will find scientific publications from the Fraunhofer Institutes.

6-DoF model-based tracking of arbitrarily shaped 3D objects

Author(s): Azad, P.; Münch, D.; Asfour, T.; Dillmann, R.

Postprint urn:nbn:de:0011-n-1832226 (473 KByte PDF)
MD5 Fingerprint: b2111f3f12979d3fd303472ba163f65a
© 2011 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Created on: 25.10.2011


Bicchi, A.; IEEE Robotics and Automation Society:
IEEE International Conference on Robotics and Automation, ICRA 2011. Vol.6 : 09.05.2011 - 13.05.2011, Shanghai, China
New York, NY: IEEE, 2011
ISBN: 978-1-612-84386-5
ISBN: 978-1-612-84385-8
ISBN: 978-1-612-84380-3
pp.5204-5209
International Conference on Robotics and Automation (ICRA) <2011, Shanghai>
English
Conference Paper, Electronic Publication
Fraunhofer IOSB

Abstract
Image-based 6-DoF pose estimation of arbitrarily shaped 3D objects based on their shape is a rarely studied problem. Most existing image-based methods for pose estimation either exploit textural information in the form of local features or, if shape-based, rely on the extraction of straight line segments or other primitives. Straightforward extensions of 2D approaches are potentially more general, but in practice assume a limited range of possible view angles. The general problem is that a 3D object can produce completely different 2D projections depending on its pose relative to the observing camera. One way to reduce the solution space is to exploit temporal information, i.e., to perform tracking. Again, existing model-based tracking approaches rely on relatively simple object geometries. In this paper, we propose a particle-filter-based tracking approach that can deal with arbitrary shapes and arbitrary or even no texture, i.e., it offers a general solution to the rigid object tracking problem. As our approach can deal with occlusions, it is of particular interest in the context of goal-directed imitation learning involving the observation of object manipulations. Results of simulation experiments as well as real-world experiments with different object types prove the practical applicability of our approach.
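The abstract describes a particle-filter-based tracker, but this record contains no implementation details. The sketch below is a generic sampling-importance-resampling (SIR) particle filter over 6-DoF poses, not the authors' method: the names (particle_filter_step, likelihood_fn), the axis-angle pose parameterisation, and the noise magnitudes are illustrative assumptions. In the paper's setting, the likelihood would presumably come from comparing a rendered view of the object model with the current camera image.

```python
import numpy as np

def particle_filter_step(particles, weights, likelihood_fn,
                         trans_noise=0.01, rot_noise=0.02, rng=None):
    """One SIR (sampling-importance-resampling) update over 6-DoF poses.

    particles:     (N, 6) array, each row [tx, ty, tz, rx, ry, rz]
                   (translation plus an axis-angle rotation vector).
    weights:       (N,) normalised importance weights from the last step.
    likelihood_fn: user-supplied function mapping a pose hypothesis to a
                   scalar image-matching score (e.g. an edge or silhouette
                   match against the camera image) -- hypothetical here.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)

    # 1. Resample particles in proportion to their previous weights.
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx]

    # 2. Propagate with a simple Gaussian random-walk motion model.
    noise = np.concatenate([
        rng.normal(0.0, trans_noise, (n, 3)),
        rng.normal(0.0, rot_noise, (n, 3))], axis=1)
    particles = particles + noise

    # 3. Re-weight each hypothesis by its image likelihood.
    weights = np.array([likelihood_fn(p) for p in particles])
    weights = np.maximum(weights, 1e-12)
    weights /= weights.sum()

    # Pose estimate: weighted mean (adequate while the rotation spread is small).
    estimate = weights @ particles
    return particles, weights, estimate

if __name__ == "__main__":
    # Toy usage with a dummy likelihood that prefers poses near the origin.
    N = 500
    rng = np.random.default_rng(0)
    parts = rng.normal(0.0, 0.1, (N, 6))
    w = np.full(N, 1.0 / N)
    toy_likelihood = lambda pose: np.exp(-np.linalg.norm(pose))
    parts, w, est = particle_filter_step(parts, w, toy_likelihood, rng=rng)
    print("estimated pose:", est)
```

In a shape-based tracker of this kind, the expensive part is the likelihood evaluation per particle; the filter structure above stays the same regardless of whether that score comes from edges, silhouettes, or texture.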

URL: http://publica.fraunhofer.de/documents/N-183222.html