2014
Conference Paper
Title
Articulated 3D model tracking with on-the-fly texturing
Abstract
In this paper, we present a framework for capturing and tracking humans based on RGBD input data. The two contributions of our approach are: (a) a method for robustly and accurately fitting an articulated computer graphics model to captured depth images and (b) on-the-fly texturing of the geometry based on the sensed RGB data. Such a representation is especially useful in the context of 3D telepresence applications, since model-parameter and texture updates require only low bandwidth. Additionally, the rigged model can be controlled through interpretable parameters and allows the automatic generation of natural-looking animations. Our experimental results demonstrate the high quality of this model-based rendering.