Mid-air hand interaction with optical tracking for 3D modelling
Compared to common 2D interaction with a mouse or other 2D-tracking devices, 3D hand tracking with low-cost optical cameras provides more degrees of freedom and allows natural gestures when shapes are modelled and assembled in virtual spaces. However, although quite precise, optical tracking devices cannot avoid problems intrinsic to hand interaction, such as hand tremor and jump release, and they introduce the additional problem of occlusion. This thesis investigates whether interaction during 3D modelling can be improved with optical sensors so that 3D tasks can be performed in a way similar to real-life interaction and as efficiently as with common 2D-tracking-based interaction, while minimizing both the intrinsic problems of precise hand manipulation and the optical ones. After surveying the relevant work and analysing the technical capabilities of commonly available optical sensors, two approaches to natural mid-air hand interaction for precise 3D modelling are investigated in depth: collision-based and gesture-based interaction. For collision-based interaction, a set of virtual interaction techniques is proposed that realistically simulates real-life manipulation and deformation with one and two hands. For gesture-based interaction, a core set of techniques is devised that allows natural, real-life ways of interacting to be used. In addition, algorithms are proposed for both approaches that enhance precision while minimizing the effects of hand tremor and jump release. The results show, however, that virtual interaction designed with collision-based methods is always slower than real-life interaction because force feedback is missing. Although gesture-based interaction is less affected by this problem, it still cannot fully eliminate occlusion and jump release.
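The thesis does not specify the algorithms used against hand tremor; as a minimal illustrative sketch (all names and the choice of an exponential moving average are assumptions, not taken from the work), high-frequency tremor in a tracked hand position can be damped with a simple smoothing filter:

```python
# Hypothetical sketch of tremor damping via exponential smoothing of a
# tracked 3D hand position; not the algorithm proposed in the thesis.

class PositionSmoother:
    """Exponentially weighted moving average over 3D positions."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # smoothing factor in (0, 1]; lower = smoother
        self.state = None    # last smoothed position (x, y, z)

    def update(self, raw):
        """Blend the new raw sample into the smoothed state."""
        if self.state is None:
            self.state = tuple(raw)
        else:
            self.state = tuple(
                self.alpha * r + (1.0 - self.alpha) * s
                for r, s in zip(raw, self.state)
            )
        return self.state
```

The smoothing factor trades responsiveness against stability: a small `alpha` suppresses tremor well but adds perceptible latency, which matters for precise manipulation.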
Finally, a new method of gesture-based interaction is proposed that uses the hands in a way similar to playing the Theremin, an electronic musical instrument that the performer controls without physical contact. The dominant hand controls the manipulation and deformation of objects, while the non-dominant hand controls grasping, releasing and the variable precision of the interaction. Based on this method, a generic set of reliable and precise gesture-based interaction techniques is designed and implemented for various manipulation and deformation tasks. User studies then show that, for tasks involving 3D manipulation and deformation, the proposed way of hand interaction is easy to learn, is not affected by the common problems of hand tracking, and is more convenient and faster than common 2D mouse interaction for some 3D tasks.
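The Theremin-style split of roles between the hands can be sketched as follows. This is an illustrative toy, not the thesis implementation: the height thresholds, gain values and function names are all assumptions chosen only to show the mapping of the non-dominant hand to grasp/release and precision.

```python
# Illustrative sketch of the Theremin-style two-hand mapping: the dominant
# hand supplies object motion, the non-dominant hand's height selects a
# clutch state (grab/release) and a precision gain. Thresholds are made up.

def nondominant_state(height):
    """Map non-dominant hand height (metres above sensor) to a clutch
    state and a motion gain."""
    if height < 0.10:
        return ("released", 0.0)           # hand low: object released
    gain = 1.0 if height > 0.30 else 0.25  # raise hand for coarse motion
    return ("grabbed", gain)

def apply_motion(obj_pos, dominant_delta, height):
    """Move the object by the dominant hand's displacement, scaled by the
    gain chosen with the non-dominant hand; ignore motion when released."""
    state, gain = nondominant_state(height)
    if state == "released":
        return obj_pos
    return tuple(p + gain * d for p, d in zip(obj_pos, dominant_delta))
```

Keeping grasp/release on the other hand means neither event depends on finger poses of the manipulating hand, which is one plausible reason the approach sidesteps occlusion and jump release.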
Singapore, Univ., Diss., 2019
Fellner, Dieter W.