2015
Conference Paper
Title
Gesture-based configuration of location information in smart environments with visual feedback
Abstract
The location of objects and devices in a smart environment is a key piece of information for enabling advanced and sophisticated interaction use cases and for supporting users in daily activities and emergency situations. To acquire this information, we propose a semi-automatic approach for configuring the location, size, and orientation of objects in the environment, together with their semantic meaning. Such configuration is typically done with graphical user interfaces that show either a list of objects or a 2D or 3D virtual representation of them. However, this leaves a gap between the real physical world and the abstract virtual representation that must be bridged by the users themselves. We therefore propose visual feedback directly in the physical world using a robotic laser pointing system.