2007
Conference Paper
Title
iTACITUS - novel interaction and tracking paradigms for mobile AR
Abstract
iTACITUS is a Sixth Framework Programme project that aims to provide a mobile cultural heritage information system for the individual. By combining itinerary planning, navigation, and rich on-site content drawn from a dispersed repository of historical and cultural resources, it offers a complete information system and media experience for historically interested travellers. This paper covers the development of a "Mobile Augmented Reality (AR) Guide Framework" for Cultural Heritage (CH) sites. The framework delivers advanced markerless tracking on mobile computers as well as new AR interaction paradigms featuring touch and motion capabilities. In addition to visual components such as annotated landscapes and superimposed environments, the framework will feature a reactive acoustic AR module. The markerless tracking is based on optical flow. Only pure camera rotation is estimated, assuming the user stays in place. The system runs at real-time frame rates on mobile computers, which allows for a wide range of applications both indoors and outdoors. The visual AR application lets users explore digital information about a site in a very direct way: by holding a mobile computer in front of a point of interest, the user immediately receives further information as an AR overlay on the screen. The effect is that of looking through the mobile computer's display and seeing the real world enhanced with virtual objects and information. Virtual objects react to the user's position, and direct interaction with them is possible, for example by touching the display or shaking the mobile computer. Spatial acoustic AR conveys a place's original ambience. While walking through a room, the user gains an acoustic impression of how the place once was, and of its people and their activities, by listening to conversations and environmental sounds. Because the sounds are positioned in 3D, users form their own spatial image of the scene.
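The rotation-only tracking described in the abstract can be illustrated as follows. This is a minimal NumPy sketch, not the paper's implementation: it assumes pinhole intrinsics `K` and point correspondences already obtained from optical flow, exploits the fact that under pure camera rotation the inter-frame image mapping is a homography H = K R K⁻¹, and recovers R from a direct linear transform (DLT) estimate of H.

```python
import numpy as np

def rotation_from_flow(pts_prev, pts_curr, K):
    """Estimate pure camera rotation from tracked feature points.

    Under the pure-rotation assumption, the mapping between two frames
    is a homography H = K R K^-1, so R can be recovered from H and the
    camera intrinsics K.
    """
    # Direct linear transform: each correspondence gives two equations in h
    A = []
    for (x, y), (u, v) in zip(pts_prev, pts_curr):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)          # homography, defined up to scale

    R = np.linalg.inv(K) @ H @ K      # scaled rotation s*R
    # Project onto the nearest proper rotation matrix (removes the scale)
    U, _, Vt = np.linalg.svd(R)
    R = U @ Vt
    if np.linalg.det(R) < 0:          # fix sign so det(R) = +1
        R = -R
    return R
```

A production pipeline would add RANSAC to reject bad flow vectors; with clean correspondences, the recovered R matches the true camera rotation up to numerical precision.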
Author(s)