Interaction Optimization through Pen and Touch, Eye Tracking and Speech Control for the Multimodal Seismic Interpretation Workspace
This thesis examines the development of a prototype desktop application for the interpretation of visualized seismic data using pen and touch, eye tracking, and speech recognition. In particular, it investigates how these interaction technologies can be combined into a multimodal user interface for seismic interpretation, taking into account current developments in the research field of multimodal interfaces. Issues that arose during development are identified and possible solutions are described. The work builds on previous research by Fiedler et al., who investigated a multimodal interface combining a Kinect motion sensor with speech control in a room-scale application for interacting with seismic visualizations, and who observed significantly improved efficiency in the execution of complex tasks (Fiedler et al. 2015). This thesis examines whether those results transfer to a single-user workspace. Following a similar approach, the implemented prototype was evaluated in a user study with ten participants; the results suggest that the earlier findings may be task-dependent and therefore not directly transferable. The thesis concludes by identifying the need for further development, additional user studies, possibly conducted over a longer period, and revised goals for future research.
Düsseldorf, Hochschule, Master Thesis, 2016