A pilot study on gaze-based control of a virtual camera using 360° video data
Over the last decades, gaze input has emerged as an easy-to-use and less demanding human-computer interaction method for various applications. It appears particularly beneficial in situations where manual input is either impossible or challenging and exhausting, such as interaction with dynamic content in video analysis or computer gaming. This contribution investigates whether gaze input is an appropriate technique for camera control (panning and tilting) without any manual intervention. The main challenge of such an interaction method is to relieve the human operator of conscious interaction so that they can devote their perceptive and cognitive resources entirely to scene observation. As a first step, a pilot study was conducted that operationalized camera control as navigation in a virtual camera scene and compared gaze control of the camera with manual mouse control. The experimental task required the 28 subjects (18 expert video analysts, 10 students and colleagues) to navigate in a 360° camera scene in order to keep track of certain target persons. To this end, an experimental system was implemented that provides virtual camera navigation in previously recorded 360fly camera imagery. As measured by the NASA-TLX questionnaire, subjects rated gaze control as imposing significantly lower workload than manual mouse control. Moreover, the large majority preferred gaze control over manual mouse control.
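To make the interaction concept concrete, the mapping from gaze to camera motion can be sketched as follows. This is a minimal illustrative sketch, not the system described in the study: the function name, the rate-control scheme, and all parameter values (dead zone size, maximum pan/tilt rates) are assumptions chosen for illustration. A central dead zone lets the operator fixate on the scene center without moving the camera, which supports the goal of unburdened observation.

```python
def gaze_to_pan_tilt(gaze_x, gaze_y, max_pan_rate=30.0, max_tilt_rate=20.0,
                     dead_zone=0.15):
    """Map a gaze point in normalized screen coordinates ([-1, 1] per axis,
    origin at the screen center) to pan/tilt rates in degrees per second.

    Hypothetical sketch: gaze near the screen edge pans/tilts the virtual
    camera toward that edge; gaze inside the central dead zone leaves the
    camera still, so mere observation does not trigger unintended motion.
    """
    def rate(value, max_rate):
        if abs(value) <= dead_zone:
            return 0.0  # fixation near the center: no camera motion
        # Rescale the range outside the dead zone to [0, 1], then apply
        # the maximum rate with the sign of the gaze offset.
        scaled = (abs(value) - dead_zone) / (1.0 - dead_zone)
        return max_rate * scaled * (1.0 if value > 0 else -1.0)

    return rate(gaze_x, max_pan_rate), rate(gaze_y, max_tilt_rate)
```

For example, gaze at the screen center yields no motion, while gaze at the right edge pans at the full assumed rate; the same scheme could drive a mouse-controlled baseline by substituting cursor coordinates for gaze coordinates.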