Pilot study on real-time motion detection in UAS video data by human observer and image exploitation algorithm

Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Voit, Michael; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

Fulltext: urn:nbn:de:0011-n-4590523 (353 KByte PDF)
MD5 Fingerprint: 55895bc7c931561830ec8b0918871e00
Copyright Society of Photo-Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.
Created on: 27.3.2018

Palaniappan, Kannappan (Ed.) ; Society of Photo-Optical Instrumentation Engineers -SPIE-, Bellingham/Wash.:
Geospatial Informatics, Fusion, and Motion Video Analytics VII : 12 April 2017, Anaheim, California, United States
Bellingham, WA: SPIE, 2017 (Proceedings of SPIE 10199)
ISBN: 9781510608993
Paper 1019903, 9 pp.
Conference "Geospatial Informatics, Fusion, and Motion Video Analytics" <7, 2017, Anaheim/Calif.>
Conference Paper, Electronic Publication
Fraunhofer IOSB
Keywords: real-time motion video analysis; human observer; independent motion detection; image exploitation system; computer-human interaction; gaze-based interaction; pilot study

Real-time motion video analysis is a challenging and exhausting task for the human observer, particularly in safety- and security-critical domains. Hence, customized video analysis systems that provide functions for subtasks such as motion detection or target tracking are welcome. While such automated algorithms relieve human operators of basic subtasks, they impose additional interaction duties on them. Prior work shows that, e.g., for interaction with target tracking algorithms, a gaze-enhanced user interface is beneficial. In this contribution, we present an investigation of interaction with an independent motion detection (IDM) algorithm. Besides identifying an appropriate interaction technique for the user interface – again, we compare gaze-based and traditional mouse-based interaction – we focus on the benefit an IDM algorithm might provide for a UAS video analyst. In a pilot study, we exposed ten subjects to the task of moving target detection in UAS video data twice, once performing with automatic support and once without it. We compare the two conditions with respect to performance in terms of effectiveness (correct target selections). Additionally, we report perceived workload (measured using the NASA-TLX questionnaire) and user satisfaction (measured using the ISO 9241-411 questionnaire). The results show that a combination of gaze input and an automated IDM algorithm provides valuable support for the human observer, increasing the number of correct target selections by up to 62% while reducing workload at the same time.