Robust drone detection for day/night counter-UAV with static VIS and SWIR cameras

 
Author: Müller, Thomas

Full text: urn:nbn:de:0011-n-4590587 (1.3 MByte PDF)
MD5 Fingerprint: 4160087198c67bce2813c0dc6143b474
Copyright Society of Photo-Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.
Created on: 27.3.2018


Pham, Tien (Ed.) ; Society of Photo-Optical Instrumentation Engineers -SPIE-, Bellingham/Wash.:
Ground/Air Multisensor Interoperability, Integration, and Networking for Persistent ISR VIII : 10-13 April 2017, Anaheim, California, United States
Bellingham, WA: SPIE, 2017 (Proceedings of SPIE 10190)
ISBN: 978-1-5106-0881-8
ISBN: 978-1-5106-0882-5
Paper 1019018, 12 pp.
Conference "Ground/Air Multisensor Interoperability, Integration, and Networking for Persistent ISR" <8, 2017, Anaheim/Calif.>
English
Conference paper, electronic publication
Fraunhofer IOSB
counter-UAV; unmanned aerial vehicles (UAV); drone surveillance; intruder detection; change detection; SWIR camera

Abstract
Recent progress in the development of unmanned aerial vehicles (UAVs) has led to more and more situations in which drones like quadrocopters or octocopters pose a potentially serious threat or could be used as a powerful tool for illegal activities. Therefore, counter-UAV systems are required in many applications to detect approaching drones as early as possible. In this paper, an efficient and robust algorithm is presented for UAV detection using static VIS and SWIR cameras. Whereas high-resolution VIS cameras enable the detection of UAVs at greater distances in the daytime, surveillance at night can be performed with a SWIR camera. First, a background estimation and structural adaptive change detection process detects movements and other changes in the observed scene. Afterwards, the local density of changes is computed and used both for background density learning and for building up the foreground model; the two models are compared to obtain the final UAV alarm result. On the one hand, the density model is used to filter out noise effects. On the other hand, moving scene parts such as leaves swaying in the wind or cars driving on a street can easily be learned in order to mask such areas out and suppress false alarms there. This scene learning is performed automatically, simply by processing the scene without UAVs present in order to capture the normal situation. The presented results document the performance of the approach in VIS and SWIR in different situations.
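
The abstract describes the processing chain only in words; the sketch below illustrates how such a density-based pipeline could look, assuming OpenCV's MOG2 background subtractor as a stand-in for the paper's background estimation and structural adaptive change detection, a box-filtered change density map, and an exponentially learned per-pixel background density. All names, thresholds, and window sizes (KERNEL, ALPHA, MARGIN, learn_background, detect_uav) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a density-based change-detection pipeline for static
# VIS/SWIR cameras (illustrative only, not the authors' implementation).
import cv2
import numpy as np

KERNEL = 31          # side length of the local density window (pixels), assumed
ALPHA = 0.02         # learning rate for the background density model, assumed
MARGIN = 0.05        # density margin above the learned background before alarm, assumed

bg_subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
bg_density = None    # learned per-pixel density of "normal" scene changes


def local_change_density(frame_gray):
    """Detect changes against the background model and smooth them into a
    local change-density map in [0, 1]."""
    changes = bg_subtractor.apply(frame_gray)          # 0/255 change mask
    changes = (changes > 0).astype(np.float32)
    return cv2.boxFilter(changes, -1, (KERNEL, KERNEL))


def learn_background(frame_gray):
    """Calibration phase: update the background density model on frames known
    to contain no UAVs (moving leaves, street traffic, sensor noise)."""
    global bg_density
    density = local_change_density(frame_gray)
    if bg_density is None:
        bg_density = density
    else:
        bg_density = (1.0 - ALPHA) * bg_density + ALPHA * density


def detect_uav(frame_gray):
    """Detection phase: raise an alarm where the current change density
    clearly exceeds the learned background density."""
    density = local_change_density(frame_gray)
    return density > (bg_density + MARGIN)
```

In such a setup, learn_background would be run on a calibration sequence captured without UAVs, so that areas with habitual motion acquire a high background density and are effectively masked out; detect_uav is then applied to live VIS or SWIR frames, and connected regions in the returned alarm mask mark candidate drone positions.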

URL: http://publica.fraunhofer.de/dokumente/N-459058.html