
Validation of EOSAR algorithms for blending of objects in SAR-scenes

Presentation held at ISIP 2015, International Symposium on Indirect Protection, 20th-23rd October 2015, Bad Reichenhall
EOSAR - Embedding Objects in SAR-Scenes
Authors: Hough, Luke; Maresch, Anika; Stanko, Stephan; Frioud, Max; Wellig, Peter

Presentation urn:nbn:de:0011-n-4132688 (786 KByte PDF)
MD5 Fingerprint: eca628eefaea3b5b6ab272089ebedce9
Created on: 30.9.2016

2015, 11 slides
International Symposium on Indirect Protection (ISIP) <2015, Bad Reichenhall>
Lecture, electronic publication
Fraunhofer FHR

The EOSAR software is a joint project of Fraunhofer FHR and the University of Zurich, financed by Wehrtechnische Dienststelle 52, Oberjettenberg, and armasuisse, Thun. The goal of the EOSAR project is to develop the algorithms and tools necessary to inject or blend objects from a variety of sources into existing SAR imagery. By blending objects into existing scenes, more data can be provided for training imagery analysts or recognition algorithms without requiring expensive SAR measurement campaigns. While other injection approaches rely on image processing techniques, EOSAR injects the objects while maintaining phase information by blending complex signal data. Operating on complex signal data should lead to more realistic results and will provide all information necessary for recognition algorithm applications. The data required for the blending method can be obtained through turntable ISAR measurements or RCS simulation applications. Previous work has shown that blending with complex data is feasible; however, the accuracy of the resulting SAR data has not yet been thoroughly evaluated.

The current research focuses on identifying and implementing appropriate methods for validating the EOSAR blending algorithm. The validation methods under consideration include quantitative image and signal properties as well as qualitative testing by trained imagery analysts. The proposed validation procedure consists of multiple stages based on the accuracy required to meet each criterion. The first level of characteristics is based on properties of the focused SAR image: the orientation, dimensions, and reflectivity of the blended object and simulated shadow area relative to the same object in the genuine scene. At the second level, complex signal characteristics such as the filtered bandwidth of the blended object relative to the scene and phase irregularities at the boundary will be evaluated.
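The central idea, blending on the complex samples so that phase information survives, can be sketched minimally as follows. The function name `blend_complex`, the coherent-addition model, and all array sizes are illustrative assumptions, not the actual EOSAR implementation, which must additionally match the filtered bandwidth and processing history of the two data sources:

```python
import numpy as np

def blend_complex(scene, obj_chip, row, col, alpha=1.0):
    """Blend a complex object chip into a complex SAR scene.

    Unlike image-domain pasting of magnitudes, the operation acts on
    the complex samples, so the phase of both scene and object is kept.
    Illustrative sketch only (hypothetical helper, not the EOSAR code).
    """
    out = scene.copy()
    h, w = obj_chip.shape
    # Coherent superposition: complex addition preserves amplitude and phase
    out[row:row + h, col:col + w] += alpha * obj_chip
    return out

# Toy example: a 64x64 complex speckle "scene" and an 8x8 bright "object"
rng = np.random.default_rng(0)
scene = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
obj = 10.0 * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=(8, 8)))
blended = blend_complex(scene, obj, 20, 30)
```

Because the chip is added rather than magnitude-pasted, the speckle of the surrounding scene interferes coherently with the object, which is the behavior the validation stages below have to examine.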
These signal-level properties depend on the signal processing operations applied to each data source and may not be easily observable in the focused image. The final stage will incorporate both evaluation with change detection algorithms and qualitative analysis by a trained human observer. The qualitative evaluation by an imagery analyst will provide a measure of the EOSAR success rate in deceiving the observer: successful deception of the trained observer will indicate that the results of the blending algorithm are acceptable for training other analysts. The highest level of accuracy will be evaluated by the change detection algorithms. To achieve this level of accuracy, the EOSAR tools should be capable of extracting a genuine target from a scene and injecting the same object from a different data source without the result appearing as an alteration to the change detection algorithm.
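The highest validation level can be illustrated with a simple incoherent detector. This log-amplitude-ratio sketch is a hypothetical stand-in for the (unspecified) change detection algorithms; both function names and the threshold value are assumptions, and operational detectors, e.g. coherent change detection on the complex data, are considerably stricter:

```python
import numpy as np

def log_ratio_change_map(reference, test, eps=1e-9):
    """Pixel-wise absolute log-amplitude ratio of two co-registered images.

    Hypothetical stand-in for a change detection algorithm: values near
    zero mean no detectable change at that pixel.
    """
    return np.abs(np.log10((np.abs(test) + eps) / (np.abs(reference) + eps)))

def flags_alteration(reference, test, threshold=0.5):
    """True if the change map exceeds the threshold anywhere."""
    return bool(np.any(log_ratio_change_map(reference, test) > threshold))

# An unaltered copy passes; a patch ten times brighter is flagged
base = np.full((8, 8), 1.0 + 0.0j)
altered = base.copy()
altered[2:4, 2:4] *= 10.0
```

Under this toy criterion, an extracted-and-reinjected target would pass only if its amplitude closely matches the genuine object it replaces; a phase-sensitive coherent detector would additionally test the boundary-phase continuity discussed above.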