Estimating Efforts and Success of Symmetry-Seeing Machines by Use of Synthetic Data

Authors: Michaelsen, Eckart; Vujasinovic, Stéphane

Fulltext urn:nbn:de:0011-n-5346783 (621 KByte PDF)
MD5 Fingerprint: a8ff00b7029100ac02dae66be84c8e77
License: CC BY
Created on: 21.2.2019

Symmetry 11 (2019), No.2, Art. 227, 16 pp.
ISSN: 2073-8994
Journal Article, Electronic Publication
Fraunhofer IOSB

Representative input data are a necessary requirement for assessing machine-vision systems. For symmetry-seeing machines in particular, such imagery should contain symmetries as well as asymmetric clutter, and reliable ground truth must accompany the data. It should also be possible to estimate recognition performance and computational effort by providing different grades of difficulty and complexity. Recent competitions used real imagery labeled by human subjects to obtain ground truth. The paper at hand proposes using synthetic data instead. Such data contain symmetry, clutter, and nothing else; this is preferable because interference from other perceptive capabilities, such as object recognition or prior knowledge, is avoided. The data are given sparsely, i.e., as sets of primitive objects. However, images can be generated from them, so the same data can also be fed into machines requiring dense input, such as multilayered perceptrons. Sparse representations are preferred because the authors' own system requires such data, and in this way any influence of the primitive-extraction method is excluded. The presented format allows hierarchies of symmetries. This is important because hierarchy constitutes a natural and dominant part of symmetry-seeing. The paper reports experiments using the authors' Gestalt algebra system as the symmetry-seeing machine. Also included is a comparative test run with the state-of-the-art symmetry-seeing deep-learning convolutional perceptron of the PSU. The computational efforts and recognition performance are assessed.
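The idea of sparse synthetic data with ground truth, as described in the abstract, can be illustrated with a minimal sketch. This is not the authors' actual data format; all names and parameters below are hypothetical. It generates a set of point primitives containing mirror-symmetric pairs plus asymmetric clutter, keeps per-primitive ground-truth labels, and rasterizes the sparse set into a dense image suitable for a convolutional network:

```python
import numpy as np

def make_symmetric_scene(n_pairs=8, n_clutter=20, size=64, seed=0):
    """Sketch of a sparse synthetic scene: n_pairs point primitives
    mirrored about the vertical axis x = size/2, plus asymmetric clutter.
    Returns (points, labels), where label 1 marks symmetric primitives
    and label 0 marks clutter (the ground truth)."""
    rng = np.random.default_rng(seed)
    # Primitives on the left half of the canvas ...
    half = rng.uniform([0.0, 0.0], [size / 2, size], size=(n_pairs, 2))
    # ... and their mirror images, reflected across x = size/2.
    mirrored = half.copy()
    mirrored[:, 0] = size - half[:, 0]
    # Asymmetric clutter anywhere on the canvas.
    clutter = rng.uniform(0.0, size, size=(n_clutter, 2))
    points = np.vstack([half, mirrored, clutter])
    labels = np.array([1] * (2 * n_pairs) + [0] * n_clutter)
    return points, labels

def rasterize(points, size=64):
    """Render the sparse primitives as a dense binary image, so the
    same scene can feed machines that require dense input."""
    img = np.zeros((size, size), dtype=np.uint8)
    ij = np.clip(points.astype(int), 0, size - 1)
    img[ij[:, 1], ij[:, 0]] = 1  # row index = y, column index = x
    return img

points, labels = make_symmetric_scene()
img = rasterize(points)
```

Because the scene is generated rather than labeled by humans, the ground truth is exact by construction, and difficulty can be graded simply by varying the clutter-to-symmetry ratio.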