Visual words for automated visual inspection of bulk materials

 
Authors: Richter, M.; Längle, Thomas; Beyerer, Jürgen

Full text: urn:nbn:de:0011-n-3562308 (282 KByte PDF)
MD5 fingerprint: ba27aa6781a15f859a7c4821937ac561
Created on: 5.5.2016


International Association for Pattern Recognition -IAPR-; Institute of Electrical and Electronics Engineers -IEEE-:
Fourteenth IAPR International Conference on Machine Vision Applications, MVA 2015. Proceedings: May 18-22, 2015, Tokyo, Japan
Piscataway, NJ: IEEE, 2015
ISBN: 978-4-901122-15-3
pp. 210-213
International Conference on Machine Vision Applications (MVA) <14, 2015, Tokyo>
English
Conference paper, electronic publication
Fraunhofer IOSB

Abstract
The inspection of bulk materials in mining, recycling and food safety places strong requirements on the speed, accuracy and flexibility of automated visual inspection systems. State-of-the-art methods utilize complex feature descriptors and off-the-shelf machine learning techniques. These methods achieve highly accurate results, but typically suffer in execution speed. Commercial systems, on the other hand, use simple features and classifiers to achieve great processing speed, but pay for it with a complicated initialization procedure and suboptimal classification accuracy. In this paper, we propose to bridge the gap between the two extremes by learning high-level object representations that can be used with simple classifiers. To that end, we adapt the well-known bag of visual words method to use dense sampling and primitive features. The resulting descriptors are very fast to compute and invariant to scale and rotation. At the same time, the method is virtually parameter-free. This allows non-experts to initialize and operate sorting systems based on this approach. We evaluate our method on three food inspection applications. In all experiments we achieve highly accurate, sometimes nearly perfect classification. Comparison with a state-of-the-art method shows that our approach is superior, beating it by a large margin.
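The general pipeline outlined in the abstract (densely sampled primitive local features, quantized into visual-word histograms, fed to a simple classifier) can be sketched roughly as follows. This is not the authors' implementation: the patch size, codebook size, the mean/standard-deviation color features, the k-means codebook and the linear SVM are illustrative assumptions, not details taken from the paper.

```python
# Minimal bag-of-visual-words sketch with dense sampling and primitive
# per-patch features. All parameter choices below are assumptions for
# illustration, not values from the paper.
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import LinearSVC

def dense_primitive_features(image, patch=8):
    """Densely sample non-overlapping patches and compute primitive
    features (per-channel mean and standard deviation)."""
    h, w, c = image.shape
    feats = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            p = image[y:y + patch, x:x + patch].reshape(-1, c)
            feats.append(np.concatenate([p.mean(axis=0), p.std(axis=0)]))
    return np.asarray(feats)

def bovw_histogram(features, codebook):
    """Quantize local features against the codebook and return a
    normalized visual-word histogram as the object descriptor."""
    words = codebook.predict(features)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

def train(images, labels, n_words=64, patch=8):
    """Learn the visual-word codebook on all local features, then fit a
    simple linear classifier on the resulting histograms."""
    local = [dense_primitive_features(img, patch) for img in images]
    codebook = MiniBatchKMeans(n_clusters=n_words, n_init=3,
                               random_state=0).fit(np.vstack(local))
    X = np.array([bovw_histogram(f, codebook) for f in local])
    clf = LinearSVC().fit(X, labels)
    return codebook, clf

def predict(image, codebook, clf, patch=8):
    """Classify a single object image with the trained pipeline."""
    feats = dense_primitive_features(image, patch)
    return clf.predict([bovw_histogram(feats, codebook)])[0]
```

Because the histogram pools patch statistics over the whole object and discards patch positions, the resulting descriptor is cheap to compute and largely insensitive to object scale and orientation, which is the property the abstract emphasizes for high-throughput sorting.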

URL: http://publica.fraunhofer.de/dokumente/N-356230.html