Maier, Georg; Shevchyk, Anja; Flitter, Merle; Gruna, Robin; Längle, Thomas; Hanebeck, Uwe D.; Beyerer, Jürgen

Title: Motion-based visual inspection of optically indiscernible defects on the example of hazelnuts
Type: Journal article
Published: 2021
Deposited: 2022-03-06
Handle: https://publica.fraunhofer.de/handle/publica/267474
DOI: 10.1016/j.compag.2021.106147
Language: en
Keywords: object trajectory; motion classification; sensor-based sorting; impact-acoustic
Classification: 004; 630; 670

Abstract: Automatic quality control has long been an integral part of the processing of food and agricultural products. Visual inspection offers solutions for many issues in this context and can be employed in the form of sensor-based sorting to automatically remove foreign and low-quality entities from a product stream. However, these methods are limited to defects that can be made visible by the employed sensor, which usually restricts the system to defects appearing on the surface. An alternative non-visual solution lies in impact-acoustic methods, which do not suffer from this constraint. However, these are strongly limited in terms of material throughput and are consequently not suitable for large-scale industrial application. In this paper, we present a novel approach that performs inspection based on optically acquired motion data. A high-speed camera captures image sequences of test objects during a transportation process on a chute with a specifically structured surface. The trajectory data is then used to classify test objects based on their motion behavior. The approach is evaluated experimentally on the example of distinguishing defect-free hazelnuts from ones that suffer from insect damage. Results show that by merely utilizing the motion data, a recognition rate of up to for undamaged hazelnuts can be achieved. A major advantage of our approach is that it can be integrated into sensor-based sorting systems and is suitable for high-throughput applications.
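The core idea summarized in the abstract, classifying objects by features computed from their observed trajectories, can be illustrated with a minimal sketch. The feature set (mean speed, speed variability, vertical direction changes as a proxy for bouncing) and the toy trajectories below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def trajectory_features(points):
    """Compute simple motion features from a 2D trajectory.

    points: array of shape (T, 2) holding per-frame (x, y) positions.
    Returns [mean speed, speed std, vertical direction changes].
    """
    pts = np.asarray(points, dtype=float)
    vel = np.diff(pts, axis=0)                 # per-frame displacement
    speed = np.linalg.norm(vel, axis=1)        # per-frame speed
    # Sign changes of the vertical velocity hint at bouncing behavior.
    sign_changes = int(np.count_nonzero(np.diff(np.sign(vel[:, 1])) != 0))
    return np.array([speed.mean(), speed.std(), sign_changes])

# Toy trajectories: a smooth slide down the chute vs. an erratic,
# bouncing path (hypothetical stand-ins for real tracked objects).
t = np.linspace(0.0, 1.0, 50)
smooth = np.stack([t, 0.5 * t], axis=1)
bouncy = np.stack([t, 0.1 * np.abs(np.sin(20.0 * t))], axis=1)

f_smooth = trajectory_features(smooth)
f_bouncy = trajectory_features(bouncy)
```

In a full pipeline, such feature vectors would feed a standard classifier trained on labeled trajectories; here the bouncing trajectory is already separable from the smooth one by its direction-change count alone.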