2024
Conference Paper
Title
Fusion between Event-Based and Line-Scan Camera for Sensor Based Sorting
Abstract
In sensor-based sorting systems, there is usually a time delay between the detection and separation of the material stream. This delay is required for the sensor data to be processed, i.e., to identify the objects that should be ejected. During this blind phase, the material stream continues to move. Most current systems assume homogeneous movement for all objects and time the actuation accordingly. However, in many cases this assumption does not hold, for example, when unknown foreign materials with varying densities and shapes are present, leading to inaccurate activation of the separation actuators and, in turn, lower sorting quality. Minimizing the blind phase by reducing the distance between the sensor and the actuator is limited by the processing time of the detection process and may lead to interference between actuation and sensing. In this work, we address these issues by placing an event-based camera between the sensor and actuator stages to track objects during the blind phase with minimal latency and small temporal increments between tracking steps. In our proposed setup, the event-based camera is used exclusively for tracking, while an RGB line-scan camera is used for classification. We propose and evaluate several approaches to combining the information of the two cameras. We benchmark our approach against the traditional method of using a fixed temporal offset by comparing simulated valve activation. Our method shows a drastic improvement in accuracy for our example application, raising the percentage of correctly deflected objects to 99.2%, compared to 78.57% without tracking.
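The difference between the two actuation strategies the abstract compares can be sketched as follows. This is a minimal illustrative example, not code from the paper: all function names, parameters, and numbers are assumptions. The fixed-offset baseline schedules the valve from a single assumed belt speed, while the tracking-based variant extrapolates from the object's last observed position and velocity shortly before it reaches the actuator.

```python
def activation_time_fixed(t_detect, sensor_actuator_distance, assumed_speed):
    """Baseline: every object is assumed to move at the same speed,
    so the valve fires after a fixed temporal offset from detection."""
    return t_detect + sensor_actuator_distance / assumed_speed


def activation_time_tracked(t_last_track, remaining_distance, tracked_speed):
    """Tracking-based: use the object's last tracked velocity to
    extrapolate when it will reach the actuator."""
    return t_last_track + remaining_distance / tracked_speed


# Example: an object detected at t=0 s, with 0.5 m between sensor and
# actuator. The system assumes 2.0 m/s, but the object actually slowed
# to 1.6 m/s; tracking picks it up 0.1 m before the actuator at t=0.25 s.
t_fixed = activation_time_fixed(0.0, 0.5, 2.0)      # 0.25 s (too early)
t_tracked = activation_time_tracked(0.25, 0.1, 1.6)  # 0.3125 s
```

Under the fixed offset, the valve fires while the slowed object is still upstream; the tracked estimate shifts the activation to match the object's actual arrival.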
Author(s)