
Acoustic Ripeness Classification for Watermelon Fruits using Convolutional Neural Networks

CNN for Ripeness Classification of Watermelon Fruits based on Acoustic Testing
Albert-Weiss, Dominique; Hajdini, Egla; Heinrich, Matthias; Osman, Ahmad

Fulltext (PDF)

Maldague, Xavier P.V. (Chair):
3rd International Symposium on Structural Health Monitoring and Nondestructive Testing, SHM-NDT 2020. Online resource: 25-26 November 2020, Quebec, Canada, virtual
Online im WWW, 2020
10 pp.
International Symposium on Structural Health Monitoring and Nondestructive Testing (SHM-NDT) <3, 2020, Online>
Conference Paper, Electronic Publication
Fraunhofer IZFP
Acoustic Resonance Testing (ART); Convolutional Neural Networks; food monitoring

A pivotal topic in food science and monitoring is the assessment of the quality and ripeness of agricultural products using nondestructive testing techniques. Traditionally, these techniques have been combined with classical statistical methods to infer biochemical properties. However, such statistical methods are insufficient, as they do not capture the complexity inherent in working with natural products. While deep learning has earned wide recognition by surpassing state-of-the-art benchmarks, it has attracted interest in nondestructive testing only in recent years. Its rise in popularity can largely be attributed to its ability to learn representations of the data, extracting features in the latent space that cannot be determined directly. In this paper, we study the change in ripeness and shelf life of watermelon fruits by applying deep learning to acoustic data obtained through acoustic resonance testing. We describe the architecture of a deep convolutional neural network that distinguishes ripe from overripe watermelon fruits. The network was trained on acoustic information from the spectral domain as well as on morphologic and experiment-based features. Our model achieves a classification accuracy of 96%.
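The pipeline the abstract outlines — convolutional features extracted from an acoustic spectrum, concatenated with morphologic features, and fed to a binary ripe/overripe classifier — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' architecture: the kernel count, feature names, and untrained random weights are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, kernels, bias):
    """Valid 1-D convolution with ReLU: x (n,), kernels (k, w) -> (k, n-w+1)."""
    k, w = kernels.shape
    out = np.empty((k, x.size - w + 1))
    for i in range(out.shape[1]):
        out[:, i] = kernels @ x[i:i + w] + bias
    return np.maximum(out, 0.0)

def classify(spectrum, morph_feats, params):
    """Convolutional features from the spectrum, concatenated with
    morphologic features, passed to a logistic output (ripe vs. overripe)."""
    fmap = conv1d_relu(spectrum, params["kernels"], params["bias"])
    pooled = fmap.max(axis=1)                    # global max pooling
    z = np.concatenate([pooled, morph_feats])    # fuse acoustic + morphologic
    logit = z @ params["w"] + params["b"]
    return 1.0 / (1.0 + np.exp(-logit))          # P("overripe")

# Hypothetical, untrained parameters for illustration only.
params = {
    "kernels": rng.normal(size=(4, 16)),   # 4 conv filters of width 16
    "bias": rng.normal(size=4),
    "w": rng.normal(size=4 + 2),           # 4 pooled + 2 morphologic features
    "b": 0.0,
}
spectrum = rng.normal(size=256)            # stand-in for an acoustic spectrum
morph = np.array([0.8, 0.3])               # e.g. normalized mass and diameter
p = classify(spectrum, morph, params)      # probability in (0, 1)
```

In a real system, the spectrum would come from acoustic resonance testing (e.g. an FFT of the recorded response), and the weights would be learned from labeled fruit samples rather than drawn at random.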