
Discrimination of plants and weed by multi-sensor fusion on an agricultural robot

Frese, Christian; Meyer, J.; Frey, Christian

Full text: urn:nbn:de:0011-n-2986054 (1.0 MByte PDF)
MD5 fingerprint: 1440de8799a4239358c70d22e94e1ac7
Created on: 15.07.2015

International Conference of Agricultural Engineering, AgEng 2014. Online resource (no longer available online): Zurich, 6-10 July 2014; Proceedings
Zurich, 2014
8 pp.
International Conference of Agricultural Engineering (AgEng) <2014, Zurich>
Bundesministerium für Bildung und Forschung (BMBF; Federal Ministry of Education and Research)
01IM12002C; AgriApps
Conference paper, electronic publication
Fraunhofer IOSB
robotics; multi-sensor fusion; weed control

Weed control is in many cases a cost-intensive operation, requiring either manual work to mechanically remove the weeds or herbicides, which may harm the crop and the environment in general. Agricultural robotics has the potential to make mechanical weed control more cost-effective, allowing more frequent application. Herbicides thereby become largely dispensable, which benefits both crop and environment.
This contribution presents an approach to automatically classify crop plants and weed based on multi-sensor information with the aim of mechanically removing the weed by an agricultural robot. First, a possible sensor setup and its calibration are outlined. Then, different features are computed from the sensor data and the discriminative power of the features is evaluated. The feature vector is input to a support vector machine classifier. The result is a 3D representation of plants and weed which can be used for automatic weed removal. The mechanical manipulation system which is developed by our project partners is outside the scope of this paper.
As an example application, tree nurseries are considered, especially the cultivation of boxwood trees. Boxwood grows very slowly and requires repeated weed control over several years, so the possible savings from automation are considerable. Typically, the boxwood plants are arranged in rows. Our outdoor robot IOSB.amp O1 can drive along such rows, observing the plants from above and recording sensor data of real boxwood trees.
The current sensor setup consists of a laser scanner and a color camera. The laser scanner measures range values within an inclined plane. As the robot moves forward, the data can be aggregated into a 3D model. To increase model quality, the roll and pitch angles of the robot have to be estimated and compensated.
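The aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the simple forward-motion model, the function names, and the pose representation (forward offset plus roll and pitch angles) are all assumptions.

```python
import numpy as np

def rotation_matrix(roll, pitch):
    """Rotation compensating the robot's roll (about x) and pitch (about y)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    return Ry @ Rx

def aggregate_scans(scans, poses):
    """Transform each scan line (N x 3 points in the scanner frame) into a
    common world frame using the estimated pose (forward offset, roll, pitch),
    and stack the results into one 3D point cloud."""
    cloud = []
    for points, (forward, roll, pitch) in zip(scans, poses):
        R = rotation_matrix(roll, pitch)
        world = points @ R.T + np.array([forward, 0.0, 0.0])
        cloud.append(world)
    return np.vstack(cloud)
```

With roll and pitch estimated (e.g. from an inertial sensor), each inclined scan plane is rotated back into the world frame before being shifted by the robot's forward motion, so uneven terrain does not distort the accumulated 3D model.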
The sensors are calibrated so that for each 3D point observed by the laser scanner the corresponding pixel in the camera image is known. Thus both intensity and range information can be used to extract features for classification. Several classes of features and their parameters have been evaluated on real sensor data: range features, color features, and texture features. The most discriminative features are selected for the classification, among them local range variance, normalized color information, Laws' texture energy features, and histograms of oriented gradients. A support vector machine is trained on hand-labeled data. The considered classes are boxwood, weed, and soil. The classification result can be incorporated in the 3D model so that the class of each surface point is known.
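The feature-vector-to-class step could look like the following scikit-learn sketch. The three-component feature vector and all numeric values are synthetic placeholders standing in for the features named above, not the paper's actual data or parameters.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative feature vectors: [local range variance, normalized green, texture energy].
# The values are invented placeholders, not real sensor measurements.
X_train = np.array([
    [0.02, 0.80, 0.30],   # boxwood: dense, uniform canopy
    [0.03, 0.75, 0.35],
    [0.10, 0.60, 0.70],   # weed: irregular height and texture
    [0.12, 0.55, 0.65],
    [0.01, 0.20, 0.10],   # soil: flat, low green component
    [0.01, 0.25, 0.12],
])
y_train = ["boxwood", "boxwood", "weed", "weed", "soil", "soil"]

# Scale features, then fit an RBF-kernel support vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

# Classify the feature vector of a new surface point.
print(clf.predict([[0.02, 0.78, 0.32]]))
```

In the paper's pipeline, each 3D surface point would get such a feature vector from the fused laser and camera data, and the predicted class label would be attached to the point in the 3D model.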
The classification of the boxwood plants is very reliable, whereas weed and soil are sometimes confused, especially if they are in close vicinity. However, this is uncritical, as the mechanical manipulator always treats a region slightly larger than the detected weed area.
The resulting data can be used for mechanical weed control as well as for other applications, such as cutting the boxwood trees into the desired (spherical) shape or assessing the health of the plants.