AI-based Detection and Quantification of Soil Adhesion, Excess Vegetation, Damage and Rot on Sugar Beets

Translated title: AI-basierte Detektion und Quantifizierung von Erdanhang, Vegetation, Beschädigungen und Fäulnis an Zuckerrüben

Publication: Poster presentation without contribution in conference proceedings; peer-reviewed



As a result of harvesting and natural factors, sugar beets display varying degrees of soil adhesion, excess vegetation, cuts, cracks, and scratches, causing varying degrees of deterioration during storage. To minimize the negative impact on sugar extraction, efficient quality assessment is crucial. We propose an AI-based approach to detect and quantify soil adhesion and excess vegetation on sugar beets using only color images. For this purpose, we present a novel dataset for which we harvested and photographed 1180 sugar beets under varying field and environmental conditions. To precisely locate the beets, soil adhesion, cutting surfaces, damage and rot, each sugar beet is annotated with a semantic mask for two opposite views. Thus, each image pixel is assigned to one of seven classes: background, beet, soil, vegetation, cut, damage, and rot. Additionally, we measure the overall weight and the weight of excess soil and excess vegetation for 50 sugar beets. We conduct comprehensive machine learning experiments by training many combinations of neural networks and encoders for semantic segmentation. The models are trained using a random split of 75% of samples, while 15% are used for validation. The remaining 10% are withheld to measure model performance on unseen image examples. Network performance is measured by determining the average mean intersection over union (mIoU) between inferred and annotated pixel labels. Furthermore, the best performing model is used to measure the soil and vegetation regions of the depicted sugar beets. Area normalization is performed using the known dimensions of a reference object. To obtain weight estimators, multiple polynomial models are fitted to the measured areas and their corresponding measured weights. The results of our experiments show that the best performing neural network is MANet, using a VGG19 encoder and pre-training on ImageNet, resulting in an overall mIoU of 0.71. 
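The mIoU evaluation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation; the seven-class count matches the dataset description, while the toy masks below are made-up examples:

```python
import numpy as np

def mean_iou(pred, target, num_classes=7):
    """Mean intersection over union between predicted and annotated label masks.

    Classes absent from both masks are skipped so they do not distort the mean.
    """
    ious = []
    for c in range(num_classes):
        p = pred == c
        t = target == c
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue  # class appears in neither mask
        inter = np.logical_and(p, t).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x2 label masks (illustrative only)
pred = np.array([[0, 1], [1, 2]])
target = np.array([[0, 1], [2, 2]])
print(round(mean_iou(pred, target), 3))
```

In practice the per-image mIoU values would be averaged over the held-out 10% test split to obtain the reported overall score.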
All described classes are sufficiently learned by the model, except for the rot class, whose samples are significantly underrepresented in the dataset. The weight estimations achieve average error margins of 1.8% (beet), 33.5% (soil) and 19.4% (vegetation) and average absolute error margins of 10.3% (beet), 56.3% (soil) and 38.0% (vegetation). This indicates that, averaged over increasing numbers of sugar beets, total weight can be estimated accurately, while our current method for estimating vegetation and soil weight results in larger error margins. In future work, we aim to improve our method by extending our dataset, implementing a geometric model for sugar beets, and developing hybrid models that simultaneously estimate the segmentation and weights.
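The area-to-weight step, area normalization via a reference object of known dimensions followed by a polynomial fit, can be sketched as follows. All numbers here (reference size, pixel areas, weights) are hypothetical placeholders for illustration, not measurements from the dataset:

```python
import numpy as np

# Area normalization: a reference object of known real-world size gives
# the cm^2-per-pixel scale for each image (values are assumptions).
ref_area_cm2 = 25.0      # assumed physical area of the reference object
ref_area_px = 10_000.0   # assumed pixel area of that object in the image
scale = ref_area_cm2 / ref_area_px

# Segmented soil areas (pixels) and measured soil weights (g) -- made up
soil_areas_px = np.array([200_000.0, 320_000.0, 480_000.0, 640_000.0, 840_000.0])
soil_areas_cm2 = soil_areas_px * scale
soil_weights_g = np.array([85.0, 154.0, 274.0, 426.0, 661.0])

# Fit a degree-2 polynomial mapping normalized area -> weight
coeffs = np.polyfit(soil_areas_cm2, soil_weights_g, deg=2)
estimate = np.poly1d(coeffs)

# Estimated soil weight for a new sample with 1000 cm^2 of visible soil
print(round(float(estimate(1000.0)), 1))
```

In the study, separate estimators of this kind would be fitted for beet, soil, and vegetation using the 50 samples with measured weights; the polynomial degree is a modeling choice to be validated against held-out samples.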
Publication status: Published - 27 Feb 2024
Event: 79th IIRB Congress: Innovation: our driver for a profitable and ecologically balanced sugar beet production - Hotel Plaza, Brussels, Belgium
Duration: 27 Feb 2024 - 28 Feb 2024


Conference: 79th IIRB Congress

Research Field

  • Assistive and Autonomous Systems


Keywords

  • artificial intelligence
  • machine learning
  • precision agriculture
  • sugar beets

Web of Science subject categories (JCR Impact Factors)

  • Agriculture, Multidisciplinary
  • Computer Science, Artificial Intelligence


