Sugar Beet Damage Detection during Harvesting Using Different Convolutional Neural Network Models

dc.date.accessioned: 2022-01-05T12:32:10Z
dc.date.available: 2022-01-05T12:32:10Z
dc.date.issued: 2021-11-09
dc.description.sponsorship: Funded by the publication fund of the University of Kassel (Publikationsfonds der Universität Kassel)
dc.identifier: doi:10.17170/kobra-202201045358
dc.identifier.uri: http://hdl.handle.net/123456789/13486
dc.language.iso: eng
dc.relation.doi: doi:10.3390/agriculture11111111
dc.rights: Attribution 4.0 International (Namensnennung 4.0 International)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: convolutional neural network
dc.subject: damage
dc.subject: deep learning
dc.subject: harvester
dc.subject: sugar beet
dc.subject.ddc: 600
dc.subject.ddc: 630
dc.subject.swd: Convolutional Neural Network
dc.subject.swd: Zuckerrübe (sugar beet)
dc.subject.swd: Schaden (damage)
dc.subject.swd: Erntemaschine (harvester)
dc.subject.swd: Deep learning
dc.subject.swd: Fehlererkennung (fault detection)
dc.title: Sugar Beet Damage Detection during Harvesting Using Different Convolutional Neural Network Models
dc.type: Aufsatz (article)
dc.type.version: publishedVersion
dcterms.abstract: Mechanical damage to sugar beet during harvesting affects the quality of the final products and the sugar yield. Mechanical damage is currently assessed only at random by harvester operators, and, owing to the complexity of the harvester machines, the assessment can depend on the operator's subjective opinion and experience. The main aim of this study was therefore to determine whether a digital two-dimensional imaging system coupled with convolutional neural network (CNN) techniques could be used to detect visible mechanical damage to sugar beet during harvesting on a harvester machine. Various CNN-based detector models, including You Only Look Once (YOLO) v4, the region-based fully convolutional network (R-FCN) and faster regions with convolutional neural network features (Faster R-CNN), were developed. Images of sugar beet acquired during harvesting on a harvester under different farming conditions were used for training and validation of the proposed models. The experimental results showed that the YOLO v4 CSPDarknet53 method detected damage to sugar beet with better performance (recall, precision and F1-score of about 92%, 94% and 93%, respectively) and at a higher speed (around 29 frames per second) than the other developed CNNs. With such a CNN-based vision system, sugar beet damage could be detected automatically within the sugar beet harvester machine.
dcterms.accessRights: open access
dcterms.creator: Nasirahmadi, Abozar
dcterms.creator: Wilczek, Ulrike
dcterms.creator: Hensel, Oliver
dcterms.source.articlenumber: 1111
dcterms.source.identifier: eissn:2077-0472
dcterms.source.issue: Issue 11
dcterms.source.journal: Agriculture
dcterms.source.volume: Volume 11
kup.iskup: false
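
The abstract describes a CNN-based vision system in which a YOLO v4 (CSPDarknet53) detector flags damaged beets in camera frames taken on the harvester. The snippet below is a minimal sketch of what single-frame inference with a Darknet-format YOLOv4 network could look like using OpenCV's DNN module; the file names, class labels and thresholds are illustrative assumptions, not the authors' actual configuration or code.

import cv2
import numpy as np

# Hypothetical paths and label set (assumptions; the trained network from the paper is not bundled here).
CFG = "yolov4_sugarbeet.cfg"
WEIGHTS = "yolov4_sugarbeet.weights"
CLASSES = ["damaged", "undamaged"]

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
out_layers = net.getUnconnectedOutLayersNames()

def detect_damage(frame, conf_thresh=0.5, nms_thresh=0.4):
    """Return (label, confidence, [x, y, w, h]) tuples for one BGR camera frame."""
    h, w = frame.shape[:2]
    # Darknet YOLO expects a normalized, square RGB input blob.
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(out_layers)

    boxes, scores, class_ids = [], [], []
    for output in outputs:
        for det in output:  # det = [cx, cy, bw, bh, objectness, class scores...]
            class_scores = det[5:]
            class_id = int(np.argmax(class_scores))
            confidence = float(det[4] * class_scores[class_id])
            if confidence < conf_thresh:
                continue
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            scores.append(confidence)
            class_ids.append(class_id)

    # Non-maximum suppression removes overlapping candidate boxes.
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thresh, nms_thresh)
    return [(CLASSES[class_ids[i]], scores[i], boxes[i]) for i in np.array(keep).flatten()]

For example, detect_damage(cv2.imread("frame.png")) would return the detections for a saved frame; in the setting of the paper, frames would instead come from the camera mounted inside the harvester.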

Files

Original bundle
Name: agriculture_11_01111.pdf
Size: 2.87 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 3.03 KB
Format: Item-specific license agreed upon to submission