Details

Author(s) / Contributors
Title
Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture
Is Part of
  • Journal of field robotics, 2020-01, Vol.37 (1), p.7-19
Place / Publisher
Hoboken: Wiley Subscription Services, Inc
Year of Publication
2020
Source
Wiley Online Library
Descriptions/Notes
  • Agricultural robots rely on semantic segmentation to distinguish between crops and weeds, so that they can perform selective treatments and increase yield and crop health while reducing the amount of chemicals used. Deep‐learning approaches have recently achieved both excellent classification performance and real‐time execution. However, these techniques also rely on a large amount of training data and a substantial labeling effort, both of which are scarce in precision agriculture. Additional design efforts are required to achieve commercially viable performance levels under varying environmental conditions and crop growth stages. In this paper, we explore the role of knowledge transfer between deep‐learning‐based classifiers for different crop types, with the goal of reducing the retraining time and labeling effort required for a new crop. We examine the classification performance on three datasets with different crop types and a variety of weeds, and compare the performance and retraining effort required when using data labeled at pixel level with partially labeled data obtained through the less time‐consuming procedure of annotating the segmentation output. We show that transfer learning between different crop types is possible and reduces training times by up to 80%. Furthermore, we show that even when the data used for retraining are imperfectly annotated, the classification performance is within 2% of that of networks trained with laboriously annotated pixel‐precision data.
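
A rough sketch of the retraining procedure described in the abstract, under assumptions that are not taken from the paper: torchvision's FCN-ResNet50 stands in for the authors' segmentation network, the label set is assumed to be three classes (soil, crop, weed), and the feature extractor is frozen so that only the segmentation head is fine-tuned on data from the new crop type.

  import torch
  import torch.nn as nn
  from torchvision.models.segmentation import fcn_resnet50

  NUM_CLASSES = 3  # assumed label set: soil/background, crop, weed

  # Stand-in for a network already trained on the source crop type.
  model = fcn_resnet50(num_classes=NUM_CLASSES)

  # Freeze the feature extractor; only the segmentation head is retrained on
  # the new crop, which is what shortens retraining when transferring.
  for param in model.backbone.parameters():
      param.requires_grad = False

  optimizer = torch.optim.Adam(
      (p for p in model.parameters() if p.requires_grad), lr=1e-4
  )
  criterion = nn.CrossEntropyLoss()

  def finetune_step(images, masks):
      # images: (B, 3, H, W) float tensor of field images from the target crop
      # masks:  (B, H, W) long tensor with values in {0, 1, 2}
      model.train()
      optimizer.zero_grad()
      logits = model(images)["out"]  # (B, NUM_CLASSES, H, W)
      loss = criterion(logits, masks)
      loss.backward()
      optimizer.step()
      return loss.item()

  # Smoke test with random tensors standing in for real images and labels.
  if __name__ == "__main__":
      images = torch.rand(2, 3, 256, 256)
      masks = torch.randint(0, NUM_CLASSES, (2, 256, 256))
      print(finetune_step(images, masks))

In the partially labeled setting mentioned in the abstract, the masks passed to finetune_step would be obtained by correcting the pretrained network's own segmentation output rather than by pixel-precise manual annotation, which is what keeps the labeling effort low.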
