Automatic late blight lesion recognition and severity quantification based on field imagery of diverse potato genotypes by deep learning

Gao, Junfeng, Westergaard, Jesper Cairo, Sundmark, Ea Høegh Riis, Bagge, Merethe, Liljeroth, Erland and Alexandersson, Erik (2021) Automatic late blight lesion recognition and severity quantification based on field imagery of diverse potato genotypes by deep learning. Knowledge-Based Systems, 214, p. 106723. ISSN 0950-7015

Full content URL: https://doi.org/10.1016/j.knosys.2020.106723

Item Type: Article
Item Status: Live Archive

Abstract

The plant pathogen Phytophthora infestans causes the severe disease late blight in potato, which can result in huge yield losses in potato production. Automatic and accurate disease lesion segmentation enables fast evaluation of disease severity and assessment of disease progress. In tasks requiring computer vision, deep learning has recently gained tremendous success for image classification, object detection and semantic segmentation. To test whether we could extract late blight lesions from unstructured field environments based on high-resolution visual field images and deep learning algorithms, we collected ~500 field RGB images of a set of diverse potato genotypes with different disease severities (0%–70%), resulting in 2100 cropped images. Of these, 1600 cropped images were used as the dataset for training deep neural networks, and 250 cropped images were randomly selected as the validation dataset. Finally, the developed model was tested on the remaining 250 cropped images. The results show that the intersection over union (IoU) values for the background class (leaf and soil) and the disease lesion class in the test dataset were 0.996 and 0.386, respectively. Furthermore, we established a linear relationship (R² = 0.655) between manual visual scores of late blight and the number of lesions detected by deep learning at the canopy level. We also showed that imbalance weights for the lesion and background classes improved segmentation performance, and that fused masks based on majority voting over multiple masks enhanced the correlation with the visual disease scores. This study demonstrates the feasibility of using deep learning algorithms for disease lesion segmentation and severity evaluation based on proximal imagery, which could aid breeding for crop resistance in field environments and also benefit precision farming.
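Two of the quantities described in the abstract, the per-class IoU metric and the majority-vote fusion of multiple predicted masks, are straightforward to sketch in code. The snippet below is a minimal illustration of those two ideas only; the function names and array layout are assumptions for this sketch, not the authors' implementation, and the study's actual pipeline (network architecture, class weighting, multi-scale prediction) is not reproduced here.

```python
import numpy as np

def class_iou(pred, target, cls):
    """Intersection over union for a single class label.

    pred and target are integer label maps of the same shape,
    e.g. 0 = background (leaf and soil), 1 = lesion.
    """
    pred_mask = (pred == cls)
    target_mask = (target == cls)
    intersection = np.logical_and(pred_mask, target_mask).sum()
    union = np.logical_or(pred_mask, target_mask).sum()
    # Undefined when the class is absent from both maps
    return intersection / union if union > 0 else float("nan")

def majority_vote(masks):
    """Fuse binary lesion masks by pixel-wise majority voting.

    masks is a list of same-shape 0/1 arrays (e.g. predictions at
    multiple scales); a pixel is labelled lesion only when more than
    half of the masks agree.
    """
    stack = np.stack(masks)
    votes = stack.sum(axis=0)
    return (2 * votes > len(masks)).astype(np.uint8)
```

For example, with a prediction that marks one extra lesion pixel relative to the ground truth, `class_iou` returns 0.5 for the lesion class; and fusing three masks keeps only the pixels that at least two of them label as lesion.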

Keywords: plant disease, resistance breeding, convolutional neural network, semantic segmentation, multi-scale prediction, mask fusion, image-based crop phenotyping
Subjects: G Mathematical and Computer Sciences > G700 Artificial Intelligence
G Mathematical and Computer Sciences > G760 Machine Learning
G Mathematical and Computer Sciences > G740 Computer Vision
C Biological Sciences > C910 Applied Biological Sciences
Divisions: College of Science > Lincoln Institute for Agri-Food Technology
ID Code: 43642
Deposited On: 22 Feb 2021 15:15
