Detection and infected area segmentation of apple fire blight using image processing and deep transfer learning for site-specific management
Introduction
Apple is one of the most valuable crops in the United States, with an annual value of approximately $2.94 billion (USDA-NASS, 2021). Fire blight is a devastating bacterial disease caused by Erwinia amylovora that can seriously damage apple trees. It infects shoots, rootstocks, flowers, and fruits, causing structural damage, floral death, and even whole-tree death (Norelli et al., 2003). Infections are sporadic and vary in severity depending on cultivar susceptibility and environmental conditions (Norelli et al., 2003, Peil et al., 2009). Nearly all commercial apple cultivars, including Fuji, Gala, Golden Delicious, and Cripps Pink, are susceptible to this disease (Gianessi et al., 2002). Norelli et al. (2003) described a severe fire blight outbreak in southwest Michigan in which 220,000 trees aged two to five years were killed and more than 600 acres of orchards were lost, leading to an economic loss of over $42 million. The disease causes more than $100 million in economic losses every year in the United States (Busdieker-Jesse et al., 2016). It is therefore crucial to detect symptoms effectively and mitigate fire blight to avoid subsequent spread of the disease and reduce economic losses.
The symptoms of fire blight appear when temperatures rise above 65 °F with rain, heavy dew, and high humidity, and include browning of buds, young shoots, leaves, and flowers (Gaganidze et al., 2018). Growers usually walk or drive through orchards to visually inspect trees for fire blight symptoms and infections. Management operations such as pruning can reduce the severity of fire blight by removing infected portions of trees, limiting disease spread and tree loss. The appropriate management operation depends on the severity, or the amount of infected area, in the tree. However, manual monitoring of an orchard is time-consuming; the human eye can easily overlook infected twigs in a tree canopy and cannot accurately quantify infected areas. Therefore, it is necessary to develop an accurate and rapid technique for fire blight detection and infected-area segmentation to assist the tree fruit industry in timely, site-specific disease management.
The evolution of low-cost UAVs has opened up further research opportunities. UAVs are becoming an essential part of the remote sensing toolkit, bringing new opportunities to precision farming, especially for in-field crop disease monitoring and management (Su et al., 2019). Applications of UAVs in precision farming range from sensing, planting, mapping, and spraying to irrigation scheduling (Romero et al., 2018, Kim et al., 2019). The advantage of UAVs lies in their flexibility in acquiring ultra-high spatial and temporal resolution data at any time under suitable conditions (Ye et al., 2020a), whereas data collection using ground-based systems is time-consuming and logistically challenging in remote regions. Sensors can be mounted on UAVs to acquire crop data, process the data onboard, and perform management tasks based on the processed data. Research using UAVs has shown promising potential for detecting powdery mildew on squash (Abdulridha et al., 2020), Huanglongbing on citrus (Deng et al., 2020), and flavescence dorée on grapevines (Musci et al., 2020). Studies have also shown the value of mounting multispectral camera sensors on UAVs for crop disease detection (Kerkech et al., 2020, Ye et al., 2020b). UAV-based disease detection has the advantage of rapid data collection, but it is limited when infections are located on the sides or bottoms of tree canopies (Fig. 1), and the image resolution may not be sufficient to quantify infected areas. Pruning out infected canopies is currently the only way to treat fire blight and minimize its spread. Therefore, after UAV scouting identifies potential infection areas, a ground-based camera vision system is needed to verify fire blight infections in the identified unhealthy areas and segment the infected regions.
Deep learning (DL) is an emerging artificial intelligence approach that allows accurate and rapid detection of objects by automatically learning low-level to high-level image features as a hierarchy of concepts (LeCun et al., 2015). Recent DL algorithms have proven effective in various crop management operations, including pest recognition (Li et al., 2020) and disease detection (Joshi et al., 2021). DL algorithms can accurately detect complex crop diseases and stresses in less time because the more complex models used allow for massive parallelization (Ghosal et al., 2018). Three DL-based image processing techniques are mainly used for crop disease detection and identification: object detection, semantic segmentation, and instance segmentation. Object detection identifies the boundary of disease present in captured images or videos. Semantic segmentation labels each pixel in the image according to its category class. Instance segmentation both identifies disease boundaries and labels their pixels, and is considered a high-precision technique for crop disease detection (Su et al., 2021). Mask R-CNN (Region-based Convolutional Neural Network), proposed by Kaiming et al. (2018), combines detection and segmentation of diseased areas in a single framework. Considerable studies have demonstrated that the Mask R-CNN algorithm achieves high detection and segmentation accuracies (Jia et al., 2020, Su et al., 2021, Yang et al., 2020). The Mask R-CNN model uses different CNN architectures as a backbone, with many parameters that need to be trained. Training these CNN backbones from scratch requires a large labeled-image dataset and substantial computing resources to obtain high performance, and obtaining such a dataset is difficult and not always possible.
Deep transfer learning uses a pre-trained network in which only the parameters of the final classification layers need to be trained from scratch, which alleviates this limitation of classical DL models (Kessentini et al., 2019) and makes it suitable for applications with a limited dataset. Hence, Mask R-CNN with a pretrained backbone (a deep transfer learning approach) can be useful for automatic fire blight detection and infected-area segmentation in apple trees.
A few attempts have been made to detect fire blight in pear trees (Bagheri et al., 2018, Bagheri, 2020) and apple leaves (Skoneczny et al., 2020). Bagheri et al. (2018) conducted a laboratory-based analysis in which individual leaf samples were collected and analyzed separately, whereas leaves occur in complex clusters under real orchard conditions. Bagheri (2020) used different indices and a traditional machine learning model (a support vector machine) for fire blight detection in pear, with selectively extracted image features; such feature selection methods have not provided higher accuracy than advanced DL algorithms. Skoneczny et al. (2020) harvested individual fire blight-infected apple leaves for laboratory-based detection, under conditions different from those in the field. Moreover, none of these studies segmented the infected area accurately, which is needed for performing site-specific management operations.
The overall objective of this study was to detect and segment areas of fire blight infection in a complex apple orchard environment for infected canopy removal. To confirm fire blight infection, we proposed a hybrid two-modality (UAV and ground) system aimed at detecting and evaluating infections for site-specific management. The specific objectives were to 1) classify unhealthy tree canopies using a UAV-based image processing technique to identify possible locations of disease infection from overhead, and 2) detect and segment fire blight disease in ground-captured images of apple trees from the identified unhealthy canopies using a deep transfer learning-based Mask R-CNN model.
Section snippets
Image acquisition
An orchard of the Gala apple variety, located at the Penn State Fruit Research and Extension Center, Biglerville, PA, USA, was used for the experiment. Trees were trained in a conventional central leader architecture. The trees were planted in 2006 with an inter-row spacing of 5.0 m and intra-plant spacing of 3.0 m. The average tree height was about 3.05 m, with dense canopies. The images, each containing distinct visual symptoms of fire blight disease on leaves and young shoots, were captured
Vegetation indices and unhealthy tree canopy classification
The indices were designed to assess vegetation greenness, as disease infection changes the amount of greenness in the tree canopies. The relationships between vegetation indices and fire blight infection were distinctive, especially for the indices calculated using the blue channel. Six indices were calculated from the healthy and infected canopies (Fig. 10). The values of NDVI, RENDVI, GNDVI, and TBI ranged between 0 and 1. The value of the ExB index ranged from 0 to 2 because the
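The normalized-difference indices named above (NDVI, GNDVI, RENDVI) all share the form (a − b)/(a + b), which is why their values are bounded. A minimal sketch in Python/NumPy, assuming reflectance bands are supplied as arrays; the TBI and ExB formulations are not reproduced here because their exact definitions are not given in this snippet:

```python
import numpy as np

def normalized_difference(a, b, eps=1e-9):
    """Generic normalized-difference index: (a - b) / (a + b)."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return (a - b) / (a + b + eps)  # eps avoids division by zero on dark pixels

def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return normalized_difference(nir, red)

def gndvi(nir, green):
    # Green NDVI: substitutes the green band for red
    return normalized_difference(nir, green)

def rendvi(nir, red_edge):
    # Red-Edge NDVI: substitutes the red-edge band for red
    return normalized_difference(nir, red_edge)
```

Healthy canopy pixels reflect strongly in the near-infrared relative to red, pushing NDVI toward 1; browning infected tissue reflects more red, pulling the index toward 0, which is the contrast these indices exploit.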
Conclusions
This paper presented both UAV- and ground-based imaging approaches for image feature analysis and disease detection. The UAV-based imaging system provided a quick method to acquire large amounts of data from the orchard; however, the lack of fire blight infection at the top of the canopies yielded few images with infection, resulting in insufficient data for developing a prediction model. Therefore, the UAV-based images were only used for feature analysis. The image feature analysis indicated that
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgements
This study was supported in part by the United States Department of Agriculture (USDA)'s National Institute of Food and Agriculture (NIFA) Federal Appropriations under Project PEN04653 and Accession No. 1016510, a USDA NIFA Crop Protection and Pest Management Program (CPPM) competitive grant (Award No. 2019-70006-30440), a Northeast Sustainable Agriculture Research and Education (SARE) Graduate Student Grant GNE20-234-34268, and Penn State College of Agriculture Sciences (CAS) Graduate Student
References (51)
- et al. Detecting powdery mildew disease in squash at different stages using UAV-based hyperspectral imaging and artificial intelligence. Biosyst. Eng. (2020)
- Application of aerial remote sensing technology for detection of fire blight infected pear trees. Comput. Electron. Agric. (2020)
- et al. Apple powdery mildew infestation detection and mapping using high-resolution visible and multispectral aerial imaging technique. Sci. Hortic. (2021)
- et al. Fire blight in Georgia. Ann. Agrar. Sci. (2018)
- et al. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. (1996)
- et al. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. (2002)
- et al. Detection and segmentation of overlapped fruits based on optimized Mask R-CNN application in apple harvesting robot. Comput. Electron. Agric. (2020)
- et al. VirLeafNet: automatic analysis and viral disease diagnosis using deep-learning in Vigna mungo plant. Eco. Inform. (2021)
- et al. Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach. Comput. Electron. Agric. (2020)
- et al. A two-stage deep neural network for multi-norm license plate detection and recognition. Expert Syst. Appl. (2019)
- Crop pest recognition in natural scenes using convolutional neural networks. Comput. Electron. Agric.
- Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Comput. Electron. Agric.
- Performance analysis of deep learning CNN models for disease detection in plants using image segmentation. Inf. Process. Agric.
- Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric.
- Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ.
- Contrast limited adaptive histogram equalization. Graphics Gems
- An instance segmentation model for strawberry diseases based on Mask R-CNN. Sensors
- Detection of fire blight disease in pear trees by hyperspectral data. Eur. J. Remote Sens.
- Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. (Basel)
- The economic impact of new technology adoption on the US apple industry. Agric. Resour. Econ. Rev.
- Detection of citrus huanglongbing based on multi-input neural network model of UAV hyperspectral remote sensing. Remote Sens. (Basel)