Detection and infected area segmentation of apple fire blight using image processing and deep transfer learning for site-specific management

https://doi.org/10.1016/j.compag.2023.107862

Highlights

  • A Mask R-CNN based fire blight detection and segmentation system was introduced.

  • ResNet-50 and ResNet-101 backbones were used for feature extraction.

  • The transfer learning approach was used to reduce data requirement.

  • Vegetation indices were calculated to analyze the sensitivity of individual indices to apple fire blight.

  • Results suggest that the system can detect fire blight in complex orchard conditions.

Abstract

Advanced sensing technologies and deep learning models are needed for the automatic recognition of pathogens to protect trees in orchards. This study developed a fire blight disease detection and infected area segmentation system using image processing and deep learning approaches to automate the detection process in a complex apple orchard environment for site-specific management. Two types of images were acquired: multispectral images from an unmanned aerial vehicle (UAV) using a multispectral camera and red–green–blue (RGB) images from the ground using two different cameras. Multispectral images were preprocessed and used for image feature analysis by calculating vegetation indices, including excessive blue (ExB), normalized difference vegetation index (NDVI), green normalized difference vegetation index (GNDVI), red-edge normalized difference vegetation index (RENDVI), modified ratio vegetation index (RVI), and triangular blueness index (TBI). Vegetation indices were calculated from a total of 60 multispectral images (30 healthy and 30 fire blight infected). Results showed that RVI was the most sensitive to fire blight infection among the six indices. A support vector machine model was used to classify unhealthy tree canopies. A Mask Region-Convolutional Neural Network (Mask R-CNN) based deep learning model was developed from RGB images of infected trees. A total of 880 images were used for training, 220 images for validation, and another 110 images for testing the trained Mask R-CNN model. A precision of 92.8 % and a recall of 91.2 % were obtained in detecting infected canopies using a ResNet-101 backbone and an intersection over union (IoU) threshold of 0.7. The high precision demonstrates the effectiveness of Mask R-CNN for the identification and segmentation of fire blight infection in images taken under complex orchard conditions. These results demonstrate the potential of this non-invasive sensing method for detecting disease in commercial fruit production and for site-specific removal of infected canopies.
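
As a minimal illustration of the feature-analysis step, the sketch below computes the three standard normalized-difference indices named in the abstract from co-registered multispectral band arrays. The exact formulations of ExB, the modified RVI, and TBI used in the study are not reproduced in this excerpt, so only the widely published NDVI, GNDVI, and RENDVI definitions are shown; the function names and band-array inputs are assumptions for illustration.

```python
import numpy as np

def normalized_difference(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Generic normalized-difference index, (a - b) / (a + b), safe at zeros."""
    denom = a + b
    return np.divide(a - b, denom,
                     out=np.zeros_like(denom, dtype=float),
                     where=denom != 0)

def compute_indices(green, red, red_edge, nir):
    """Per-pixel index maps from float reflectance bands of one image."""
    return {
        "NDVI":   normalized_difference(nir, red),       # Tucker (1979)
        "GNDVI":  normalized_difference(nir, green),
        "RENDVI": normalized_difference(nir, red_edge),
    }
```

Mean index values over a segmented canopy region can then serve as per-tree features, consistent with the canopy-level comparison of healthy and infected trees described in the abstract.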

Introduction

Apple is one of the most valuable crops in the United States, with an annual value of approximately $2.94 billion (USDA-NASS, 2021). Fire blight is a devastating bacterial disease caused by Erwinia amylovora that can seriously damage apple trees. It infects shoots, rootstocks, flowers, and fruits, causing structural destruction, floral death, and even tree death (Norelli et al., 2003). Infections are sporadic and range in severity depending on cultivar susceptibility and environmental conditions (Norelli et al., 2003, Peil et al., 2009). Nearly all commercial apple cultivars, including Fuji, Gala, Golden Delicious, and Cripps Pink, are susceptible to this disease (Gianessi et al., 2002). Norelli et al. (2003) described a severe fire blight outbreak in southwest Michigan in which a total of 220,000 trees aged two to five years were killed and >600 acres of orchards were lost, leading to an economic loss of over $42 million. This disease causes more than $100 million in economic losses every year in the United States (Busdieker-Jesse et al., 2016). It is crucial to effectively detect symptoms and mitigate fire blight disease to avoid subsequent spread of the disease and reduce economic loss.

The symptoms of fire blight appear when temperatures rise above 65 °F with rain, heavy dew, and high humidity. The symptoms include the browning of buds, young shoots, leaves, and flowers (Gaganidze et al., 2018). Farmers usually walk or drive through orchards to visually observe fire blight symptoms/infections on trees. Management operations such as pruning can reduce the severity of fire blight by removing infected portions of trees to reduce disease spread and tree loss. The type of management operation depends on the severity or amount of area infected in the tree. However, manual monitoring of an orchard is time-consuming, and human eyes can easily overlook infected twigs in a tree canopy and cannot accurately quantify the infected areas. Therefore, it is necessary to develop an accurate and rapid fire blight disease detection and infected area segmentation technique to assist the tree fruit or apple industry in timely and site-specific disease management.

The evolution of UAVs, which have low manufacturing costs, has opened up further research opportunities. UAVs are becoming an essential part of the remote sensing toolkit, bringing new opportunities to precision farming, especially for in-field crop disease monitoring and management (Su et al., 2019). The applications of UAVs in precision farming range from sensing, planting, mapping, and spraying to irrigation scheduling (Romero et al., 2018, Kim et al., 2019). The advantage of using UAVs lies in their flexibility in acquiring ultra-high spatial and temporal resolution data at any time under suitable conditions (Ye et al., 2020a), whereas data collection using ground-based systems is time-consuming and logistically challenging in remote regions. Sensors can be mounted on UAVs to acquire crop data, process data onboard, and perform management tasks based on the processed data. Research using UAVs has shown promising potential for detecting powdery mildew on squash (Abdulridha et al., 2020), Huanglongbing on citrus (Deng et al., 2020), and flavescence dorée on grapevines (Musci et al., 2020). Studies have also shown the value of mounting a multispectral camera on a UAV for crop disease detection (Kerkech et al., 2020, Ye et al., 2020b). UAV-based disease detection has the advantage of rapid data collection, but limitations arise when disease infestations are located on the sides or bottoms of tree canopies (Fig. 1), and the image resolution may not be sufficient to quantify the infected areas. Pruning out infected canopies is currently the only way to treat fire blight disease and minimize its spread. Therefore, after UAV scouting identifies potential infection areas, a ground-based camera vision system is needed to verify fire blight infections in the identified unhealthy areas and segment the areas of infection.

Deep learning (DL) is an emerging artificial intelligence approach that allows accurate and rapid detection of objects by automatically learning low-level to high-level image features as a hierarchy of concepts (LeCun et al., 2015). Recent developments in DL algorithms have proven effective in various crop management operations, including pest recognition (Li et al., 2020) and disease detection (Joshi et al., 2021). DL algorithms have an extraordinary ability to accurately detect complex crop diseases and stresses in less time because the more complex models being used allow for massive parallelization (Ghosal et al., 2018). The DL-based image processing techniques of object detection, semantic segmentation, and instance segmentation are mainly used for crop disease detection and identification. Object detection identifies the boundary of disease present in captured images or videos. Semantic segmentation labels each pixel in the image according to its category class. Instance segmentation identifies the boundaries of diseased regions and labels their pixels by class, and is considered a high-precision technique for crop disease detection (Su et al., 2021). Mask R-CNN (Region-Convolutional Neural Network), proposed by Kaiming et al. (2018), combines detection and segmentation of diseased areas in a single framework. Several studies have demonstrated that the Mask R-CNN algorithm achieves high detection and segmentation accuracies (Jia et al., 2020, Su et al., 2021, Yang et al., 2020). The Mask R-CNN model uses different CNN architectures as a backbone, with many parameters that need to be trained. Training these CNN backbones from scratch requires a large labeled-image dataset and substantial computing resources to obtain high performance. Obtaining a large labeled-image dataset is difficult and not always possible. Deep transfer learning uses a pre-trained network in which only the parameters of the final classification layers need to be learned from scratch, which alleviates this limitation of classical DL models (Kessentini et al., 2019) and makes the approach suitable for applications with limited datasets. Hence, Mask R-CNN with a pretrained network (a deep transfer learning approach) can be useful for automatic fire blight disease detection and infected area segmentation in apple trees.
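
To make the transfer learning step concrete, the following sketch shows how a COCO-pretrained Mask R-CNN can be adapted to a two-class (background vs. fire blight) task so that only the replaced heads must be trained from scratch. This is an illustrative example using the torchvision API with a ResNet-50 FPN backbone; the study itself used the Keras/TensorFlow Mask R-CNN implementation (Abdulla, 2017) with ResNet-50 and ResNet-101 backbones, so the code below is a sketch of the approach, not the authors' implementation.

```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 2  # background + fire blight

# Load a Mask R-CNN pretrained on COCO; the backbone weights are reused.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box-classification head for the new number of classes.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# Replace the mask head so instance masks are predicted for the new classes.
in_channels = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels, 256, NUM_CLASSES)
```

Only the new heads (and optionally the upper backbone layers) are then fine-tuned on the labeled orchard images, which is what keeps the data requirement modest.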

A few attempts have been made to detect fire blight disease in pear trees (Bagheri et al., 2018, Bagheri, 2020) and apple leaves (Skoneczny et al., 2020). Bagheri et al. (2018) conducted a laboratory-based analysis in which individual leaf samples were collected and analyzed separately, although leaves appear in complex clusters under real orchard conditions. Bagheri (2020) used different indices and a traditional machine learning model (support vector machine) for fire blight detection in pear, where image features were selectively extracted; the feature selection methods did not provide higher accuracy than advanced DL algorithms. Skoneczny et al. (2020) harvested individual fire blight infected apple leaves for laboratory-based detection, which occurred under different conditions than in the field. Moreover, none of these studies was able to segment the infected area accurately, which is needed for performing site-specific management operations.

The overall objective of this study was to detect and segment areas of fire blight infection in a complex apple orchard environment for infected canopy removal. To confirm fire blight infections, we proposed a hybrid two-modality (UAV and ground) system with the aim of detecting and evaluating infections for site-specific management. The specific objectives were to 1) classify unhealthy tree canopies using a UAV-based image processing technique to identify possible locations of disease infection from overhead, and 2) detect and segment fire blight disease in ground-captured images of apple trees using the deep transfer learning-based Mask R-CNN model applied to the identified unhealthy tree canopies.

Section snippets

Image acquisition

An orchard with the Gala apple variety, located at the Penn State Fruit Research and Extension Center, Biglerville, PA, USA, was used for the experiment. Trees were trained in a conventional central leader architecture. The trees were planted in 2006 with an inter-row spacing of 5.0 m and an intra-plant spacing of 3.0 m. The average tree height was about 3.05 m with dense canopies. The images, each containing distinct visual symptoms of fire blight disease on leaves and young shoots, were captured

Vegetation indices and unhealthy tree canopy classification

The indices were designed to assess vegetation greenness, which disease infection alters in the tree canopies. The relationships between vegetation indices and fire blight infection were distinct, especially for the indices calculated using the blue channel. Six indices were calculated from the healthy and infected canopies (Fig. 10). The index values of NDVI, RENDVI, GNDVI, and TBI ranged between 0 and 1. The value of the ExB index ranged from 0 to 2 because the
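
As a minimal sketch of the unhealthy-canopy classification step named in this section's title, the snippet below trains a support vector machine on per-canopy index features, mirroring the study's use of an SVM to separate healthy from infected canopies. The feature file names, the RBF kernel, and the cross-validation setup are assumptions for illustration; the excerpt does not specify the kernel or validation scheme.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# X: one row per canopy; columns = mean ExB, NDVI, GNDVI, RENDVI, RVI, TBI.
# y: 0 = healthy, 1 = fire blight infected (30 of each in the study).
X = np.load("canopy_index_features.npy")  # hypothetical feature file
y = np.load("canopy_labels.npy")          # hypothetical label file

# Standardize features so indices on different scales contribute comparably.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))  # kernel assumed
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validation accuracy: {scores.mean():.3f}")
```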

Conclusions

This paper presented both UAV- and ground-based imaging approaches for image feature analysis and disease detection. The UAV-based imaging system provided a quick method to acquire large amounts of data from the orchard; however, the lack of fire blight infection at the tops of canopies yielded fewer images with infection, resulting in insufficient data for developing a prediction model. Therefore, the UAV-based images were only used for feature analysis. The image feature analysis indicated that

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

This study was supported in part by the United States Department of Agriculture (USDA)'s National Institute of Food and Agriculture (NIFA) Federal Appropriations under Project PEN04653 and Accession No. 1016510, a USDA NIFA Crop Protection and Pest Management Program (CPPM) competitive grant (Award No. 2019-70006-30440), a Northeast Sustainable Agriculture Research and Education (SARE) Graduate Student Grant GNE20-234-34268, and Penn State College of Agricultural Sciences (CAS) Graduate Student

References (51)

  • Y. Li et al. Crop pest recognition in natural scenes using convolutional neural networks. Comput. Electron. Agric. (2020)
  • M. Romero et al. Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Comput. Electron. Agric. (2018)
  • P. Sharma et al. Performance analysis of deep learning CNN models for disease detection in plants using image segmentation. Inf. Process. Agric. (2020)
  • J. Su et al. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. (2019)
  • C.J. Tucker. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. (1979)
  • K. Zuiderveld. Contrast limited adaptive histogram equalization. Graphics Gems (1994)
  • M. Abadi et al., 2016. TensorFlow: Large-scale machine...
  • W. Abdulla, 2017. Mask R-CNN for object detection and instance segmentation on Keras and TensorFlow....
  • U. Afzaal et al. An instance segmentation model for strawberry diseases based on Mask R-CNN. Sensors (2021)
  • H. Alshammari et al., 2022. Olive disease...
  • N. Bagheri et al. Detection of fire blight disease in pear trees by hyperspectral data. Eur. J. Remote Sens. (2018)
  • M.D. Bah et al. Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. (Basel) (2018)
  • N.L. Busdieker-Jesse et al. The economic impact of new technology adoption on the US apple industry. Agric. Resour. Econ. Rev. (2016)
  • F. Chollet, 2015. Keras: Deep learning library for Theano and TensorFlow. URL: https://keras.io/k, 7(8),...
  • X. Deng et al. Detection of citrus huanglongbing based on multi-input neural network model of UAV hyperspectral remote sensing. Remote Sens. (Basel) (2020)