
Deep Learning Embedded into Smart Traps for Fruit Insect Pests Detection

Published: 09 November 2022

Abstract

This article presents a novel approach to identifying two species of fruit insect pests as part of a network of intelligent traps designed to monitor the population of these insects in a plantation. The proposed approach uses a simple digital image processing technique to detect regions of the image that are likely to contain the monitored pests, and an artificial neural network to assign each region to the correct class given its characteristics. The identification is performed essentially by a Convolutional Neural Network (CNN), which learns the characteristics of the insects from images of the adhesive floor inside a trap. We trained several CNN architectures, in different configurations, on a data set of images collected in the field, aiming to find the model with the highest precision and the lowest classification time. The best classification performance was achieved by ResNet18, with precisions of 93.55% and 91.28% for the two pests studied here, Ceratitis capitata and Grapholita molesta, respectively, and an overall accuracy of 90.72%. Because the classifier must be embedded on a resource-constrained system inside the trap, we also exploited the SqueezeNet, MobileNet, and MNASNet architectures in search of a model with lower inference time and only small accuracy losses compared to the models we assessed. In addition, we quantized our highest-precision model to further reduce inference time on the embedded system; the quantized model achieved precisions of 88.76% and 89.73% for C. capitata and G. molesta, respectively, at the cost of a decrease of roughly 2% in overall accuracy. According to the expertise of our partner company, these results are worthwhile for a real-world application, since human laborers generally achieve a precision of about 85%.
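The classify-then-quantize pipeline the abstract describes can be sketched with PyTorch's post-training quantization API. This is a minimal illustration, not the article's actual model: the tiny `PatchClassifier` network and the three-class label set (C. capitata, G. molesta, other) are assumptions standing in for the ResNet18 and mobile architectures the authors trained, and dynamic quantization here covers only the final linear layer, whereas a full int8 deployment would use static quantization of the convolutional layers as well.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the trained classifier (the article used
# ResNet18 and mobile-oriented architectures such as MobileNet).
class PatchClassifier(nn.Module):
    def __init__(self, n_classes: int = 3):  # assumed classes: C. capitata, G. molesta, other
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = PatchClassifier().eval()

# Post-training dynamic quantization: weights are stored as int8, which
# shrinks the model and can reduce CPU inference time on embedded
# hardware. Only nn.Linear modules are converted by this call.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# One candidate region cropped from the trap image by the detection stage.
patch = torch.rand(1, 3, 64, 64)
with torch.no_grad():
    probs = torch.softmax(quantized(patch), dim=1)
print(probs.shape)  # torch.Size([1, 3])
```

The trade-off the abstract reports (roughly 2% overall accuracy lost for faster embedded inference) is typical of such post-training quantization without retraining.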




    Published In

    ACM Transactions on Intelligent Systems and Technology  Volume 14, Issue 1
    February 2023
    487 pages
    ISSN:2157-6904
    EISSN:2157-6912
    DOI:10.1145/3570136
Editor: Huan Liu

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 09 November 2022
    Online AM: 30 July 2022
    Accepted: 19 July 2022
    Revised: 08 April 2022
    Received: 23 July 2021
    Published in TIST Volume 14, Issue 1


    Author Tags

    1. Intelligent traps
    2. deep learning
    3. model compression
    4. integrated pest management
    5. pests

    Qualifiers

    • Research-article
    • Refereed

    Funding Sources

    • CAPES, Finance Code 001
    • Google LARA 2018
    • CNPq (DT-II)


    Cited By

• (2025) Development of an autonomous smart trap for precision monitoring of hematophagous flies on cattle. Smart Agricultural Technology, 100842. DOI: 10.1016/j.atech.2025.100842. Online publication date: Feb 2025.
• (2024) Enhancing fruit fly detection in complex backgrounds using Transformer architecture with step attention mechanism. Agriculture 14, 3 (2024), 490. DOI: 10.3390/agriculture14030490. Online publication date: 18 Mar 2024.
• (2024) Insect pest trap development and DL-based pest detection: A comprehensive review. IEEE Transactions on AgriFood Electronics 2, 2 (2024), 323–334. DOI: 10.1109/TAFE.2024.3436470. Online publication date: Sep 2024.
• (2024) Comparison of ethanol-baited trap designs for ambrosia beetles in orchards in the eastern United States. Journal of Economic Entomology 117, 4 (2024), 1476–1484. DOI: 10.1093/jee/toae145. Online publication date: 28 Jun 2024.
• (2024) Field detection of pests based on adaptive feature fusion and evolutionary neural architecture search. Computers and Electronics in Agriculture 221 (2024), 108936. DOI: 10.1016/j.compag.2024.108936. Online publication date: Jun 2024.
• (2023) Comparative study of camera- and sensor-based traps for insect pest monitoring applications. In 2023 IEEE Conference on AgriFood Electronics (CAFE), 55–59. DOI: 10.1109/CAFE58535.2023.10291672. Online publication date: 25 Sep 2023.
