
Pest species identification algorithm based on improved YOLOv4 network

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

Pest species recognition suffers from the loss of small targets, dense pest distributions, and low individual recognition rates. To improve the efficiency of pest monitoring, this paper proposes a pest species recognition algorithm, DF-YOLO, based on the YOLOv4 network. First, the DenseNet network is introduced into CSPDarknet53, the YOLOv4 backbone, to enhance the model's feature extraction capability and improve the individual recognition rate for densely distributed targets. Second, the focal loss function is used to mitigate the effect of sample imbalance on training and to optimize the mining of hard samples. The algorithm was tested on a homemade pest dataset. The results show that the method achieves a mean average precision (mAP) of 94.89%, 4.66% higher than before the improvement, with a detection speed of 18.92 frames per second, outperforming mainstream algorithms. Experiments were also conducted under different light sources, backgrounds, and densities; the method's mAP is 93.96%, a deviation within 1%, demonstrating its high robustness. This study improves the intelligence of pest control, and the proposed method identifies pest species more effectively than other methods, meeting the needs of practical use.
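The abstract names focal loss as the remedy for sample imbalance. The paper's exact loss configuration is not reproduced here, but the general form, a cross-entropy term down-weighted by a factor of (1 − p_t)^γ so that easy, well-classified examples contribute little, can be sketched as follows. The hyperparameters `alpha` and `gamma` below are illustrative defaults, not values taken from the paper.

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0, eps=1e-7):
    """Binary focal loss sketch: cross-entropy scaled by (1 - p_t)^gamma.

    p : predicted probabilities for the positive class, shape (N,)
    y : ground-truth labels in {0, 1}, shape (N,)
    """
    p = np.clip(p, eps, 1.0 - eps)
    p_t = np.where(y == 1, p, 1.0 - p)            # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)  # class-balancing weight
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))
```

With γ = 2, a confidently correct prediction (p_t = 0.9) is penalized roughly 500 times less than a hard one (p_t = 0.3), which is what lets training focus on the difficult, densely packed pest instances.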


Data availability

Our homemade dataset contains 1360 original pest images, augmented to a total of 15,300 images, divided into training, validation, and test sets in an 8:1:1 ratio. The datasets used in the current study can be obtained from the corresponding author upon reasonable request.
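The 8:1:1 train/validation/test partition described above can be reproduced with a simple shuffled split. The function below is a generic sketch (the paper does not specify its splitting procedure or random seed); the seed is an assumption added for reproducibility.

```python
import random

def split_dataset(items, ratios=(8, 1, 1), seed=42):
    """Shuffle items and split them into train/val/test by integer ratios."""
    rng = random.Random(seed)           # fixed seed for a reproducible split
    items = list(items)
    rng.shuffle(items)
    total = sum(ratios)
    n = len(items)
    n_train = n * ratios[0] // total
    n_val = n * ratios[1] // total
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])
```

Applied to the 15,300 augmented images, this yields 12,240 training, 1530 validation, and 1530 test images.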


Funding

This work was supported by the Program for Innovative Research Team in University of Tianjin (No. TD13-5036) and the Tianjin Science and Technology Popularization Project (No. 22KPXMRC00090).

Author information

Authors and Affiliations

Authors

Contributions

LS and ML completed the main manuscript text and experiments. SL and HW prepared Table 2. JL prepared Table 3. All authors reviewed the manuscript.

Corresponding author

Correspondence to Limei Song.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Song, L., Liu, M., Liu, S. et al. Pest species identification algorithm based on improved YOLOv4 network. SIViP 17, 3127–3134 (2023). https://doi.org/10.1007/s11760-023-02534-x

