SIFT-Guided Saliency-Based Augmentation for Weed Detection in Grassland Images: Fusing Classic Computer Vision with Deep Learning

Conference paper in Computer Vision Systems (ICVS 2023)

Abstract

Weed detection is a challenging object detection task, as weed targets generally do not stand out from the background in terms of color. This paper investigates how the density of structural features can be used to assist the training process of a deep-learning-based object detector. SIFT keypoint density is used to create overlay masks that augment images, emphasizing low-density areas, which typically correspond to weed plants. Our method improves detection \(mAP_{.5:.05:.95}\) of the YOLOR-CSP detector by up to 0.0215.
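The idea outlined in the abstract can be pictured with a short sketch: detect SIFT keypoints, turn their spatial density into a smooth map, and use the inverted (low-density) regions as an overlay mask for augmentation. The Python/OpenCV code below is an illustrative assumption of one way to do this, not the authors' implementation; the function name sift_density_overlay and the parameters cell, blur_sigma, and alpha are hypothetical.

```python
# Illustrative sketch only (not the paper's implementation): build a SIFT
# keypoint density map and emphasize low-density regions, which the paper
# associates with weed plants, via a simple additive overlay.
import cv2
import numpy as np

def sift_density_overlay(image_bgr, cell=32, blur_sigma=15, alpha=0.4):
    """Brighten regions of the image where SIFT keypoint density is low."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    keypoints = sift.detect(gray, None)

    # Count keypoints on a coarse grid, then upsample and smooth into a
    # continuous density map.
    h, w = gray.shape
    grid = np.zeros((h // cell + 1, w // cell + 1), dtype=np.float32)
    for kp in keypoints:
        x, y = kp.pt
        grid[int(y) // cell, int(x) // cell] += 1.0
    density = cv2.resize(grid, (w, h), interpolation=cv2.INTER_LINEAR)
    density = cv2.GaussianBlur(density, (0, 0), blur_sigma)

    # Normalize to [0, 1] and invert: low keypoint density -> high mask weight.
    span = density.max() - density.min()
    mask = 1.0 - (density - density.min()) / (span + 1e-8)

    # Blend the mask onto the image to emphasize low-density (candidate weed)
    # areas; alpha controls the overlay strength.
    overlay = image_bgr.astype(np.float32) + alpha * 255.0 * mask[..., None]
    return np.clip(overlay, 0, 255).astype(np.uint8)
```

In a training pipeline, such a mask would typically be applied stochastically, like other photometric augmentations, so the detector sees both original and density-emphasized views of each grassland image.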

Notes

  1. https://github.com/WongKinYiu/yolor/blob/main/data/hyp.scratch.640.yaml.

Acknowledgements

This work has been supported by the European Commission and European GNSS Agency through the project “Galileo-assisted robot to tackle the weed Rumex obtusifolius and increase the profitability and sustainability of dairy farming (GALIRUMI)”, H2020-SPACE-EGNSS-2019-870258.

Author information

Correspondence to Patrick Schmidt.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Schmidt, P., Güldenring, R., Nalpantidis, L. (2023). SIFT-Guided Saliency-Based Augmentation for Weed Detection in Grassland Images: Fusing Classic Computer Vision with Deep Learning. In: Christensen, H.I., Corke, P., Detry, R., Weibel, JB., Vincze, M. (eds) Computer Vision Systems. ICVS 2023. Lecture Notes in Computer Science, vol 14253. Springer, Cham. https://doi.org/10.1007/978-3-031-44137-0_12

  • DOI: https://doi.org/10.1007/978-3-031-44137-0_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-44136-3

  • Online ISBN: 978-3-031-44137-0

  • eBook Packages: Computer Science, Computer Science (R0)
