Selecting Informative Samples for Animal Recognition in the Wildlife

  • Conference paper
  • In: Intelligent Decision Technologies 2019

Abstract

Observations of wildlife using camera traps are very useful in ecological, conservation, and behavioral research on animals and birds. However, a large proportion of the recorded images do not contain the objects of interest, and manual removal of such images is a highly laborious and time-consuming process. We suggest the automatic selection of relevant images in order to prepare informative samples for subsequent animal recognition, as well as a set of representative images for detailed manual analysis when necessary. In this research, we propose two methods based on a background model constructed “on the fly” and a Gaussian mixture model. Images distorted by visual artifacts are removed in a preliminary step. The experiments were conducted on 30,000 images captured by camera traps in Ergaki National Park, Krasnoyarsky Kray, Russia, in 2012–2018. The best accuracy in selecting informative samples reached 96% with respect to human estimates.
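
The full selection methods are described in the paper itself; as a rough, non-authoritative illustration of the general idea, the sketch below uses OpenCV's Gaussian-mixture background subtractor (MOG2) to keep only those images whose foreground coverage exceeds a threshold, assuming an ordered sequence of images from a single camera site. The folder name, the 0.5% threshold, and the morphological clean-up step are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch (not the authors' implementation): flag camera-trap images as
# "informative" when a Gaussian-mixture background model detects enough foreground.
# Assumes OpenCV (cv2) and an ordered sequence of frames from one camera site.
import cv2
import glob

def select_informative(image_paths, fg_ratio_threshold=0.005):
    # MOG2 is OpenCV's Gaussian-mixture background subtractor; the model is
    # updated "on the fly" as frames from the same camera are fed in order.
    subtractor = cv2.createBackgroundSubtractorMOG2(
        history=200, varThreshold=25, detectShadows=False)
    informative = []
    for path in image_paths:
        frame = cv2.imread(path)
        if frame is None:          # skip unreadable or corrupted files
            continue
        mask = subtractor.apply(frame)
        # Suppress small noise blobs before measuring foreground coverage.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        fg_ratio = cv2.countNonZero(mask) / float(mask.size)
        if fg_ratio > fg_ratio_threshold:   # illustrative threshold (0.5%)
            informative.append(path)
    return informative

if __name__ == "__main__":
    frames = sorted(glob.glob("camera_site_01/*.jpg"))  # hypothetical folder
    print(len(select_informative(frames)), "candidate images kept")
```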



Acknowledgements

The reported study was funded by the Russian Foundation for Basic Research, the Government of Krasnoyarsk Territory, and the Krasnoyarsk Regional Fund of Science within research project No. 18-47-240001.

Author information

Corresponding author

Correspondence to Margarita Favorskaya.


Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Favorskaya, M., Buryachenko, V. (2019). Selecting Informative Samples for Animal Recognition in the Wildlife. In: Czarnowski, I., Howlett, R., Jain, L. (eds) Intelligent Decision Technologies 2019. Smart Innovation, Systems and Technologies, vol 143. Springer, Singapore. https://doi.org/10.1007/978-981-13-8303-8_6
