A deep learning-based approach for real-time rodent detection and behaviour classification

Published in: Multimedia Tools and Applications

Abstract

Animal models, such as the Open Field Maze test, are helpful for evaluating the effects of drugs used to treat brain diseases. These tests are usually recorded on video and analysed afterwards to produce manual annotations of the rat's activity and behaviour. The videos often have to be watched repeatedly to ensure correct annotations, which makes the task tedious and highly prone to human error. Existing commercial systems for automatically detecting the rat and classifying its behaviours may be inaccessible to research teams that cannot afford the license cost. Motivated by this, we propose a methodology for simultaneous rat detection and behaviour classification using inexpensive hardware. Our proposal is a Deep Learning-based two-step methodology that simultaneously detects the rat in the test and classifies its behaviour. In the first step, a single shot detector network detects the rat; the system then crops the image using the bounding box to generate a sequence of six images that is fed into our BehavioursNet network to classify the rodent's behaviour. Finally, based on the results of these steps, the system generates an ethogram for the complete video, a trajectory plot, a heatmap of the most visited regions, and a video showing the rat's detection and its behaviours. Our results show that it is possible to perform these tasks at a processing rate of 23 Hz, with a low detection error of 6 pixels and a first approach to classifying ambiguous behaviours such as resting and grooming with an average precision of 60%, which is competitive with results reported in the literature.
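The two-step pipeline summarised above (per-frame detection with a single shot detector, cropping around the bounding box, and classifying each window of six consecutive crops with BehavioursNet) can be sketched roughly as follows. This is a minimal illustration only: the function names, dummy inputs, and data flow are assumptions based on the abstract, not the authors' released code.

```python
# Illustrative sketch of the two-step pipeline described in the abstract:
# (1) per-frame rat detection, (2) behaviour classification over a sliding
# window of six cropped frames. The detector and classifier calls below are
# hypothetical placeholders standing in for the SSD detector and BehavioursNet.
from collections import deque
import numpy as np

WINDOW = 6  # number of consecutive crops fed to the behaviour classifier


def detect_rat(frame):
    """Placeholder for the single shot detector: returns a box (x, y, w, h)."""
    return 0, 0, 64, 64  # dummy box, illustration only


def classify_behaviour(crops):
    """Placeholder for BehavioursNet: maps six crops to a behaviour label."""
    return "walking"  # dummy label, illustration only


def process_video(frames):
    """Run detection and behaviour classification over a frame sequence,
    collecting the trajectory (box centres) and a per-window ethogram."""
    window = deque(maxlen=WINDOW)
    trajectory, ethogram = [], []
    for frame in frames:
        x, y, w, h = detect_rat(frame)
        trajectory.append((x + w / 2, y + h / 2))
        window.append(frame[y:y + h, x:x + w])
        if len(window) == WINDOW:
            ethogram.append(classify_behaviour(list(window)))
    return trajectory, ethogram


if __name__ == "__main__":
    dummy_video = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(30)]
    traj, etho = process_video(dummy_video)
    print(len(traj), "positions,", len(etho), "behaviour labels")
```

The heatmap of most visited regions mentioned in the abstract could then be derived from the trajectory, for example as a 2D histogram of the bounding-box centres.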


Notes

  1. This video can be found at the following link https://drive.google.com/drive/folders/1AE7mcsj2avXcD8zJ_iCp2eR405bO9F3r?usp=sharing


Acknowledgements

The first author thanks the Consejo Nacional de Ciencia y Tecnología (CONACYT) for scholarship no. 719218.

Author information

Corresponding author

Correspondence to Jose Martinez-Carranza.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Cocoma-Ortega, J.A., Patricio, F., Limon, I.D. et al. A deep learning-based approach for real-time rodent detection and behaviour classification. Multimed Tools Appl 81, 30329–30350 (2022). https://doi.org/10.1007/s11042-022-12664-y

