Abstract
Computational Ethology provides automated and precise measurement of animal behavior. Artificial Intelligence (AI) techniques have introduced enhanced capabilities to interpret experimental data, extracting accurate ethograms that allow the comparison of animal models with high discriminative power. In this short review we introduce the most recent software tools that employ AI, including the popular deep learning approaches, for this endeavor.
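As a concrete illustration of the deep learning tools surveyed here, the sketch below outlines a typical markerless pose-estimation workflow with DeepLabCut, one of the most widely used packages in this family. Function names follow DeepLabCut's documented API; the project name, video paths, and parameter values are illustrative assumptions, not material from the review itself.

```python
# Illustrative sketch (not from the reviewed paper): a typical DeepLabCut
# markerless pose-estimation workflow. Project name and video paths are
# hypothetical placeholders.
import deeplabcut

# Create a project from an experiment video; returns the path to config.yaml
config = deeplabcut.create_new_project(
    "open-field", "lab", ["videos/mouse_trial01.mp4"], copy_videos=True
)

# Extract a small set of frames and label body parts by hand (opens a GUI)
deeplabcut.extract_frames(config)
deeplabcut.label_frames(config)

# Build the training set, then train and evaluate the deep network
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)
deeplabcut.evaluate_network(config)

# Apply the trained network to new videos; the resulting keypoint
# trajectories can feed downstream behavior classifiers
deeplabcut.analyze_videos(config, ["videos/mouse_trial02.mp4"])
```

Tools in this family, such as B-SOiD or DeepEthogram, then cluster or classify the resulting keypoint trajectories (or raw video frames) into discrete behavioral labels, i.e. an ethogram.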
Acknowledgments
This work has been partially supported by FEDER funds through MINECO project TIN2017-85827-P, and by Basque Government grant IT1284-19 to a university research group of excellence.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Aguilar-Moreno, M., Graña, M. (2023). Computational Ethology: Short Review of Current Sensors and Artificial Intelligence Based Methods. In: Iliadis, L., Maglogiannis, I., Alonso, S., Jayne, C., Pimenidis, E. (eds) Engineering Applications of Neural Networks. EANN 2023. Communications in Computer and Information Science, vol 1826. Springer, Cham. https://doi.org/10.1007/978-3-031-34204-2_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-34203-5
Online ISBN: 978-3-031-34204-2