
PigPose: A Realtime Framework for Farm Animal Pose Estimation and Tracking

  • Conference paper
  • First Online:
Artificial Intelligence Applications and Innovations (AIAI 2022)

Abstract

In industrial farming, livestock well-being is becoming increasingly important. Animal breeding companies are interested in enhancing the total merit index used in breeding programs, and pig tracking and behaviour analysis play a crucial role in these programs. To this end, we propose a tracking-by-detection approach for detecting and tracking indoor farm animals over an extended period. For detection, we exploit a modified OpenPose model in which features are extracted from the input frames with an EfficientNet backbone, and the detected keypoints are associated through a greedy optimization mechanism. Additionally, an attention mechanism is incorporated into the pose estimation framework to refine the feature maps of the input frames. To track the animals over an extended period, a bipartite graph is created for every pair of consecutive frames, with the edge cost defined by the spatial distance between the animals' detected keypoints in the temporal domain. We collected and annotated a custom dataset from a pig farm to train the model; the dataset and annotations will be made publicly available to help promote research in the farming industry. The proposed method is evaluated with \(AP^{OKS}\) and \(AR^{OKS}\), and promising results are achieved.
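As a minimal sketch of the tracking step described above: animals detected in frame t are matched to detections in frame t+1 via a bipartite graph whose edge cost is the spatial distance between their keypoints. The paper associates keypoints through a greedy optimization; the sketch below substitutes SciPy's Hungarian solver (`linear_sum_assignment`) as a stand-in, and all names (`pose_distance`, `associate_frames`, `max_cost`) are hypothetical illustrations rather than the authors' implementation.

```python
# Hedged sketch (not the authors' code) of tracking-by-detection association
# between two consecutive frames. Each animal is represented by its detected
# keypoints; the bipartite edge cost is the mean spatial keypoint distance.
import numpy as np
from scipy.optimize import linear_sum_assignment


def pose_distance(kps_a: np.ndarray, kps_b: np.ndarray) -> float:
    """Mean Euclidean distance between corresponding keypoints.

    kps_a, kps_b: arrays of shape (K, 2) with (x, y) coordinates of K keypoints.
    """
    return float(np.linalg.norm(kps_a - kps_b, axis=1).mean())


def associate_frames(poses_t, poses_t1, max_cost=50.0):
    """Match animal poses in frame t to poses in frame t+1.

    poses_t, poses_t1: lists of (K, 2) keypoint arrays, one per detected animal.
    Returns (i, j) index pairs; detections left unmatched start new tracks.
    """
    if not poses_t or not poses_t1:
        return []
    # Build the bipartite cost matrix over all pairs of detections.
    cost = np.array([[pose_distance(a, b) for b in poses_t1] for a in poses_t])
    rows, cols = linear_sum_assignment(cost)  # minimum-cost matching
    # Discard matches whose cost is too high (animal entered or left the pen).
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_cost]
```

Detections in frame t+1 that remain unmatched would start new tracks, and the cost threshold guards against linking animals across implausibly large displacements.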



Acknowledgment

We would like to thank Norsvin SA for sharing the data and the Research Council of Norway for funding this study within the BIONÆR program (project numbers 282252 and 321409). In particular, we would also like to thank Rune Sagevik of Norsvin SA for the image acquisition.

Author information

Corresponding author

Correspondence to Mohib Ullah.



Copyright information

© 2022 IFIP International Federation for Information Processing

About this paper


Cite this paper

Kresovic, M., Nguyen, T., Ullah, M., Afridi, H., Cheikh, F.A. (2022). PigPose: A Realtime Framework for Farm Animal Pose Estimation and Tracking. In: Maglogiannis, I., Iliadis, L., Macintyre, J., Cortez, P. (eds) Artificial Intelligence Applications and Innovations. AIAI 2022. IFIP Advances in Information and Communication Technology, vol 646. Springer, Cham. https://doi.org/10.1007/978-3-031-08333-4_17


  • DOI: https://doi.org/10.1007/978-3-031-08333-4_17

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-08332-7

  • Online ISBN: 978-3-031-08333-4

  • eBook Packages: Computer Science, Computer Science (R0)
