DOI: 10.1145/3582099.3582115
AICCC '22 Conference Proceedings
Research article

Affect recognition using simplistic 2D skeletal features from the upper body movement

Published: 20 April 2023

ABSTRACT

Over the past two decades, affective computing has garnered considerable attention; however, affect recognition from the body modality is still in its early stages. Body affect detection using 3D skeletal or motion capture data has made some progress and produced promising results, but comparable advances using plain RGB video have yet to be achieved. In this paper, 2D skeletal data are extracted from RGB videos using OpenPose, and simple joint-location and joint-angle features of the upper body, computed on the MPIIEmo and GEMEP datasets, are used to efficiently recognize the affective states of anger, happiness, sadness, and surprise.
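To make the feature extraction concrete, the sketch below shows how per-frame joint-location and joint-angle features could be derived from OpenPose output. This is not the authors' exact pipeline: the choice of joints, the elbow/shoulder angle definitions, and the neck-centered normalization are illustrative assumptions. It only relies on OpenPose's documented behavior of writing per-frame JSON (via --write_json) whose "pose_keypoints_2d" field is a flat [x, y, confidence, ...] array in the BODY_25 layout.

```python
# Hedged sketch: upper-body joint-location and joint-angle features
# from OpenPose BODY_25 2D keypoints. Feature definitions are assumptions,
# not the paper's specification.
import json
import numpy as np

# BODY_25 indices for the upper-body joints used in this sketch
NECK, R_SHOULDER, R_ELBOW, R_WRIST = 1, 2, 3, 4
L_SHOULDER, L_ELBOW, L_WRIST, MID_HIP = 5, 6, 7, 8

def load_keypoints(json_path):
    """Return a (25, 3) array of (x, y, confidence) for the first detected person."""
    with open(json_path) as f:
        data = json.load(f)
    flat = data["people"][0]["pose_keypoints_2d"]
    return np.asarray(flat, dtype=float).reshape(-1, 3)

def joint_angle(a, b, c):
    """Interior angle (radians) at joint b between segments b->a and b->c."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def upper_body_features(kp):
    """Concatenate normalized joint locations with elbow and shoulder angles."""
    xy = kp[:, :2]
    # Translate to the neck and scale by the neck-to-mid-hip distance so the
    # location features are roughly invariant to subject position and size.
    scale = np.linalg.norm(xy[NECK] - xy[MID_HIP]) + 1e-8
    joints = [R_SHOULDER, R_ELBOW, R_WRIST, L_SHOULDER, L_ELBOW, L_WRIST]
    loc = ((xy[joints] - xy[NECK]) / scale).ravel()
    angles = [
        joint_angle(xy[R_SHOULDER], xy[R_ELBOW], xy[R_WRIST]),  # right elbow
        joint_angle(xy[L_SHOULDER], xy[L_ELBOW], xy[L_WRIST]),  # left elbow
        joint_angle(xy[NECK], xy[R_SHOULDER], xy[R_ELBOW]),     # right shoulder
        joint_angle(xy[NECK], xy[L_SHOULDER], xy[L_ELBOW]),     # left shoulder
    ]
    return np.concatenate([loc, angles])
```

Per-frame vectors like these would then be aggregated over a clip and fed to a classifier to predict the four affective states; the abstract does not specify the paper's actual classifier or aggregation scheme.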



Published in

AICCC '22: Proceedings of the 2022 5th Artificial Intelligence and Cloud Computing Conference
December 2022, 302 pages
ISBN: 9781450398749
DOI: 10.1145/3582099

Copyright © 2022 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States


