ABSTRACT
Over the past two decades, affective computing has garnered considerable attention; however, affective computing using the body modality is still in its early stages. Body affect detection using 3D skeletal or motion-capture data has made progress and produced promising results, but comparable advances using RGB videos are yet to be achieved. In this paper, 2D skeletal data is extracted from RGB videos using OpenPose. Joint-location and joint-angle features computed on the MPIIEmo and GEMEP datasets are used to efficiently recognize the affective states of angry, happy, sad, and surprise.
Index Terms
- Affect recognition using simplistic 2D skeletal features from the upper body movement