Abstract
Affective movement will likely be an important component of robotic interaction as robots move into human-facing scenarios, where humans (consciously or unconsciously) monitor the motion profiles of their counterparts in order to make judgments about those counterparts' states. Many current studies in affective movement recognition and generation seek either to improve a machine's ability to correctly identify human affect or to identify and create components of robotic movement that enhance human perception. However, very few of these studies investigate the influence of environmental context on a machine's ability to correctly identify human affect or on a human's ability to correctly identify the affective intent of a robot. This paper presents the results of a user study that investigated how human perception of stylized walking sequences (created in [1]) varied with the environment in which they were portrayed. The results show that environmental context can impact a person's ability to correctly perceive the intended style of a movement.
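The paper does not include its analysis code; as a rough, illustrative sketch of how a dependence of recognition rate on environment might be quantified, the snippet below compares per-environment recognition rates with a chi-squared test of independence. The counts, environment labels, and test choice are assumptions for illustration only, not the authors' method or data.

```python
# Minimal sketch (hypothetical data): compare style-recognition rates across
# viewing environments with a chi-squared test of independence.
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = environments,
# columns = (correct, incorrect) identifications of the intended movement style.
counts = np.array([
    [42, 18],   # e.g., neutral / context-free background
    [30, 30],   # e.g., environment congruent with a different style
    [25, 35],   # e.g., environment suggesting an opposing affect
])

# Per-environment recognition rate = correct / total responses in that environment.
recognition_rates = counts[:, 0] / counts.sum(axis=1)
print("Per-environment recognition rates:", recognition_rates)

# Test whether recognition accuracy is independent of environment.
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value would indicate that recognition rate depends on the
# environment, which is the kind of effect the study reports.
```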
References
Etemad, S.A., Arya, A.: Expert-driven perceptual features for modeling style and affect in human motion. IEEE Trans. Hum. Mach. Syst. 46(4), 534–545 (2016)
Kleinsmith, A., Bianchi-Berthouze, N.: Affective body expression perception and recognition: a survey. IEEE Trans. Affect. Comput. 4(1), 15–33 (2013)
Russell, J.A., Fehr, B.: Relativity in the perception of emotion in facial expressions. J. Exp. Psychol. Gen. 116(3), 223 (1987)
Zacharatos, H., Gatzoulis, C., Chrysanthou, Y.L.: Automatic emotion recognition based on body movement analysis: a survey. IEEE Comput. Graph. Appl. 34(6), 35–45 (2014)
Zeng, Z., Pantic, M., Roisman, G.I., Huang, T.S.: A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell. 31(1), 39–58 (2009)
Bernhardt, D., Robinson, P.: Detecting affect from non-stylised body motions. In: Paiva, A.C.R., Prada, R., Picard, R.W. (eds.) ACII 2007. LNCS, vol. 4738, pp. 59–70. Springer, Heidelberg (2007). doi:10.1007/978-3-540-74889-2_6
Bernhardt, D., Robinson, P.: Detecting emotions from connected action sequences. In: Badioze Zaman, H., Robinson, P., Petrou, M., Olivier, P., Schröder, H., Shih, T.K. (eds.) IVIC 2009. LNCS, vol. 5857, pp. 1–11. Springer, Heidelberg (2009). doi:10.1007/978-3-642-05036-7_1
Etemad, S.A., Arya, A.: Modeling and transformation of 3D human motion. In: GRAPP, pp. 307–315 (2010)
Ribeiro, T., Paiva, A.: The illusion of robotic life: principles and practices of animation for robots. In: 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 383–390. IEEE (2012)
Van Breemen, A.: Bringing robots to life: applying principles of animation to robots. In: Proceedings of Shaping Human-Robot Interaction Workshop Held at CHI 2004, pp. 143–144 (2004)
LaViers, A., Egerstedt, M.: Controls and Art: Inquiries at the Intersection of the Subjective and the Objective. Springer, Cham (2014). doi:10.1007/978-3-319-03904-6_1
Harmon-Jones, E., Gable, P.A., Price, T.F.: Does negative affect always narrow and positive affect always broaden the mind? Considering the influence of motivational intensity on cognitive scope. Curr. Dir. Psychol. Sci. 22(4), 301–307 (2013)
Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980)
Paltoglou, G., Thelwall, M.: Seeing stars of valence and arousal in blog posts. IEEE Trans. Affect. Comput. 4(1), 116–123 (2013)
Kurdi, B., Lozano, S., Banaji, M.R.: Introducing the open affective standardized image set (OASIS). Behav. Res. Methods 49(2), 457–470 (2017)
Dan-Glauser, E.S., Scherer, K.R.: The Geneva affective picture database (GAPED): a new 730-picture database focusing on valence and normative significance. Behav. Res. Methods 43(2), 468 (2011)
Barsade, S.G., Gibson, D.E.: Why does affect matter in organizations? Acad. Manag. Perspect. 21(1), 36–59 (2007)
Acknowledgments
This work was supported by the Mechanical Science and Engineering Department and NSF grants #1701295 and #1528036.