Modeling the Interactions of Context and Style on Affect in Motion Perception: Stylized Gaits Across Multiple Environmental Contexts

International Journal of Social Robotics

Abstract

As more and more robots move into social settings, humans will monitor the external motion profiles of these robotic counterparts in order to judge their internal states. This makes generating motion with an understanding of how humans will interpret it paramount. This paper investigates the connection between environmental context, stylized gaits, and perception via a model of affect parameterized by valence and arousal. The proposed predictive model indicates that, for the motion stimuli used, environmental context has a larger influence on valence, while style of walking has a larger influence on arousal. This work expands on previous research in affect recognition by exploring the critical relationship between environmental context, stylized gait, and affective perception. The results indicate that the social behavior of robots may be informed by environmental context for improved performance.
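The abstract's central claim, that context drives valence while gait style drives arousal, lends itself to a factorial analysis. Below is a minimal sketch, not the authors' actual model: it fits an ordinary-least-squares model of each affect dimension on two categorical factors and compares per-factor variance via an ANOVA table. The column names, factor levels, and rating values are hypothetical placeholders.

```python
# A minimal sketch (assumed, not the authors' implementation) of testing
# whether environmental context or gait style explains more variance in
# valence and arousal ratings. All data below are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per rating of a stimulus pairing a context with a stylized gait.
ratings = pd.DataFrame({
    "context": ["park", "park", "alley", "alley", "office", "office"] * 2,
    "style":   ["bouncy", "slow"] * 6,
    "valence": [0.7, 0.4, -0.3, -0.6, 0.1, -0.1,
                0.6, 0.3, -0.4, -0.5, 0.2, 0.0],
    "arousal": [0.8, -0.2, 0.6, -0.4, 0.5, -0.3,
                0.7, -0.1, 0.5, -0.5, 0.4, -0.2],
})

# Fit a two-factor linear model for each affect dimension; a Type II ANOVA
# table then shows which factor accounts for more of the variance.
for outcome in ("valence", "arousal"):
    model = smf.ols(f"{outcome} ~ C(context) + C(style)", data=ratings).fit()
    print(f"--- {outcome} ---")
    print(sm.stats.anova_lm(model, typ=2))
```

Comparing the sum-of-squares rows for C(context) and C(style) in each table mirrors the paper's qualitative claim: under this toy data, context dominates the valence model and style dominates the arousal model.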



Acknowledgements

This work was conducted under IRB #17697 and supported by NSF Grants #1701295 and #1528036. Training activities funded by DARPA Grant #D16AP00001 and led by Catherine Maguire, Catie Cuan, and Riley Watts were critical to the paper. The authors also thank Lisa LaViers, a researcher in the Accounting Department at Emory University, for help in designing the implementation of the third study.

Author information

Corresponding author

Correspondence to Amy LaViers.

Ethics declarations

Conflict of interest

A. LaViers owns stock in AE Machines, an automation software company.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

The demographics of participants across all three studies are provided here. These questions were asked so that researchers conducting similar studies in the future would have more detail about the diversity of the sample populations (Tables 4, 5, 6, 7, 8, 9).

Table 4 Highest level of education completed by participants
Table 5 Native languages of participants
Table 6 Other languages spoken by participants
Table 7 Countries where participants spent the majority of their childhood
Table 8 Developed environment classifications of the areas where participants resided for the bulk of their childhoods
Table 9 Political beliefs of participants


About this article

Cite this article

Heimerdinger, M., LaViers, A. Modeling the Interactions of Context and Style on Affect in Motion Perception: Stylized Gaits Across Multiple Environmental Contexts. Int J of Soc Robotics 11, 495–513 (2019). https://doi.org/10.1007/s12369-019-00514-1

