Motion Control for Social Behaviors

Part of the book series: Human–Computer Interaction Series (HCIS)

Abstract

Creating social robots that can interact with humans autonomously is a growing and promising field of research, and the number of platforms and applications for social robots has increased significantly. However, robots are not yet able to interact with humans in a natural and believable way. This is especially true for physically realistic robots, which can be affected by the Uncanny Valley. This chapter looks at motion control for a physically realistic robot named Nadine. Controllers for such robots need to produce behaviours that match the physical realism of the robot. This chapter describes a robot controller that allows such a robot to fully use the same modalities as humans during interaction, including speech, facial expressions, and bodily expressions.
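
As a rough illustration of what such a multimodal controller involves, the sketch below queues behaviours and dispatches speech, facial, and bodily commands to separate output channels at the same time. This is a minimal sketch for illustration only, not the Nadine controller described in the chapter; every name in it (Behavior, Channel, MultimodalController, and so on) is hypothetical.

```python
import time
from dataclasses import dataclass

# Hypothetical sketch of a multimodal behaviour controller. None of
# these names come from the chapter; they only illustrate the idea of
# coordinating speech, facial and bodily expression in one interface.


@dataclass
class Behavior:
    """One synchronized multimodal act."""
    speech: str = ""          # text for a TTS engine
    face: str = "neutral"     # facial-expression label
    gesture: str = "idle"     # body-animation label
    duration: float = 1.0     # seconds before the next act starts


class Channel:
    """A single output modality (speech, face, or body)."""

    def __init__(self, name: str):
        self.name = name

    def play(self, command: str) -> None:
        # A real controller would drive actuators or a TTS engine here;
        # printing stands in for hardware output.
        print(f"[{self.name}] {command}")


class MultimodalController:
    """Queues behaviours and fires all modalities of each one together."""

    def __init__(self):
        self.channels = {m: Channel(m) for m in ("speech", "face", "body")}
        self.queue = []

    def schedule(self, behavior: Behavior) -> None:
        self.queue.append(behavior)

    def run(self) -> None:
        for b in self.queue:
            # Dispatch all modalities at once so speech, facial and
            # bodily expression stay synchronized.
            self.channels["speech"].play(b.speech or "<silence>")
            self.channels["face"].play(b.face)
            self.channels["body"].play(b.gesture)
            time.sleep(b.duration)


if __name__ == "__main__":
    ctrl = MultimodalController()
    ctrl.schedule(Behavior(speech="Hello!", face="smile", gesture="wave",
                           duration=1.5))
    ctrl.schedule(Behavior(face="neutral", gesture="idle", duration=0.5))
    ctrl.run()
```

In a real system each channel would run on its own thread against a shared clock rather than blocking on time.sleep; the point here is only the shape of the interface, in which one behaviour object carries all modalities so they cannot drift apart.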




Author information

Correspondence to Aryel Beck.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Beck, A., Zhang, Z., Magnenat-Thalmann, N. (2016). Motion Control for Social Behaviors. In: Magnenat-Thalmann, N., Yuan, J., Thalmann, D., You, BJ. (eds) Context Aware Human-Robot and Human-Agent Interaction. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-19947-4_11

  • DOI: https://doi.org/10.1007/978-3-319-19947-4_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-19946-7

  • Online ISBN: 978-3-319-19947-4

  • eBook Packages: Computer Science (R0)
