
Body Language for Personal Robot Arm Assistant

International Journal of Social Robotics

Abstract

This article presents an exploratory study aimed at (i) understanding participants' first impressions of the Cyton Gamma 1500, a small machine-like robot arm, and (ii) designing robot body language to convey specific intentions in a personal-assistant robotic application. The between-group study comprised three distinct groups of participants, each with a different first encounter with the robot arm: the first group saw the robot arm sitting idle; the second group watched another person working with the robot; the third group worked with the robot themselves. After this first encounter, participants rated the robot arm on a bipolar adjective questionnaire, providing insight into their "first impression". They then performed a categorization task in which they chose the most suitable posture for conveying specific messages; the objective was to select postures that can easily convey the robot assistant's intentions to future users. The two main findings are: (1) participants' impressions of the robot arm did not differ significantly across the three groups despite their different first encounters with the robot, and (2) certain postures are easily related to the message they convey (e.g., the robot is saying "Hi!"), while others are not (e.g., the robot has surrendered). In light of phenomenological theories of communication, we conclude that tempered forms of anthropomorphic or zoomorphic postures are more easily identified than completely abstract postures. The findings of this study can be used to design robot-specific body language for robot arms.

Figs. 1–6 (Fig. 2: ©2016 IEEE. Reprinted, with permission, from the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2016)



Acknowledgements

This research was supported in part by the Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Initiative, and by the Marcus Endowment Fund, both at Ben-Gurion University of the Negev.

Author information

Correspondence to Sridatta Chatterjee.

Ethics declarations

Conflict of interest

The authors declare they have no conflict of interest.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A

All postures used in the categorization task.


Appendix B

See Appendix Table 4.

Table 4 Results of the categorization task in all three conditions

Appendix C

Statistical analysis using Shannon entropy and the Hellinger coefficient.

The Hellinger integral was introduced by Ernst Hellinger in 1909; the coefficient derived from it quantifies the similarity between two distributions. For two discrete probability distributions, \(P=(p_1,\dots ,p_k)\) and \(Q=(q_1,\dots ,q_k)\), the Hellinger distance is defined as

$$Hel(P,Q)=\sqrt{\frac{1}{2}\sum_{i=1}^{k}\left(\sqrt{p_i}-\sqrt{q_i}\right)^{2}}$$

In the present context of the categorization task, we use the Hellinger distance to indicate how far the observed distribution is from the uniform distribution, where the observed distribution refers to participants' choices regarding a posture across the eight statements. The larger the Hellinger distance, the greater the difference between the observed distribution and the uniform distribution.
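As a minimal sketch of this computation (in Python, with hypothetical counts for illustration only, not the study's data), the Hellinger distance between an observed choice distribution and the uniform distribution over the eight statements can be computed as follows:

```python
import numpy as np

def hellinger_distance(p, q):
    """Hellinger distance between two discrete probability distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

# Hypothetical counts: how often one posture was assigned to each of
# the eight statements (illustrative numbers only).
counts = np.array([42, 3, 2, 1, 1, 1, 0, 0])
observed = counts / counts.sum()   # observed choice distribution
uniform = np.full(8, 1 / 8)        # uniform distribution over 8 statements

print(f"Hellinger distance from uniform: {hellinger_distance(observed, uniform):.3f}")
```

A posture whose choices concentrate on one statement, as in this example, yields a distance close to the maximum of 1, whereas evenly spread choices yield a distance near 0.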

The concept of Shannon entropy comes from information theory, where a message sent by a transmitter is modified as it passes through a channel, and the receiver attempts to infer the message that was sent. Shannon entropy gives the expected value (average) of the information contained in each message, and "messages" can be modelled by any flow of information. Here, Shannon entropy was used to obtain a numeric value for the strength of appeal of each posture: the smaller the value of the Shannon entropy, the clearer the message communicated by the posture.

Shannon defined the entropy, \(H\) (Greek capital letter eta), of a discrete probability distribution, \(P=({p}_{1},\dots ,{p}_{k})\), as

$$H(P)=-\sum_{i=1}^{k} p_i \log_{b}(p_i)$$

where \(b\) is the base of the logarithm. Common values of \(b\) are 2, Euler's number \(e\), and 10, with corresponding entropy units of shannons, nats, and hartleys. In our case \(k=8\) (eight statements), so we chose \(b=8\) so that \(H\) is bounded within the [0, 1] interval. In the case of no information, where \(p_i=1/8\) for all \(i\) (a uniform distribution), \(H(P)=1\); in the case of full information, where \(p_i=1\) for a single value of \(i\) and zero otherwise, \(H(P)=0\).
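A minimal sketch of this normalized, base-8 entropy (assuming the usual \(0\log 0=0\) convention for statements to which a posture was never matched):

```python
import numpy as np

def normalized_entropy(p, k=8):
    """Shannon entropy with logarithm base k, so the result lies in [0, 1].
    Zero-probability terms contribute nothing (0 * log 0 = 0 convention)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log(nz) / np.log(k)))

# Sanity checks matching the text:
print(normalized_entropy(np.full(8, 1 / 8)))           # 1.0 -> uniform, no information
print(normalized_entropy([1, 0, 0, 0, 0, 0, 0, 0]))    # 0.0 -> single choice, full information
```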


About this article


Cite this article

Chatterjee, S., Parmet, Y. & Oron-Gilad, T. Body Language for Personal Robot Arm Assistant. Int J of Soc Robotics 14, 15–37 (2022). https://doi.org/10.1007/s12369-021-00748-y
