Designing an Expressive Head for a Help Requesting Socially Assistive Robot

Conference paper. In: Human-Friendly Robotics 2019 (HFR 2019)

Abstract

In this paper, we present developments regarding an expressive robot head for our socially assistive mobile robot HERA, which, among other things, serves as an autonomous delivery system in public buildings. One aspect of that task is contacting and interacting with uninvolved people in order to get help when doors need to be opened or an elevator has to be used. We designed and tested a robot head comprising a pan-tilt unit, 3D-printed shells, animated eyes displayed on two LCD screens, and three arrays of RGB LEDs for communicating internal robot states and attracting potential helpers' interest. An online study was performed to compare variations of eye expression and LED lighting. Data were extracted from the answers of 139 participants. Statistical analysis showed significant differences in identification performance for our intended eye expressions, perceived politeness, help intentions, and hedonic user experience.

This project is funded by the German Federal Ministry of Education and Research (BMBF) within FRAME (16SV7829K).


Notes

  1. Head and eye movement: https://youtu.be/PHBMrr7HQzI.

  2. Eye emotions and blink: https://youtu.be/XrsamLVvKO8.


Author information

Correspondence to Tim van der Grinten.


Copyright information

© 2020 Springer Nature Switzerland AG


Cite this paper

van der Grinten, T., Müller, S., Westhoven, M., Wischniewski, S., Scheidig, A., Gross, HM. (2020). Designing an Expressive Head for a Help Requesting Socially Assistive Robot. In: Ferraguti, F., Villani, V., Sabattini, L., Bonfè, M. (eds) Human-Friendly Robotics 2019. HFR 2019. Springer Proceedings in Advanced Robotics, vol 12. Springer, Cham. https://doi.org/10.1007/978-3-030-42026-0_7
