
Intention Understanding for Human-Aware Mobile Robots: Comparing Cues and the Effect of Demographics

  • Conference paper
  • First Online:
Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2020)

Abstract

Mobile robots are becoming increasingly ubiquitous in our everyday living environments, so it is important that people can easily interpret a robot’s intentions. This is especially true when a robot drives down a crowded corridor, where people in its vicinity need to understand which way it wants to go next. To explore which signals best convey a robot’s intention to turn, we implemented three lighting schemes and tested them in an online experiment. We found that signals resembling automotive signaling also work best for logistic mobile robots. We further found that people’s opinions of these signaling methods are influenced by their demographic background (gender and age).
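The paper’s lighting implementation is not reproduced here; the following is a minimal illustrative sketch of what an automotive-style turn indicator on a robot-mounted LED strip could look like. The strip size, blink timing, and all names (NUM_LEDS, frame, signal_turn) are assumptions made for illustration, not the authors’ code, and the hardware is simulated as printed text rather than driven over a real LED interface.

```python
import time

# Hypothetical illustration only: an automotive-style turn signal for a mobile
# robot, simulated on an abstract LED strip. Strip layout, timing, and helper
# names are assumptions, not the implementation evaluated in the paper.

NUM_LEDS = 8          # LEDs across the front of the robot (assumed)
BLINK_PERIOD = 0.5    # seconds per on/off phase, similar to a car indicator


def frame(direction: str, on: bool) -> str:
    """Render one blink phase of the strip as text ('O' = lit, '.' = off)."""
    leds = ["."] * NUM_LEDS
    if on:
        half = NUM_LEDS // 2
        # Light only the half of the strip on the side the robot will turn to.
        idx = range(half) if direction == "left" else range(half, NUM_LEDS)
        for i in idx:
            leds[i] = "O"
    return "".join(leds)


def signal_turn(direction: str, blinks: int = 4) -> None:
    """Blink the chosen side of the strip, car-indicator style."""
    for _ in range(blinks):
        print(frame(direction, on=True))
        time.sleep(BLINK_PERIOD)
        print(frame(direction, on=False))
        time.sleep(BLINK_PERIOD)


if __name__ == "__main__":
    signal_turn("left")
```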

Notes

  1. See a video of the implemented conditions here: https://youtu.be/J6jtDH6ZSuw.


Acknowledgements

This work was supported by the project Health-CAT, funded by the European Regional Development Fund.

Author information

Correspondence to Oskar Palinko.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Palinko, O., Ramirez, E.R., Krüger, N., Bodenhagen, L. (2022). Intention Understanding for Human-Aware Mobile Robots: Comparing Cues and the Effect of Demographics. In: Bouatouch, K., et al. Computer Vision, Imaging and Computer Graphics Theory and Applications. VISIGRAPP 2020. Communications in Computer and Information Science, vol 1474. Springer, Cham. https://doi.org/10.1007/978-3-030-94893-1_4

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-94893-1_4

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-94892-4

  • Online ISBN: 978-3-030-94893-1

  • eBook Packages: Computer Science, Computer Science (R0)
