
AR Point&Click: An Interface for Setting Robot Navigation Goals

  • Conference paper
  • First Online:
Social Robotics (ICSR 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13817)


Abstract

This paper considers the problem of designating navigation goal locations for interactive mobile robots. We investigate a point-and-click interface implemented with an Augmented Reality (AR) headset. The cameras on the AR headset are used to detect natural pointing gestures performed by the user. The selected goal is visualized through the AR headset, allowing the user to adjust the goal location if desired. We conduct a user study in which participants set consecutive navigation goals for the robot using three different interfaces: AR Point&Click, Person Following, and Tablet (bird's-eye map view). Results show that the proposed AR Point&Click interface improved perceived accuracy and efficiency and reduced mental load compared to the baseline tablet interface, and that it performed on par with the Person Following method. These results show that AR Point&Click is a feasible interaction model for setting navigation goals.
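The abstract describes converting a detected pointing gesture into a navigation goal on the floor. One common way to do this (the paper does not specify its exact method, so the geometry and names below are illustrative assumptions) is to cast a ray from the user's head through the tracked hand and intersect it with the ground plane:

```python
def pointing_goal(head, hand, floor_z=0.0):
    """Intersect the head->hand pointing ray with the plane z = floor_z.

    head, hand: (x, y, z) positions in a world frame with z pointing up,
    e.g. as reported by an AR headset's head and hand tracking.
    Returns the (x, y) goal on the floor, or None if the ray never
    reaches the floor in front of the user.
    """
    dx, dy, dz = (hand[i] - head[i] for i in range(3))
    if abs(dz) < 1e-9:          # ray is parallel to the floor
        return None
    t = (floor_z - head[2]) / dz
    if t <= 0:                  # intersection is behind the pointing direction
        return None
    return (head[0] + t * dx, head[1] + t * dy)


# Example: head at 1.7 m, hand 0.3 m forward and 0.3 m lower.
goal = pointing_goal((0.0, 0.0, 1.7), (0.3, 0.0, 1.4))
```

The returned point could then be sent to the robot's navigation stack as a goal pose, with the AR overlay drawing a marker at the same world coordinates so the user can confirm or adjust it.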


Notes

  1. Due to the COVID-19 pandemic, no external participants could be recruited. This study was approved by the Monash University Human Research Ethics Committee (Application ID: 27685).


Acknowledgement

This project was supported by the Australian Research Council (ARC) Discovery Project Grant DP200102858.

Author information


Corresponding author

Correspondence to Morris Gu.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Gu, M., Croft, E., Cosgun, A. (2022). AR Point&Click: An Interface for Setting Robot Navigation Goals. In: Cavallo, F., et al. Social Robotics. ICSR 2022. Lecture Notes in Computer Science, vol 13817. Springer, Cham. https://doi.org/10.1007/978-3-031-24667-8_4


  • DOI: https://doi.org/10.1007/978-3-031-24667-8_4


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-24666-1

  • Online ISBN: 978-3-031-24667-8

  • eBook Packages: Computer Science, Computer Science (R0)
