Abstract
This paper considers the problem of designating navigation goal locations for interactive mobile robots. We investigate a point-and-click interface implemented with an Augmented Reality (AR) headset. The cameras on the AR headset are used to detect natural pointing gestures performed by the user, and the selected goal is visualized through the headset, allowing the user to adjust the goal location if desired. We conduct a user study in which participants set consecutive navigation goals for the robot using three different interfaces: AR Point & Click, Person Following, and Tablet (bird's-eye map view). Results show that the proposed AR Point & Click interface improved perceived accuracy and efficiency and reduced mental load compared to the baseline tablet interface, and that it performed on par with the Person Following method. These findings indicate that AR Point & Click is a feasible interaction model for setting navigation goals.
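The implementation details of the interface are not included on this page. As a rough illustration of the underlying idea only, the sketch below shows one common way to convert a detected pointing ray into a 2D navigation goal by intersecting it with the ground plane; the function name, coordinate frame, and parameters are illustrative assumptions, not the authors' code.

```python
import numpy as np

def pointing_ray_to_nav_goal(hand_pos, hand_dir, floor_height=0.0):
    """Project a pointing ray onto the floor plane to obtain a 2D navigation goal.

    hand_pos: 3D position of the pointing hand in the map frame (metres).
    hand_dir: 3D direction of the pointing ray (e.g. eye-to-fingertip).
    Returns (x, y) of the intersection with the plane z = floor_height,
    or None if the ray does not hit the floor in front of the user.
    """
    hand_pos = np.asarray(hand_pos, dtype=float)
    hand_dir = np.asarray(hand_dir, dtype=float)
    hand_dir = hand_dir / np.linalg.norm(hand_dir)

    # Ray: p(t) = hand_pos + t * hand_dir; intersect with z = floor_height.
    dz = hand_dir[2]
    if abs(dz) < 1e-6:          # ray is parallel to the floor
        return None
    t = (floor_height - hand_pos[2]) / dz
    if t <= 0:                  # intersection lies behind the hand
        return None
    goal = hand_pos + t * hand_dir
    return float(goal[0]), float(goal[1])

# Example: a user pointing ahead and downwards from about 1.4 m above the floor.
print(pointing_ray_to_nav_goal([0.2, 0.0, 1.4], [0.8, 0.1, -0.5]))
```

In practice the resulting goal would be displayed as a virtual marker through the AR headset so the user can confirm or adjust it before it is sent to the robot's navigation stack.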
Notes
1. Due to the COVID-19 pandemic, no external participants could be recruited. This study was approved by the Monash University Human Research Ethics Committee (Application ID: 27685).
Acknowledgement
This project was supported by the Australian Research Council (ARC) Discovery Project Grant DP200102858.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Gu, M., Croft, E., Cosgun, A. (2022). AR Point & Click: An Interface for Setting Robot Navigation Goals. In: Cavallo, F., et al. Social Robotics. ICSR 2022. Lecture Notes in Computer Science, vol. 13817. Springer, Cham. https://doi.org/10.1007/978-3-031-24667-8_4
DOI: https://doi.org/10.1007/978-3-031-24667-8_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-24666-1
Online ISBN: 978-3-031-24667-8
eBook Packages: Computer Science, Computer Science (R0)