
Gesture-based guidance for navigation in virtual environments

  • Original Paper
Journal on Multimodal User Interfaces

Abstract

Complex virtual environments (VEs) employ a variety of navigation aids (text, audio, maps, arrows, etc.) to support users' task completion and performance. Most existing studies rely on cognitive cues to assist users with navigation and path selection in VEs; however, novices also need to know which gesture to execute in order to perform a given navigation action. This paper proposes a new type of navigation aid that combines visual guidance with gestural interaction during navigation tasks. The proposed aids provide two-fold guidance in the VE: they help users select the correct path and pose the correct gestures. In addition, the paper introduces fingertip pointing-based gestures for realistic navigation inside the VE, achieving high performance and usability with simple, lightweight gestures. The proposed aids (gesture guides) are compared with existing aids (audio, textual, arrow-casting + textual, and 3D map + textual) in terms of performance and usability. A VE was designed and implemented in OpenGL for the experiments; the Leap Motion controller was used for hand gesture recognition and interaction with the VE, and the System Usability Scale (SUS) was used to assess system usability. Experimental results show improved performance and higher usability for the proposed aids compared with the audio, textual, arrow-casting + textual, and 3D map + textual aids.
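The SUS assessment mentioned above follows Brooke's standard scoring procedure: each of the ten Likert items (1–5) contributes 0–4 points (odd-numbered items score response minus 1, even-numbered items score 5 minus response), and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch of that computation (the function name and example responses are illustrative, not taken from the paper):

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    Likert responses (1-5). Odd-numbered items are positively worded
    (score = response - 1); even-numbered items are negatively worded
    (score = 5 - response). The summed contributions are scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses in the range 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# All-neutral responses (3 on every item) land exactly at the midpoint.
print(sus_score([3] * 10))  # → 50.0
```

Per-participant scores computed this way are then typically averaged across participants to obtain the per-aid usability figures reported in studies of this kind.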



Author information


Corresponding author

Correspondence to Inam Ur Rehman.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Rehman, I.U., Ullah, S., Ali, N. et al. Gesture-based guidance for navigation in virtual environments. J Multimodal User Interfaces 16, 371–383 (2022). https://doi.org/10.1007/s12193-022-00395-1

