Abstract
Gesture-based interfaces offer a suitable platform for interaction in Virtual Environments (VE). However, the difficulty of learning and performing distinct gestures degrades the performance of an interactive system. By incorporating computer vision in Virtual Reality (VR), this paper presents an intuitive interaction technique in which the states and positions of the eyes are traced for interaction. With comparatively low cognitive load, the technique offers an easy-to-use interaction solution for VR applications. Unlike other gestural interfaces, interactions are performed in distinct phases, and the transition from one phase to the next is triggered by a simple eye blink. Within the active phase, interaction along an arbitrary axis is performed by an intuitive head gesture: rolling, pitching, or yawing. To follow the trajectory of the eyes in real time, coordinate mapping is performed dynamically. The proposed technique is implemented in a case-study project, EBI (Eyes Blinking based Interaction). In the EBI project, real-time detection and tracking of the eyes are performed at the back-end, while at the front-end the virtual scene is rendered accordingly using the OpenGL library. To assess the accuracy, usability, and cognitive load of the proposed technique, the EBI project was evaluated 280 times in three different evaluation sessions. With an ordinary camera, an average accuracy of 81.4% was achieved, and an assessment with a high-quality camera indicated that the accuracy of the system could be raised further. Overall, the findings of the evaluations support the applicability of the technique in the emerging domains of VR.
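As a rough illustration of the blink-driven phase switching described in the abstract, the following sketch detects the eyes in each webcam frame with an OpenCV Haar cascade and treats a short run of frames with no visible eyes as a deliberate blink that advances the interaction phase. This is a minimal sketch under stated assumptions, not the EBI implementation: the cascade file, the phase list, and the three-frame blink threshold are illustrative choices, and the head-gesture tracking and OpenGL rendering described in the paper are omitted.

```python
# Minimal sketch (not the authors' implementation) of blink-driven phase
# switching: eyes are detected per frame with an OpenCV Haar cascade, and a
# short run of eyeless frames is treated as a deliberate blink that advances
# the interaction phase. Thresholds and the phase list are assumptions.
import cv2

EYE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")
PHASES = ["selection", "translation", "rotation"]  # assumed phase order
BLINK_FRAMES = 3  # consecutive eyeless frames counted as one blink

def run(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    phase, missed = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        eyes = EYE_CASCADE.detectMultiScale(gray, scaleFactor=1.1,
                                            minNeighbors=5)
        if len(eyes) == 0:
            missed += 1
            if missed == BLINK_FRAMES:  # blink completed: switch phase
                phase = (phase + 1) % len(PHASES)
                print("switched to phase:", PHASES[phase])
        else:
            missed = 0
            # In a full system, the detected eye centres would be mapped to
            # scene coordinates and passed to the renderer to drive the
            # interaction of the current phase.
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()

if __name__ == "__main__":
    run()
```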
About this article
Cite this article
Raees, M., Ullah, S. THE-3DI: Tracing head and eyes for 3D interactions. Multimed Tools Appl 79, 1311–1337 (2020). https://doi.org/10.1007/s11042-019-08305-6