Performing Locomotion Tasks in Immersive Computer Games with an Adapted Eye-Tracking Interface

Published: 01 September 2013

Abstract

Young people with severe physical disabilities may benefit greatly from participating in immersive computer games. In-game tasks can be fun, engaging, educational, and socially interactive. But for those who are unable to use traditional methods of computer input, such as a mouse and keyboard, there is a barrier to interaction that must first be overcome. Eye-gaze interaction is one input method that can potentially achieve the levels of interaction these games require. Which gaze interaction technique is appropriate depends upon the task being performed, the individual performing it, and the equipment available. To fully realize the impact of participation in these environments, techniques need to be adapted to the person’s abilities. We describe an approach to designing and adapting a gaze interaction technique to support locomotion, a task central to immersive game playing. The approach is evaluated by a group of young people with cerebral palsy and muscular dystrophy. The results show that, by adapting the interaction technique, participants are able to significantly improve their in-game character control.
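To illustrate the kind of adaptation the abstract describes, the sketch below maps raw gaze coordinates to locomotion commands via screen regions whose sizes can be tuned per user. This is a minimal illustrative example, not the authors' implementation: the `GazeRegions` class, the `turn_margin` parameter, and the specific region layout are all assumptions introduced here for clarity.

```python
# Illustrative sketch (NOT the article's actual implementation): gaze-driven
# locomotion via screen regions. Widening `turn_margin` gives users with less
# precise gaze control larger turn zones -- one simple form of per-user
# adaptation of a gaze interaction technique.
from dataclasses import dataclass


@dataclass
class GazeRegions:
    """Screen split into 'turn' side zones, a central 'walk' zone,
    and a bottom 'stop' strip. All names and thresholds are hypothetical."""
    width: int
    height: int
    turn_margin: float = 0.2  # fraction of width per turn zone; adaptable per user

    def command(self, x: float, y: float) -> str:
        # Gaze resting near the bottom of the screen halts the avatar.
        if y > self.height * 0.9:
            return "stop"
        if x < self.width * self.turn_margin:
            return "turn_left"
        if x > self.width * (1.0 - self.turn_margin):
            return "turn_right"
        return "walk_forward"


regions = GazeRegions(width=1920, height=1080)
print(regions.command(100, 500))    # gaze far left  -> turn_left
print(regions.command(960, 500))    # gaze at centre -> walk_forward
print(regions.command(960, 1050))   # gaze at bottom -> stop
```

In practice a real system would also smooth the noisy gaze signal and apply a dwell time before switching commands, so that brief glances do not jerk the avatar around.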



      Published In

      ACM Transactions on Accessible Computing  Volume 5, Issue 1
      September 2013
      59 pages
      ISSN:1936-7228
      EISSN:1936-7236
      DOI:10.1145/2531922

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 01 September 2013
      Accepted: 01 June 2013
      Revised: 01 May 2013
      Received: 01 October 2012
      Published in TACCESS Volume 5, Issue 1


      Author Tags

1. AAC
2. accessible computing
3. adaptive interface
4. disability
5. eye-gaze
6. eye-tracking
7. game accessibility

      Qualifiers

      • Research-article
      • Research
      • Refereed


      Cited By

• (2024) Snap, Pursuit and Gain: Virtual Reality Viewport Control by Gaze. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1--14. DOI: 10.1145/3613904.3642838.
• (2024) Actively Viewing the Audio-Visual Media: An Eye-Controlled Application for Experience and Learning in Traditional Chinese Painting Exhibitions. International Journal of Human–Computer Interaction, 1--29. DOI: 10.1080/10447318.2024.2371691.
• (2022) A systematic mapping study on digital game adaptation dimensions. In Proceedings of the 21st Brazilian Symposium on Human Factors in Computing Systems, 1--14. DOI: 10.1145/3554364.3559122.
• (2022) Methodological Standards in Accessibility Research on Motor Impairments: A Survey. ACM Computing Surveys 55, 7, 1--35. DOI: 10.1145/3543509.
• (2022) A systematic literature review for the use of eye-tracking in special education. Education and Information Technologies 28, 6, 6515--6540. DOI: 10.1007/s10639-022-11456-z.
• (2021) Biofeedback Methods in Entertainment Video Games. Proceedings of the ACM on Human-Computer Interaction 5, CHI PLAY, 1--32. DOI: 10.1145/3474695.
• (2021) A Multimodal Direct Gaze Interface for Wheelchairs and Teleoperated Robots. In Proceedings of the 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), 4796--4800. DOI: 10.1109/EMBC46164.2021.9630471.
• (2021) Prospective on Eye-Tracking-based Studies in Immersive Virtual Reality. In Proceedings of the IEEE 24th International Conference on Computer Supported Cooperative Work in Design (CSCWD), 861--866. DOI: 10.1109/CSCWD49262.2021.9437692.
• (2020) Semi-Autonomous Robotic Avatar System for Patients with Motor Disabilities. In Proceedings of the Joint 11th International Conference on Soft Computing and Intelligent Systems and 21st International Symposium on Advanced Intelligent Systems (SCIS-ISIS), 1--5. DOI: 10.1109/SCISISIS50064.2020.9322771.
• (2020) Cognitive Modeling Based on Perceiving-Acting Cycle in Robotic Avatar System for Disabled Patients. In Proceedings of the International Joint Conference on Neural Networks (IJCNN), 1--6. DOI: 10.1109/IJCNN48605.2020.9206705.
