DOI: 10.1145/1056808.1056873
Article

Combining head tracking and mouse input for a GUI on multiple monitors

Published: 02 April 2005

Abstract

The use of multiple LCD monitors is becoming popular as prices fall, but it creates problems for window management and for switching between applications. On a single monitor, eye tracking can be combined with the mouse to reduce mouse movement, but with several monitors the head moves through a large range of positions and angles, which makes eye tracking difficult. We therefore use head tracking to switch the mouse pointer between monitors and the mouse to move within each monitor. In our experiment, users required significantly less mouse movement with the tracking system and preferred using it, although task time actually increased. A graphical prompt (a flashing star) prevented users from losing the pointer when switching monitors. We discuss our results and ideas for further development.
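The scheme is a coarse-to-fine division of labour: the head tracker decides which monitor the pointer belongs on, and the mouse handles fine positioning within that monitor. The paper does not include source code, so the following is only a minimal sketch of the idea, assuming a horizontal three-monitor layout, a hypothetical head tracker reporting yaw in degrees, and hypothetical warp_pointer/show_prompt callbacks supplied by the windowing system.

```python
# Minimal sketch (not the authors' implementation) of combining coarse head
# tracking with fine mouse control across multiple monitors.

# Assumed layout: three monitors side by side, each 1280x1024 pixels.
MONITORS = [
    {"x_origin": 0,    "width": 1280, "height": 1024},
    {"x_origin": 1280, "width": 1280, "height": 1024},
    {"x_origin": 2560, "width": 1280, "height": 1024},
]

# Assumed yaw boundaries (degrees) separating the three monitors.
YAW_THRESHOLDS = [-15.0, 15.0]


def monitor_from_yaw(yaw_deg: float) -> int:
    """Map a head yaw angle to the index of the monitor the user is facing."""
    for index, threshold in enumerate(YAW_THRESHOLDS):
        if yaw_deg < threshold:
            return index
    return len(YAW_THRESHOLDS)


def update_pointer_monitor(current: int, yaw_deg: float,
                           warp_pointer, show_prompt) -> int:
    """Warp the pointer only when the head faces a different monitor.

    warp_pointer(x, y) and show_prompt(x, y) are hypothetical callbacks into
    the windowing system; within a monitor the mouse moves the pointer as usual.
    """
    target = monitor_from_yaw(yaw_deg)
    if target != current:
        m = MONITORS[target]
        x = m["x_origin"] + m["width"] // 2   # centre of the target monitor
        y = m["height"] // 2
        warp_pointer(x, y)
        show_prompt(x, y)  # e.g. a brief flashing star so the pointer is not lost
    return target


if __name__ == "__main__":
    # Toy demonstration with printed output instead of real OS calls.
    current = 1
    for yaw in [0.0, 20.0, 21.0, -30.0]:
        current = update_pointer_monitor(
            current, yaw,
            warp_pointer=lambda x, y: print(f"warp pointer to ({x}, {y})"),
            show_prompt=lambda x, y: print(f"flash star at ({x}, {y})"),
        )
```

In this sketch the flashing-star prompt corresponds to show_prompt, which fires only when the pointer jumps to a new monitor, so ordinary within-monitor mouse movement is left untouched.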




    Published In

    CHI EA '05: CHI '05 Extended Abstracts on Human Factors in Computing Systems
    April 2005
    1358 pages
ISBN: 1595930027
DOI: 10.1145/1056808


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. attentive user interface
    2. gaze-contingent display
    3. head tracking
    4. multiple monitors




