Research article
DOI: 10.1145/2470654.2470695

Still looking: investigating seamless gaze-supported selection, positioning, and manipulation of distant targets

Published: 27 April 2013

Abstract

We investigate how to seamlessly bridge the gap between users and distant displays for basic interaction tasks, such as object selection and manipulation. For this, we take advantage of very fast and implicit, yet imprecise gaze- and head-directed input in combination with ubiquitous smartphones for additional manual touch control. We have carefully elaborated two novel and consistent sets of gaze-supported interaction techniques based on touch-enhanced gaze pointers and local magnification lenses. These conflict-free sets allow for fluently selecting and positioning distant targets. Both sets were evaluated in a user study with 16 participants. Overall, users were fastest with a touch-enhanced gaze pointer for selecting and positioning an object after some training. While the positive user feedback for both sets suggests that our proposed gaze- and head-directed interaction techniques are suitable for a convenient and fluent selection and manipulation of distant targets, further improvements are necessary for more precise cursor control.
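
The abstract describes the interaction design only at a high level; the sketch below illustrates one plausible reading of a touch-enhanced gaze pointer, in which fast but imprecise gaze coarsely warps a cursor on the distant display and relative touch drags on the smartphone refine it. This is not the authors' implementation: the class name, the control-display gain, and the smoothing constant are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): gaze for coarse cursor
# placement, smartphone touch for fine refinement and confirmation.

from dataclasses import dataclass

@dataclass
class GazeTouchCursor:
    x: float = 0.0          # cursor position on the distant display (pixels)
    y: float = 0.0
    refining: bool = False  # True while a finger rests on the smartphone
    gain: float = 2.0       # touch-to-display gain (illustrative value)
    alpha: float = 0.3      # low-pass factor damping gaze jitter (illustrative)

    def on_gaze(self, gx: float, gy: float) -> None:
        # Coarse positioning: follow the (noisy) gaze estimate, smoothed,
        # unless the user has taken over with touch.
        if not self.refining:
            self.x += self.alpha * (gx - self.x)
            self.y += self.alpha * (gy - self.y)

    def on_touch_down(self) -> None:
        # Touching the handheld freezes gaze control of the cursor.
        self.refining = True

    def on_touch_move(self, dx: float, dy: float) -> None:
        # Fine positioning: relative touch deltas scaled by a CD gain.
        if self.refining:
            self.x += self.gain * dx
            self.y += self.gain * dy

    def on_touch_up(self) -> tuple[float, float]:
        # Lifting the finger confirms the action at the refined position.
        self.refining = False
        return (self.x, self.y)
```

Gating gaze input while a finger is down mirrors MAGIC-style pointing: the fast but imprecise channel gets the cursor near the target, and the slower, precise channel finishes the task without the two conflicting.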

Supplementary Material

suppl.mov (chi0463-file3.mp4)
Supplemental video

    Published In

    CHI '13: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    April 2013
    3550 pages
    ISBN: 9781450318990
    DOI: 10.1145/2470654

    Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. distant displays
    2. eye tracking
    3. gaze interaction
    4. mobile touch input
    5. positioning
    6. selection
    7. visual attention

    Acceptance Rates

    CHI '13 Paper Acceptance Rate: 392 of 1,963 submissions, 20%
    Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%

    Article Metrics

    • Downloads (last 12 months): 87
    • Downloads (last 6 weeks): 19
    Reflects downloads up to 16 Feb 2025


    Cited By

    • (2024) Eye-Hand Movement of Objects in Near Space Extended Reality. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-13. DOI: 10.1145/3654777.3676446. Online publication date: 13-Oct-2024
    • (2024) Exploiting Physical Referent Features as Input for Multidimensional Data Selection in Augmented Reality. ACM Transactions on Computer-Human Interaction 31(4), 1-40. DOI: 10.1145/3648613. Online publication date: 19-Sep-2024
    • (2024) Interactive Visualization on Large High-Resolution Displays: A Survey. Computer Graphics Forum 43(6). DOI: 10.1111/cgf.15001. Online publication date: 30-Apr-2024
    • (2024) The impact of visual and motor space size on gaze-based target selection. Australian Journal of Psychology 76(1). DOI: 10.1080/00049530.2024.2309384. Online publication date: 5-Feb-2024
    • (2024) Förderlicher Entwurf cyber-physischer Produktionssysteme [Conducive design of cyber-physical production systems]. Handbuch Industrie 4.0, 189-223. DOI: 10.1007/978-3-662-58528-3_132. Online publication date: 8-Nov-2024
    • (2023) Exploring 3D Interaction with Gaze Guidance in Augmented Reality. 2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 22-32. DOI: 10.1109/VR55154.2023.00018. Online publication date: Mar-2023
    • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction 6(10), 94. DOI: 10.3390/mti6100094. Online publication date: 19-Oct-2022
    • (2022) HandyGaze: A Gaze Tracking Technique for Room-Scale Environments using a Single Smartphone. Proceedings of the ACM on Human-Computer Interaction 6(ISS), 143-160. DOI: 10.1145/3567715. Online publication date: 14-Nov-2022
    • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832-842. DOI: 10.1109/VR51125.2022.00105. Online publication date: Mar-2022
    • (2022) An Evaluation of Caret Navigation Methods for Text Editing in Augmented Reality. 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 640-645. DOI: 10.1109/ISMAR-Adjunct57072.2022.00132. Online publication date: Oct-2022
    • Show More Cited By
