DOI: 10.1145/1117309.1117340
Article

Eye typing with common cameras

Published: 27 March 2006

ABSTRACT

Low-cost eye tracking has received increased attention due to the rapid developments in tracking hardware (video boards, digital cameras, and CPUs) [Hansen and Pece 2005; OpenEyes 2005]. We present a gaze typing system based on components that can be bought in most consumer hardware stores around the world, such as cameras and graphics cards that are produced in large quantities. This kind of hardware differs from what is often claimed to be "off-the-shelf components" but is in fact hardware only available from particular vendors.

Institutions that supply citizens with communication aids may be reluctant to invest large amounts of money in new equipment that they are unfamiliar with. Recent investigations estimate that fewer than 2000 systems have actually been used by Europeans, even though more than half a million disabled people in Europe could potentially benefit from gaze communication. The main group of present users consists of people with motor neuron disease (MND) and amyotrophic lateral sclerosis (ALS). If the price of gaze communication systems can be lowered, they could become a preferred means of control for a large group of people [Jordansen et al. 2005]. Present commercial gaze trackers, e.g. [Tobii 2005; LC-Technologies 2004], are easy to use, robust, and sufficiently accurate for many screen-based applications, but their cost exceeds the budget of most people.

We use a standard uncalibrated $400 Sony consumer camera (Sony Handycam DCR-HC14E) to obtain the image data. The camera is stationary and placed on a tripod close to the monitor, but the geometry of user, monitor, and camera varies among sequences. The users sit about 50-60 cm away from a 17" screen; a typical example of the setup is shown in figure 1. We use the camera's standard 'night vision' video option to create a glint with the built-in IR light emitter.

Eye tracking based on common components is subject to several unknown factors, as various system parameters (i.e. camera parameters and geometry) are unknown. Algorithms that employ robust statistical principles to accommodate uncertainties in the image data as well as in the gaze estimates used during typing are therefore needed. We propose to use the RANSAC algorithm [Fischler and Bolles 1981] both for robust maximum likelihood estimation of iris observations [Hansen and Pece 2005] and for handling outliers in the calibration procedure [Morimoto et al. 2000].

Our low-resolution gaze tracker can be calibrated in less than 3 minutes by looking at 9 predefined positions on the screen. The users sit on a standard office chair without headrests or other physical constraints. Under these conditions we have succeeded in tracking the gaze of people, obtaining on-screen accuracies of about 160 pixels. This is still considerably less accurate than the best current off-the-shelf eye trackers, which claim 30-60 pixels; however, a direct comparison would not be fair, since the systems are based on different hardware and image data.

Low-cost gaze trackers do not need to be as accurate and robust as the commercial systems if they are used together with applications designed to tolerate noisy input. We use components of the GazeTalk [COGAIN 2005] typing communication system and have, through careful design of the typing interface, reduced the need for high accuracy. We have observed typing speeds in the range of 3-5 words per minute for untrained subjects using large on-screen buttons and a new noise-tolerant dwell-time principle. We modify the traditional dwell-time activation to one that maintains a full distribution over all hypothetical button selections and activates a button only when the evidence for it becomes high enough.
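
To make the robust calibration idea concrete, the following is a minimal RANSAC sketch in Python that fits a gaze-to-screen mapping from the 9 calibration points while rejecting outliers. The abstract only states that RANSAC is used for outlier handling in the calibration procedure of Morimoto et al. [2000]; the affine mapping model, the 40-pixel inlier threshold, and all function names here are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def fit_affine(eye, screen):
    # Least-squares affine map [ex, ey, 1] -> (sx, sy); M is a 3x2 matrix.
    A = np.hstack([eye, np.ones((len(eye), 1))])
    M, _, _, _ = np.linalg.lstsq(A, screen, rcond=None)
    return M

def apply_map(M, eye):
    return np.hstack([eye, np.ones((len(eye), 1))]) @ M

def ransac_calibrate(eye, screen, n_iter=200, inlier_thresh=40.0):
    # RANSAC [Fischler and Bolles 1981]: repeatedly fit the model to a
    # minimal random sample, keep the hypothesis with the most inliers,
    # then refit using all inliers.  inlier_thresh is in screen pixels
    # (an assumed value, not taken from the paper).
    rng = np.random.default_rng(0)
    best = np.zeros(len(eye), dtype=bool)
    for _ in range(n_iter):
        sample = rng.choice(len(eye), size=3, replace=False)  # 3 points fix an affine map
        M = fit_affine(eye[sample], screen[sample])
        err = np.linalg.norm(apply_map(M, eye) - screen, axis=1)
        inliers = err < inlier_thresh
        if inliers.sum() > best.sum():
            best = inliers
    return fit_affine(eye[best], screen[best])

# Usage with the 9-point grid described above: eye is a (9, 2) array of
# measured eye-feature coordinates, screen the (9, 2) known targets;
# M = ransac_calibrate(eye, screen); gaze = apply_map(M, current_eye).
```

The benefit over a plain least-squares fit is that a blink or mistracked frame during calibration perturbs only the discarded samples rather than the final mapping.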

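The noise-tolerant dwell-time principle from the final paragraph can likewise be sketched in a few lines: accumulate evidence for every button from each noisy gaze sample and fire only when one hypothesis dominates. The Gaussian likelihood, the sigma of 160 pixels (matching the reported accuracy), and the 0.95 posterior threshold are illustrative assumptions; the abstract does not specify the authors' evidence model.

```python
import numpy as np

def dwell_select(gaze_stream, button_centers, sigma=160.0, threshold=0.95):
    # Maintain a posterior distribution over all hypothetical button
    # selections and activate a button only once the accumulated
    # evidence for it is high enough (instead of a fixed dwell time).
    # button_centers: (K, 2) array of on-screen button positions.
    log_post = np.zeros(len(button_centers))       # uniform prior over buttons
    for gaze in gaze_stream:                       # one noisy (x, y) sample per frame
        d2 = np.sum((button_centers - gaze) ** 2, axis=1)
        log_post += -d2 / (2.0 * sigma ** 2)       # Gaussian log-likelihood (assumed model)
        p = np.exp(log_post - log_post.max())      # normalize stably
        p /= p.sum()
        if p.max() >= threshold:
            return int(p.argmax())                 # evidence high enough: select
    return None                                    # stream ended without a selection
```

Because sigma is large relative to single-sample precision, no individual noisy fixation can trigger a selection; the decision emerges from consistent evidence over many frames, which is what makes the scheme tolerant of roughly 160-pixel gaze noise.
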
References

  1. COGAIN, 2005. http://www.cogain.org/.
  2. Fischler, M. A., and Bolles, R. C. 1981. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 24, 6, 381--395.
  3. Hansen, D. W., and Pece, A. E. 2005. Eye tracking in the wild. 182--210.
  4. Jordansen, I., Boedeker, S., Donegan, M., Oosthuizen, L., Girolamo, M., and Hansen, J. P. 2005. Report on a market study and demographics of user population. COGAIN IST-2003-511598, Deliverable 7.2. http://www.cogain.org/results/reports/COGAIN-D7.2.pdf.
  5. LC-Technologies, 2004. http://www.eyegaze.com.
  6. Morimoto, C., Koons, D., Amir, A., and Flickner, M. 2000. Pupil detection and tracking using multiple light sources. IVC 18, 4, 331--335.
  7. OpenEyes, 2005. http://hcvl.hci.iastate.edu/openeyes.
  8. Tobii, 2005. http://www.tobii.se/.

Published in

          ETRA '06: Proceedings of the 2006 symposium on Eye tracking research & applications
          March 2006
          175 pages
          ISBN: 1595933050
          DOI: 10.1145/1117309

          Copyright © 2006 ACM

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

          Publisher

          Association for Computing Machinery

          New York, NY, United States



          Acceptance Rates

          Overall acceptance rate: 69 of 137 submissions, 50%

