DOI: 10.1145/2785830.2785869

research-article

On the Applicability of Computer Vision based Gaze Tracking in Mobile Scenarios

Published: 24 August 2015

ABSTRACT

Gaze tracking is a common technique for studying user interaction and is increasingly used as an input modality. In this regard, computer vision based systems provide a promising low-cost realization of gaze tracking on mobile devices. This paper complements related work on algorithmic designs by conducting two user studies that aim to i) independently evaluate EyeTab as a promising gaze tracking approach and ii) provide the first independent, use-case-driven evaluation of its applicability in mobile scenarios. Our evaluation elucidates the current state of mobile computer vision based gaze tracking and aims to pave the way for improved algorithms. To further foster this development, we release our source data to the public as a reference database.


Published in

MobileHCI '15: Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services
August 2015
611 pages
ISBN: 9781450336529
DOI: 10.1145/2785830

Copyright © 2015 ACM

Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall Acceptance Rate: 202 of 906 submissions, 22%
