ABSTRACT
Gaze tracking is a common technique for studying user interaction and is increasingly used as an input modality. In this regard, computer-vision-based systems provide a promising low-cost realization of gaze tracking on mobile devices. This paper complements related work focused on algorithmic design by conducting two user studies that aim to i) independently evaluate EyeTab as a promising gaze tracking approach and ii) provide the first independent, use-case-driven evaluation of its applicability in mobile scenarios. Our evaluation elucidates the current state of mobile computer-vision-based gaze tracking and aims to pave the way for improved algorithms. To further foster this development, we release our source data to the public as a reference database.
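Approaches such as EyeTab build on gradient-based eye-centre localisation (Timm and Barth, cited below): the pupil centre is the point whose displacement vectors best align with the image gradients around the dark, circular pupil. The following is a minimal NumPy sketch of that objective on a synthetic eye image; it is an illustration of the general technique, not the authors' implementation, and the thresholds and image are chosen for demonstration only.

```python
import numpy as np

def eye_centre_by_gradients(img):
    """Locate a dark, circular eye centre via a gradient objective.

    Score each candidate centre c by the mean squared dot product
    between the unit displacement d_i (from c to pixel i) and the unit
    intensity gradient g_i at pixel i; the radially symmetric gradients
    around a dark pupil make the true centre score highest.
    """
    gy, gx = np.gradient(img.astype(float))      # row- and column-wise gradients
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > 0.3 * mag.max())   # keep strong (edge) gradients only
    gxn = gx[ys, xs] / mag[ys, xs]
    gyn = gy[ys, xs] / mag[ys, xs]

    best_score, centre = -1.0, (0, 0)
    h, w = img.shape
    for cy in range(h):
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            valid = norm > 0                     # skip the candidate pixel itself
            dot = (dx[valid] * gxn[valid] + dy[valid] * gyn[valid]) / norm[valid]
            score = np.mean(dot ** 2)
            if score > best_score:
                best_score, centre = score, (cx, cy)
    return centre

# Synthetic eye image: a dark disc ("pupil") on a bright background.
size, true_centre, radius = 40, (23, 17), 6
yy, xx = np.mgrid[0:size, 0:size]
pupil = (xx - true_centre[0]) ** 2 + (yy - true_centre[1]) ** 2 <= radius ** 2
img = np.where(pupil, 30.0, 200.0)

print(eye_centre_by_gradients(img))              # estimated (x, y) pupil centre
```

The brute-force search over all candidate pixels is quadratic and only practical for small eye patches; production trackers restrict the search to detected eye regions and weight candidates by darkness.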
REFERENCES
- EyetrackingDB. http://eyetrackingdb.github.io/ or http://eyetrackingdb.ohohlfeld.com.
- Gaze Tracking Framework. https://github.com/eyetrackingDB/GazeTrackingFramework.
- NormMaker. https://github.com/eyetrackingDB/NormMaker.
- Andrienko, G., Andrienko, N., Burch, M., and Weiskopf, D. Visual analytics methodology for eye movement studies. IEEE Transactions on Visualization and Computer Graphics 18, 12 (Dec 2012), 2889--2898.
- Brown, A., Evans, M., Jay, C., Glancy, M., Jones, R., and Harper, S. HCI over multiple screens. In CHI Extended Abstracts (2014).
- Bulling, A., and Gellersen, H. Toward mobile eye-based human-computer interaction. IEEE Pervasive Computing 9, 4 (Oct. 2010), 8--12.
- Bulling, A., Roggen, D., and Tröster, G. Wearable EOG Goggles: Eye-based interaction in everyday environments. In CHI Extended Abstracts (2009).
- Drewes, H., De Luca, A., and Schmidt, A. Eye-gaze interaction for mobile phones. In Mobility (2007).
- Heim, S., Pape-Neumann, J., van Ermingen-Marbach, M., Brinkhaus, M., and Grande, M. Shared vs. specific brain activation changes in dyslexia after training of phonology, attention, or reading. Brain Structure and Function (2014), 1--17.
- Hillen, R., Günther, T., Kohlen, C., Eckers, C., van Ermingen-Marbach, M., Sass, K., Scharke, W., Vollmar, J., Radach, R., and Heim, S. Identifying brain systems for gaze orienting during reading: fMRI investigation of the Landolt paradigm. Frontiers in Human Neuroscience 7 (2013), 384.
- Holland, C., Garza, A., Kurtova, E., Cruz, J., and Komogortsev, O. Usability evaluation of eye tracking on an unmodified common tablet. In CHI Extended Abstracts (2013).
- Holland, C., and Komogortsev, O. Eye tracking on unmodified common tablets: Challenges and solutions. In Symposium on Eye-Tracking Research & Applications (2012).
- Hume, T. EyeLike - OpenCV based webcam gaze tracker.
- Hutzler, F., and Wimmer, H. Eye movements of dyslexic children when reading in a regular orthography. Brain Lang 89, 1 (2004), 235--242.
- Ishimaru, S., Kunze, K., Utsumi, Y., Iwamura, M., and Kise, K. Where are you looking at? - feature-based eye tracking on unmodified tablets. In ACPR (2013).
- Kiefer, P., Giannopoulos, I., Kremer, D., Schlieder, C., and Raubal, M. Starting to get bored: An outdoor eye tracking study of tourists exploring a city panorama. In Symposium on Eye-Tracking Research & Applications (2014).
- Kinnunen, T., Sedlak, F., and Bednarik, R. Towards task-independent person authentication using eye movement signals. In Symposium on Eye-Tracking Research & Applications (2010).
- Klische, A. Leseschwächen gezielt beheben. PhD thesis, Ludwig-Maximilians-Universität München, December 2006. In German.
- Kunze, K., Utsumi, Y., Shiga, Y., Kise, K., and Bulling, A. I know what you are reading: Recognition of document types using mobile eye tracking. In International Symposium on Wearable Computers (2013).
- Pape-Neumann, J., van Ermingen-Marbach, M., Verhalen, N., Heim, S., and Grande, M. Rapid automatized naming, processing speed, and reading fluency. Sprache Stimme Gehör 39, 01 (2015), 30--35. In German.
- Prieto, L. P., Wen, Y., Caballero, D., Sharma, K., and Dillenbourg, P. Studying teacher cognitive load in multi-tabletop classrooms using mobile eye-tracking. In ACM Conference on Interactive Tabletops and Surfaces (2014).
- Repscher, S., Grande, M., Heim, S., van Ermingen, M., and Pape-Neumann, J. Developing parallelised word lists for a repeated testing of dyslectic children. Sprache Stimme Gehör 36, 01 (2012), 33--39. In German.
- Schneps, M. H., Thomson, J. M., Sonnert, G., Pomplun, M., Chen, C., and Heffner-Wong, A. Shorter lines facilitate reading in those who struggle. PLoS ONE 8, 8 (2013), e71161.
- Seshadrinathan, K., Soundararajan, R., Bovik, A. C., and Cormack, L. K. Study of subjective and objective quality assessment of video. Trans. Img. Proc. 19, 6 (June 2010), 1427--1441.
- Soleymani, M., Lichtenauer, J., Pun, T., and Pantic, M. A multimodal database for affect recognition and implicit tagging. IEEE Transactions on Affective Computing 3, 1 (2012), 42--55.
- Timm, F., and Barth, E. Accurate eye centre localisation by means of gradients. In VISAPP (2011).
- Turner, J., Bulling, A., and Gellersen, H. Extending the visual field of a head-mounted eye tracker for pervasive eye-based interaction. In Symposium on Eye-Tracking Research & Applications (2012).
- Vaitukaitis, V., and Bulling, A. Eye gesture recognition on portable devices. In ACM UbiComp (2012).
- Valenti, R., and Gevers, T. Accurate eye center location and tracking using isophote curvature. In CVPR (2008).
- van Ermingen-Marbach, M., Verhalen, N., Grande, M., Heim, S., Mayer, A., and Pape-Neumann, J. Standards for rapid automatised naming performances in normal reading children at the age of 9--11. Sprache Stimme Gehör 38, 04 (2014), e28--e32. In German.
- Wood, E. Gaze tracking for commodity portable devices. Master's thesis, Gonville and Caius College - University of Cambridge, 2013.
- Wood, E., and Bulling, A. EyeTab source code.
- Wood, E., and Bulling, A. EyeTab: Model-based gaze estimation on unmodified tablet computers. In Symposium on Eye-Tracking Research & Applications (2014).
Index Terms
- On the Applicability of Computer Vision based Gaze Tracking in Mobile Scenarios