ABSTRACT
When evaluating eye tracking algorithms, a recurring issue is which metric to use and which data to compare against. User studies are informative when considering the entire eye tracking system; however, they are often unsatisfactory for evaluating the gaze estimation algorithm in isolation. This is a particular issue when evaluating a system's component parts, such as pupil detection, pupil-to-gaze mapping or head pose estimation.
Instead of user studies, eye tracking algorithms can be evaluated using simulated input video. We describe a computer graphics approach to creating realistic synthetic eye images, using a 3D model of the eye and head and a physically correct rendering technique. By using rendering, we have full control over the parameters of the scene such as the gaze vector or camera position, which allows the calculation of ground truth data, while creating a realistic input for a video-based gaze estimator.
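Because the renderer is given the scene parameters exactly, ground truth follows directly from geometry. As a minimal illustration of this idea (not the paper's implementation; the eye model, angle convention, and all numeric values below are assumptions), the following sketch computes a ground-truth gaze vector from two rotation angles and projects the resulting pupil centre through a pinhole camera to obtain its ground-truth image position:

```python
import numpy as np

def gaze_vector(theta, phi):
    """Unit gaze direction from a horizontal angle theta and a vertical
    angle phi (radians), relative to the camera's -z optical axis."""
    return np.array([
        np.sin(theta) * np.cos(phi),
        np.sin(phi),
        -np.cos(theta) * np.cos(phi),
    ])

def project_point(p, focal_px, centre_px):
    """Pinhole projection of a 3D point in camera coordinates
    (z < 0 in front of the camera) to pixel coordinates."""
    x, y, z = p
    return np.array([centre_px[0] + focal_px * x / -z,
                     centre_px[1] - focal_px * y / -z])

# Scene parameters chosen by the experimenter, so ground truth is known:
eye_centre = np.array([0.0, 0.0, -50.0])  # eyeball centre, mm from camera
eye_radius = 10.0                         # illustrative eyeball radius, mm
g = gaze_vector(np.deg2rad(15), np.deg2rad(-5))
pupil_centre = eye_centre + eye_radius * g
uv = project_point(pupil_centre, focal_px=800.0, centre_px=(320.0, 240.0))
print(uv)  # ground-truth 2D pupil position for this frame
```

A pupil detector run on the rendered frame can then be scored directly against `uv`, and a gaze estimator against `g`, with no manual annotation.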