Corneal Imaging System: Environment from Eyes

International Journal of Computer Vision

Abstract

This paper provides a comprehensive analysis of exactly what visual information about the world is embedded within a single image of an eye. It turns out that the cornea of an eye and a camera viewing the eye form a catadioptric imaging system. We refer to this as a corneal imaging system. Unlike a typical catadioptric system, a corneal one is flexible in that the reflector (cornea) is not rigidly attached to the camera. Using a geometric model of the cornea based on anatomical studies, its 3D location and orientation can be estimated from a single image of the eye. Once this is done, a wide-angle view of the environment of the person can be obtained from the image. In addition, we can compute the projection of the environment onto the retina with its center aligned with the gaze direction. This foveated retinal image reveals what the person is looking at. We present a detailed analysis of the characteristics of the corneal imaging system including field of view, resolution and locus of viewpoints. When both eyes of a person are captured in an image, we have a stereo corneal imaging system. We analyze the epipolar geometry of this stereo system and show how it can be used to compute 3D structure. The framework we present in this paper for interpreting eye images is passive and non-invasive. It has direct implications for several fields including visual recognition, human-machine interfaces, computer graphics and human affect studies.
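
The pose-estimation step mentioned in the abstract can be made concrete with a small numerical sketch. The snippet below is a minimal illustration rather than the paper's implementation: it assumes the corneal limbus is approximated by a circle of known anatomical radius (taken here as 5.6 mm) and that an ellipse has already been fitted to the limbus in an image from a calibrated camera, so that the ellipse's scale and foreshortening give the cornea's distance and orientation under weak perspective. The function name, the constant, and the input format are illustrative assumptions.

```python
import numpy as np

# Assumed anatomical constant: the corneal limbus is close to a circle of
# roughly 5.6 mm radius (illustrative value, not taken from the paper).
LIMBUS_RADIUS_MM = 5.6

def cornea_pose_from_limbus(ellipse, focal_px):
    """Rough 3D pose of the cornea from a limbus ellipse in a single image.

    ellipse  : (cx, cy, a, b, phi) -- ellipse fitted to the limbus in the
               image, with semi-major axis a and semi-minor axis b in pixels.
    focal_px : camera focal length in pixels (assumed known from calibration).

    Returns (distance_mm, tilt_rad): distance from the camera to the limbus
    center, and the angle between the eye's optic axis and the camera's
    viewing direction, under a weak-perspective approximation.
    """
    cx, cy, a, b, phi = ellipse
    # Weak perspective: the semi-major axis is a nearly unforeshortened
    # projection of the limbus radius, which fixes the depth.
    distance_mm = focal_px * LIMBUS_RADIUS_MM / a
    # Foreshortening along the minor axis encodes how far the limbus plane is
    # tilted away from the image plane (the sign of the tilt stays ambiguous).
    tilt_rad = np.arccos(np.clip(b / a, 0.0, 1.0))
    return distance_mm, tilt_rad

# Example: a limbus with a 60 px semi-major axis seen through a 1200 px focal
# length lies roughly 112 mm from the camera, tilted by about 37 degrees.
print(cornea_pose_from_limbus((320.0, 240.0, 60.0, 48.0, 0.0), focal_px=1200.0))
```

Once such a pose is available, pixels inside the limbus can be back-projected onto the corneal surface model and reflected about the surface normals to sample the surrounding environment; the sketch above only shows the geometric idea behind recovering pose from one eye image, not the full reflection and resampling pipeline.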

References

  • Baker, S. and Nayar, S.K. 1999. A Theory of Single-Viewpoint Catadioptric Image Formation. IJCV, 35(2):1–22.

  • Baker, T.Y. 1943. Ray tracing through non-spherical surfaces. Proc. of The Royal Society of London, 55:361–364.

  • Blanz, V. and Vetter, T. 1999. A Morphable Model for the Synthesis of 3D Faces. In ACM SIGGRAPH 99.

  • Bolt, R.A. 1982. Eyes at the Interface. In ACM CHI, pages 360–362.

  • Burkhard, D.G. and Shealy, D.L. 1973. Flux Density for Ray Propagation in Geometrical Optics. JOSA, 63(3):299–304.

  • Cornbleet, S. 1984. Microwave and Optical Ray Geometry. John Wiley and Sons.

  • Daugman, J.G. 1993. High Confidence Visual Recognition of Persons by a Test of Statistical Independence. IEEE TPAMI, 15(11):1148–1161.

  • Davson, H. 1990. Physiology of the Eye. Macmillan, 5th edition.

  • Debevec, P., Hawkins, T., Tchou, C., Duiker, H-P. and Sarokin, W. 2000. Acquiring the Reflectance Field of a Human Face. In ACM SIGGRAPH 00, pp. 145–156.

  • Debevec, P. 1998. Rendering Synthetic Objects into Real Scenes: Bridging Traditional and Image-based Graphics with Global Illumination and High Dynamic Range Photography. In ACM SIGGRAPH 98, pp. 189–198.

  • Ebisawa, Y. 1998. Improved Video-Based Eye-Gaze Detection Method. IEEE Trans. on Instrumentation and Measurement, 47(4):948–955.

  • Ekman, P. and Rosenberg, E. L. editors 1997. What the Face Reveals. Oxford University Press, New York.

  • Ekman, P. 1993. Facial Expression of Emotion. American Psychologist, 48:384–392.

  • Flom, L. and Safir, A. 1987. Iris Recognition System. US patent 4,641,349.

  • Halstead, M.A., Barsky, B.A., Klein, S.A., and Mandell, R.B. 1996. Reconstructing Curved Surfaces from Specular Reflection Patterns Using Spline Surface Fitting of Normals. In ACM SIGGRAPH 96, pp. 335–342.

  • Hutchinson, T.E., White, K.P., Reichert, K.C., and Frey, L.A. 1989. Human-computer Interaction using Eye-gaze Input. IEEE TSMC, 19:1527–1533.

  • Ikeuchi, K. and Suehiro, T. 1994. Toward an Assembly Plan from Observation, Part 1: Task Recognition with Polyhedral Objects. IEEE Trans. Robotics and Automation, 10(3):368–385.

  • Jacob, R. 1990. What You Look at is What You Get: Eye Movement-Based Interaction Techniques. In ACM CHI, pp. 11–18.

  • Kang, S.B. and Ikeuchi, K. 1997. Toward Automatic Robot Instruction from Perception–Mapping Human Grasps to Manipulator Grasps. IEEE Trans. on Robotics and Automation, 13(1).

  • Kaufman, P.L. and Alm, A. editors 2003. Adler’s Physiology of the Eye: Clinical Application. Mosby, 10th edition.

  • Marschner, S.R. and Greenberg, D.P. 1997. Inverse Lighting for Photography. In IS&T/SID Color Imaging Conference, pp. 262–265.

  • Nayar, S.K. 1988. Sphereo: Recovering depth using a single camera and two specular spheres. In SPIE: Optics, Illumination and Image Sensing for Machine Vision II.

  • Nene, S.A. and Nayar, S.K. 1998. Stereo Using Mirrors. In IEEE ICCV 98, pp. 1087–1094.

  • Nishino, K. and Nayar, S.K. 2004. Eyes for Relighting. ACM Trans. on Graphics (Proceedings of SIGGRAPH 2004), 23(3):704–711.

  • Nishino, K. and Nayar, S.K. 2004. The World in Eyes. In IEEE Conference on Computer Vision and Pattern Recognition, volume I, pp. 444–451.

  • Ohno, T., Mukawa, N. and Yoshikawa, A. 2002. FreeGaze: a gaze tracking system for everyday gaze interaction. In Proc. of the Symposium on ETRA, pp. 125–132.

  • Pajdla, T., Svoboda, T. and Hlaváč, V. 2000. Epipolar geometry of central panoramic cameras. In Panoramic Vision: Sensors, Theory, and Applications. Springer Verlag.

  • Stein, C.P. 1995. Accurate Internal Camera Calibration Using Rotation, with Analysis of Sources of Errors. In ICCV, pp. 230–236.

  • Stiefelhagen, R., Yang, J., and Waibel, A. 1997. A Model-Based Gaze-Tracking System. International Journal of Artificial Intelligence Tools, 6(2):193–209.

  • Swaminathan, R., Grossberg, M.D., and Nayar, S.K. 2001. Caustics of Catadioptric Cameras. In IEEE ICCV 01, vol. II, pp. 2–9.

  • Tan, K-H., Kriegman, D.J., and Ahuja, N. 2002. Appearance-based Eye Gaze Estimation. In WACV, pp. 191–195.

  • Tomkins, S. S. 1962. Affect, imagery, consciousness. Springer, New York.

  • Tsumura, N., Dang, M.N., Makino, T., and Miyake, Y. 2003. Estimating the Directions to Light Sources Using Images of Eye for Reconstructing 3D Human Face. In Proc. of IS&T/SID’s Eleventh Color Imaging Conference, pp. 77–81.

  • von Helmholtz, H. 1909. Physiologic Optics, volume 1 and 2. Voss, Hamburg, Germany, third edition.

  • Wang, J-G., Sung, E., and Venkateswarlu, R. 2003. Eye Gaze Estimation from a Single Image of One Eye. In IEEE ICCV 03, pp. 136–143.

  • Wasserman, S. and Faust, K. 1994. Social Network Analysis: Methods and Applications. Cambridge University Press.

  • Westheimer, G. 1980. Medical Physiology, volume 1, chapter 16 The eye, pages 481–503. The C.V. Mosby Company.

  • Wolff, L.B. and Boult, T.E. 1991. Constraining Object Features Using a Polarization Reflectance Model. IEEE TPAMI, 13(7):635–657.

  • Wolff, L.B. 1990. Polarization-based Material Classification from Specular Reflection. IEEE TPAMI, 12(11):1059–1071.

  • Xu, L-Q., Machin, D. and Sheppard, P. 1998. A Novel Approach to Real-time Non-intrusive Gaze Finding. In BMVC, pp. 58–67.

  • Young, L.R. and Sheena, D. 1975. Survey of Eye Movement Recording Methods. Behavior Research Methods and Instrumentation, 7(5):397–429.

Author information

Corresponding author

Correspondence to Ko Nishino.

Additional information

This research was conducted while the first author was affiliated with Columbia University. A shorter version of this paper appeared in (Nishino and Nayar, 2004).

About this article

Cite this article

Nishino, K., Nayar, S.K. Corneal Imaging System: Environment from Eyes. Int J Comput Vision 70, 23–40 (2006). https://doi.org/10.1007/s11263-006-6274-9
