
Real-Time Eye-Interaction System Developed with Eye Tracking Glasses and Motion Capture

  • Conference paper
  • Published in: Advances in Human Factors in Wearable Technologies and Game Design (AHFE 2017)
  • Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 608)

Abstract

In industrial environments such as aircraft cockpits and train driver's cabs, we wished to acquire the eye-tracking position in real time and synchronize it with all controlled digital screens, so that the machine could respond dynamically to the user's current situation awareness (SA). Wearable eye-tracking glasses provide only the gaze position relative to the captured video, from which we obtained the eye-movement data (2 DOF), while the motion-capture device provides only position and orientation data, from which we obtained the displacement and angular displacement of the head (6 DOF). We combined these two devices into a novel real-time eye-interaction system that synchronizes the user's visual point on the screens. A spatial transform algorithm was proposed to calculate the visual point on the multiple digital screens. Together with human-factors analysis, the algorithm allows the machine to strengthen its dynamic service abilities.
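The core of such a spatial transform can be sketched as a ray-plane intersection: the motion-capture system supplies the head pose (position plus orientation, 6 DOF), the glasses supply a gaze direction in the head frame (2 DOF), and the visual point is where the resulting world-frame gaze ray meets a screen plane. The function and parameter names below are hypothetical illustrations, not the paper's actual implementation:

```python
import numpy as np

def gaze_point_on_screen(head_pos, head_rot, gaze_dir_local,
                         screen_origin, screen_normal):
    """Intersect a gaze ray with a screen plane (illustrative sketch).

    head_pos        : (3,) head position from motion capture, world frame.
    head_rot        : (3, 3) head orientation as a rotation matrix, world frame.
    gaze_dir_local  : (3,) gaze direction from the eye tracker, expressed
                      in the glasses' (head-fixed) frame.
    screen_origin   : (3,) any point on the screen plane, world frame.
    screen_normal   : (3,) unit normal of the screen plane, world frame.

    Returns the 3-D intersection point, or None when the gaze is parallel
    to the screen or the screen lies behind the viewer.
    """
    # Rotate the locally measured gaze direction into the world frame.
    d = head_rot @ gaze_dir_local
    denom = d @ screen_normal
    if abs(denom) < 1e-9:
        return None  # gaze ray parallel to the screen plane
    # Ray parameter t such that head_pos + t*d lies on the plane.
    t = ((screen_origin - head_pos) @ screen_normal) / denom
    if t < 0:
        return None  # intersection behind the viewer
    return head_pos + t * d
```

For multiple screens, the same intersection would be evaluated against each screen's plane and the hit inside a screen's rectangular bounds selected; converting the 3-D point into that screen's 2-D pixel coordinates is then a change of basis within the screen plane.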



Acknowledgments

This paper is supported by the National Natural Science Foundation of China (Grant No. 51575037).

Corresponding author

Weining Fang


Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Bao, H., Fang, W., Guo, B., Wang, P. (2018). Real-Time Eye-Interaction System Developed with Eye Tracking Glasses and Motion Capture. In: Ahram, T., Falcão, C. (eds) Advances in Human Factors in Wearable Technologies and Game Design. AHFE 2017. Advances in Intelligent Systems and Computing, vol 608. Springer, Cham. https://doi.org/10.1007/978-3-319-60639-2_8

  • DOI: https://doi.org/10.1007/978-3-319-60639-2_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60638-5

  • Online ISBN: 978-3-319-60639-2
