Abstract
In industrial environments such as aircraft cockpits and train drivers' cabs, we aimed to acquire the eye-gaze position in real time and synchronize it with all controlled digital screens, so that the machine could respond dynamically to the user's current situation awareness (SA). Wearable eye-tracking glasses provide only the gaze position relative to the captured video, from which we obtained the eye-movement data (2 DOF). A motion capture device provides only position and orientation data, from which we obtained the displacement and angular displacement of the head (6 DOF). We combined these two devices into a novel real-time eye-interaction system that synchronizes the user's visual point on the screens. A spatial transform algorithm was proposed to calculate the visual point on the multiple digital screens. Combined with human factors analysis, the algorithm enables the machine to strengthen its dynamic service capabilities.
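The paper's spatial transform algorithm itself is not reproduced here, but the core operation it implies, combining a head pose (6 DOF, from motion capture) with gaze angles measured in the head frame (2 DOF, from the glasses) and intersecting the resulting gaze ray with a screen plane, can be sketched as follows. All function and parameter names, the yaw/pitch parameterization, and the simple additive composition of head and eye angles (a small-angle approximation; a full treatment would compose rotation matrices) are assumptions for illustration, not the authors' implementation:

```python
import math

def gaze_point_on_screen(head_pos, head_yaw_pitch, gaze_yaw_pitch,
                         screen_origin, screen_normal):
    """Intersect the combined head+eye gaze ray with a screen plane.

    head_pos:       (x, y, z) head position in world coordinates
    head_yaw_pitch: (yaw, pitch) head orientation in radians
    gaze_yaw_pitch: (yaw, pitch) gaze direction in the head frame, radians
    screen_origin:  any point on the screen plane
    screen_normal:  unit normal of the screen plane
    Returns the world-space intersection point, or None if the gaze
    ray is parallel to the screen or the screen is behind the viewer.
    """
    # Compose head and eye angles (small-angle approximation).
    yaw = head_yaw_pitch[0] + gaze_yaw_pitch[0]
    pitch = head_yaw_pitch[1] + gaze_yaw_pitch[1]
    # Gaze ray direction in world coordinates.
    d = (math.cos(pitch) * math.cos(yaw),
         math.cos(pitch) * math.sin(yaw),
         math.sin(pitch))
    # Ray-plane intersection: t = n . (p0 - o) / n . d
    denom = sum(n * di for n, di in zip(screen_normal, d))
    if abs(denom) < 1e-9:
        return None  # gaze parallel to the screen plane
    t = sum(n * (p - o) for n, p, o in
            zip(screen_normal, screen_origin, head_pos)) / denom
    if t < 0:
        return None  # screen is behind the viewer
    return tuple(o + t * di for o, di in zip(head_pos, d))
```

Once the world-space intersection point is known, mapping it to pixel coordinates on the particular screen that was hit is a 2D affine transform within that screen's plane.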
Acknowledgments
This work was supported by the National Natural Science Foundation of China (Grant No. 51575037).
Copyright information
© 2018 Springer International Publishing AG
About this paper
Cite this paper
Bao, H., Fang, W., Guo, B., Wang, P. (2018). Real-Time Eye-Interaction System Developed with Eye Tracking Glasses and Motion Capture. In: Ahram, T., Falcão, C. (eds) Advances in Human Factors in Wearable Technologies and Game Design. AHFE 2017. Advances in Intelligent Systems and Computing, vol 608. Springer, Cham. https://doi.org/10.1007/978-3-319-60639-2_8
DOI: https://doi.org/10.1007/978-3-319-60639-2_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-60638-5
Online ISBN: 978-3-319-60639-2
eBook Packages: Engineering (R0)