Abstract
This paper proposes a system platform capable of analyzing and measuring, in real time, the degree of similarity and difference between the gaze behaviors of multiple users and the intention of the content provider. In the proposed platform, a video content item incorporates an intention flow script, which describes the intended sequence and duration of the information embedded in the visual content. The video content also incorporates an intention weight script that assigns importance values to individual object information, so that the significance of the gaze behavior can be assessed. This paper defines a set of evaluation metrics and their usage for quantifying how closely gaze behavior follows, or deviates from, the provider's intention. The proposed method applies to many domains, such as education and commerce, where information content is usually designed with a specific intent. We verify the proposed method by embedding an intention script into a sequence of educational image frames and analyzing the gaze behavior of multiple users.
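To make the idea concrete, the following is a minimal sketch of the two scripts and a closeness metric. All structures and names here (`IntentionEntry`, `closeness_score`, the weighted-overlap formula) are illustrative assumptions, not the paper's actual script format or metric definitions, which are given in the full text.

```python
# Hypothetical sketch: an intention flow script lists which object the
# provider intends the viewer to gaze at during each time interval, and an
# intention weight script assigns an importance value to each object.
# A simple closeness metric is then the weighted fraction of intended
# targets the user actually fixated on during their intervals.

from dataclasses import dataclass

@dataclass
class IntentionEntry:
    start: float     # interval start, seconds
    end: float       # interval end, seconds
    object_id: str   # intended gaze target during [start, end)

def closeness_score(flow, weights, fixations):
    """Weighted fraction of intended gaze targets the user fixated on.

    flow:      list of IntentionEntry (the intention flow script)
    weights:   dict object_id -> importance (the intention weight script)
    fixations: list of (timestamp, object_id) gaze samples for one user
    """
    matched = 0.0
    total = 0.0
    for entry in flow:
        w = weights.get(entry.object_id, 0.0)
        total += w
        hit = any(entry.start <= t < entry.end and obj == entry.object_id
                  for t, obj in fixations)
        if hit:
            matched += w
    return matched / total if total else 0.0

flow = [IntentionEntry(0.0, 2.0, "title"), IntentionEntry(2.0, 5.0, "diagram")]
weights = {"title": 1.0, "diagram": 3.0}
fixations = [(0.5, "title"), (1.2, "title"), (3.0, "caption")]
print(closeness_score(flow, weights, fixations))  # 1.0 / 4.0 = 0.25
```

Under this toy weighting, missing the heavily weighted "diagram" interval dominates the score, which is the behavioral effect the intention weight script is meant to capture.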
Acknowledgments
This research was supported by the ‘Cross-Ministry Giga KOREA Project’ of the Ministry of Science, ICT and Future Planning, Republic of Korea (ROK) [GK15P0100, Development of Tele-Experience Service SW Platform based on Giga Media].
Cite this article
Oh, JM., Hong, S. & Moon, N. Embedded intention scripts representation and real-time interpretation metrics extraction methodology with gaze annotation on visual content. Multimed Tools Appl 75, 7271–7291 (2016). https://doi.org/10.1007/s11042-015-2644-z