
Embedded intention scripts representation and real-time interpretation metrics extraction methodology with gaze annotation on visual content

Published in: Multimedia Tools and Applications

Abstract

This paper proposes a system platform capable of analyzing and measuring, in real time, the degree of similarity and difference between the gaze behaviors of multiple users and the intention of the content provider. In the proposed platform, a video content incorporates an intention flow script, which describes the intended sequence and duration of the information embedded in the visual content. The video content also incorporates an intention weight script that assigns importance values to individual object information, so that the significance of the gaze behavior can be assessed. A set of evaluation metrics and their usage are defined, which can be used to evaluate how closely gaze behavior follows, or deviates from, the intention. The proposed method can be applied in many domains, such as education and commerce, where information contents are usually designed with a specific intent. We verified the proposed method by incorporating an intention script into a sequence of educational image frames and analyzing the gaze behavior of multiple users.
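To make the idea concrete, the sketch below illustrates one plausible shape for the two scripts and a weighted closeness metric. It is not the paper's actual representation or metric definition: the `IntentionEntry` structure, the `closeness_score` function, and the 0.1 s sampling interval are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class IntentionEntry:
    object_id: str   # object annotated in the visual content
    start: float     # intended onset time (seconds) from the flow script
    duration: float  # intended gazing duration (seconds)
    weight: float    # importance value from the intention weight script

def closeness_score(intentions, gaze_samples, dt=0.1):
    """Weighted fraction of the intended gaze time that the user's gaze
    actually spent on each intended object (1.0 = fully matched)."""
    matched = 0.0
    total = 0.0
    for entry in intentions:
        end = entry.start + entry.duration
        # count gaze samples on the intended object inside its interval
        hits = sum(1 for t, obj in gaze_samples
                   if entry.start <= t < end and obj == entry.object_id)
        matched += entry.weight * hits * dt
        total += entry.weight * entry.duration
    return matched / total if total else 0.0

# Example: one intended object for 1 s; the user gazes at it half the time
intentions = [IntentionEntry("obj_A", 0.0, 1.0, 2.0)]
gaze = [(t / 10, "obj_A" if t < 5 else "obj_B") for t in range(10)]
print(round(closeness_score(intentions, gaze), 2))  # → 0.5
```

A deviation metric could be defined symmetrically as the weighted gaze time spent outside the intended objects; the weights make misses on high-importance objects count more heavily than misses on incidental ones.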



Acknowledgments

This research was supported by the ‘Cross-Ministry Giga KOREA Project’ of the Ministry of Science, ICT and Future Planning, Republic of Korea (ROK) [GK15P0100, Development of Tele-Experience Service SW Platform based on Giga Media].

Author information

Correspondence to Nammee Moon.

About this article

Cite this article

Oh, JM., Hong, S. & Moon, N. Embedded intention scripts representation and real-time interpretation metrics extraction methodology with gaze annotation on visual content. Multimed Tools Appl 75, 7271–7291 (2016). https://doi.org/10.1007/s11042-015-2644-z
