
Visual search analysis using parametric fixations

Published in: Multimedia Tools and Applications

Abstract

Eye movement analysis has been an important area of research. Analyzing eye movements can help in better understanding human preferences, behavioral patterns, and viewing sequences. Eye movement analysis involves tracking the scanpath, which is the path traversed by the eyes. A scanpath comprises two major components: fixations and saccades. Fixations are the events most commonly studied for understanding behavioral tendencies, whereas saccades have generally been regarded as corrective, largely random eye movements that contribute little to behavioral understanding. A fixation is the sustained viewing of a region or an object in a visual scene for a certain period of time, and it is generally analyzed in terms of associated parameters, most commonly the number of fixations and their durations. Fixation analysis has been applied in many research areas, including scene perception, visual search, marketing, diagnostics, and interactive applications. Visual search is a prominent research area in which a given target object is to be found amid a group of distracters in a displayed scene or image. Distraction can arise from the presence of similar-looking objects, low color variation, or a large number of heterogeneous objects. In this paper, fixations have been analyzed in terms of different fixation parameters during visual search experiments. Three visual search experiments have been conducted in sequence, in order to understand the impact of distractions during visual search on fixation parameters and thereby on eye movements. The identified parameters are 'number of fixations', 'total fixation duration', 'maximum fixation duration', and 'total search time'. Three distractions present in the images, i.e., low chromatic variation, the presence of multiple heterogeneous objects, and high target–distracter similarity, have been analyzed.
Eye movement data of forty-one subjects has been captured using a remote eye-tracking setup. It has been found that images with high similarity between target and non-target objects have a greater impact on fixation parameters, and thus on eye movements. Distractions in the form of low chromatic variation and the presence of multiple heterogeneous objects have an almost similar impact on eye movements. The results indicate that searching in images with high target/non-target similarity has been more difficult for subjects than searching in images with a high heterogeneous component or low chromatic variation. In addition, the maximum fixation duration parameter shows different results on images that have been easy for subjects to search, in comparison to difficult visual searches. The obtained results could support the development and optimization of user-oriented applications that use remote eye-tracking systems.
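The four fixation parameters named in the abstract can be computed per trial once an eye tracker's samples have been segmented into fixation events. The following is a minimal illustrative sketch, not the authors' implementation: the `Fixation` record, field names, and the example timings are all hypothetical, and real pipelines would first apply a fixation-detection filter (e.g. dispersion- or velocity-based) to the raw gaze samples.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Fixation:
    """One detected fixation event (times in milliseconds from trial onset)."""
    start_ms: float
    end_ms: float

    @property
    def duration_ms(self) -> float:
        return self.end_ms - self.start_ms


def fixation_parameters(fixations: List[Fixation],
                        search_end_ms: float) -> Dict[str, float]:
    """Compute the four per-trial parameters studied in the paper.

    `search_end_ms` is the elapsed time from trial onset until the subject
    reports finding the target (the 'total search time').
    """
    durations = [f.duration_ms for f in fixations]
    return {
        "number_of_fixations": len(fixations),
        "total_fixation_duration_ms": sum(durations),
        "maximum_fixation_duration_ms": max(durations, default=0.0),
        "total_search_time_ms": search_end_ms,
    }


# Hypothetical trial: three fixations, target found at t = 1200 ms.
fixs = [Fixation(0, 250), Fixation(300, 900), Fixation(950, 1100)]
params = fixation_parameters(fixs, search_end_ms=1200)
print(params)
```

Under this sketch, a harder search (e.g. high target–distracter similarity) would typically manifest as larger values of all four parameters for the same display.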



Author information

Corresponding author

Correspondence to Mohsina Ishrat.

Ethics declarations

Conflict of interest / competing interests

None.



About this article


Cite this article

Ishrat, M., Abrol, P. Visual search analysis using parametric fixations. Multimed Tools Appl 81, 10007–10022 (2022). https://doi.org/10.1007/s11042-022-12377-2

