Abstract
Eye movement analysis is an important area of research. Analyzing eye movements can improve our understanding of human preferences, behavioral patterns, and viewing sequences. Eye movement analysis involves tracking the scanpath, the path traversed by the eyes, which comprises two major components: fixations and saccades. Fixations are the events most commonly studied for understanding behavioral tendencies, whereas saccades have generally been regarded as corrective, random eye movements that contribute little to behavioral understanding. A fixation is the sustained viewing of a region or object in a visual scene for a certain period of time, and it is generally analyzed in terms of associated parameters, most commonly the number of fixations and their durations. Fixation analysis features in many research areas, including scene perception, visual search, marketing, diagnostics, and interactive applications. Visual search is a prominent research area in which a given target object must be found amid a group of distracters in a displayed scene or image. Distraction may arise from the presence of similar-looking objects, low color variation, or a large number of heterogeneous objects. In this paper, fixations are analyzed in terms of different fixation parameters during visual search experiments. Three visual search experiments were conducted in sequence to understand the impact of distractions during visual search on fixation parameters and thereby on eye movements. The identified parameters are 'number of fixations', 'total fixation duration', 'maximum fixation duration', and 'total search time'. Three distractions present in the images are analyzed: low chromatic variation, the presence of multiple heterogeneous objects, and high target-distracter similarity.
Eye movement data from forty-one subjects were captured using a remote eye-tracking setup. Images with high similarity between target and non-target objects were found to have the greatest impact on fixation parameters and thus on eye movements, whereas distractions in the form of low chromatic variation and the presence of multiple heterogeneous objects had an almost similar impact. The results indicate that searching in images with high target and non-target object similarity was more difficult for subjects than searching in images with a high heterogeneous component or low chromatic variation. In addition, the maximum fixation duration differed between images that were easy for subjects to search and those that made visual search difficult. The obtained results could support the development and optimization of user-oriented applications that use remote eye-tracking systems.
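As an illustration only (not the authors' code), the four fixation parameters named in the abstract can be computed per trial from a list of detected fixations. The data layout here, fixations as (start, end) timestamp pairs in milliseconds, is an assumption made for this sketch:

```python
# Illustrative sketch: computing the abstract's four fixation parameters
# for one visual search trial. Each fixation is assumed to be a
# (start_ms, end_ms) pair; trial bounds mark stimulus onset and target found.

def fixation_parameters(fixations, trial_start_ms, trial_end_ms):
    """Return the four fixation parameters for one trial."""
    durations = [end - start for start, end in fixations]
    return {
        "number_of_fixations": len(fixations),
        "total_fixation_duration": sum(durations),
        "maximum_fixation_duration": max(durations, default=0),
        # Total search time: from stimulus onset until the target is found.
        "total_search_time": trial_end_ms - trial_start_ms,
    }

# Example: three fixations within a 1500 ms trial.
params = fixation_parameters([(0, 250), (400, 900), (1000, 1300)], 0, 1500)
print(params)
```

In practice, fixations would first be segmented from raw gaze samples by a detection algorithm (e.g. dispersion- or velocity-based), a step omitted here for brevity.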
Ethics declarations
Conflict of interest / Competing interests
None.
Cite this article
Ishrat, M., Abrol, P. Visual search analysis using parametric fixations. Multimed Tools Appl 81, 10007–10022 (2022). https://doi.org/10.1007/s11042-022-12377-2