Abstract
Eye movement perimetry (EMP) is a paradigm developed to assess the visual field without requiring the patient to suppress natural eye movements during the test. Unlike standard automated perimetry (SAP), where the patient’s responses are recorded with a button press, EMP uses the natural eye movement reflex as the response during the evaluation. The reliability of EMP depends on correctly determining whether a stimulus was seen, which in turn depends on an adequate analysis of the eye movement data. However, many studies in EMP have focused on characterizing eye movements, and only a few authors have documented their methods for determining whether a peripheral stimulus was seen during the test. Furthermore, many of them use static thresholds to perform the classification, and it is not clear how these threshold values were obtained. Motivated by this, we developed a threshold test for visual field assessment based on FASTPAC C24-2 and EMP. Our method uses two machine learning techniques to classify whether a stimulus was seen: (1) cascaded K-Means and Bayesian classifiers (KBC) and (2) an Artificial Neural Network (ANN). The method was validated with twenty healthy participants (13 women and 7 men) aged 19–43 years (µ = 26 ± 5 years), each of whom performed both an EMP test and an SAP emulation test. Results were compared against gaze trajectory annotations made by an expert, yielding accuracies between 96.8% and 98.9% for the KBC and ANN classifiers, and between 90.5% and 92% for the SAP emulation.
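To illustrate the general idea of a cascaded K-Means plus Bayesian (KBC-style) seen/not-seen decision, the following is a minimal sketch, not the authors' implementation. The feature names (saccade latency, final gaze-to-stimulus distance, peak velocity) and the toy data are assumptions for demonstration only; the actual features, cascade structure, and training data used in the paper may differ.

```python
# Minimal sketch (assumed, illustrative): classify "seen" vs. "not seen" stimuli
# from hypothetical per-stimulus gaze features using K-Means clustering cascaded
# with a Gaussian naive Bayes classifier.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Hypothetical features per stimulus presentation:
# [saccade latency (ms), final gaze-to-stimulus distance (deg), peak velocity (deg/s)]
X = rng.normal(loc=[350, 2.0, 300], scale=[80, 1.5, 60], size=(200, 3))
y = (X[:, 1] < 2.5).astype(int)  # toy labels: 1 = "seen", 0 = "not seen"

# Stage 1: unsupervised K-Means gives a coarse partition of the feature space.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
cluster_id = kmeans.predict(X).reshape(-1, 1)

# Stage 2: a Bayesian (Gaussian naive Bayes) classifier makes the final decision,
# here using the original features augmented with the cluster assignment.
clf = GaussianNB().fit(np.hstack([X, cluster_id]), y)

def classify(trial_features: np.ndarray) -> int:
    """Return 1 if the stimulus is classified as seen, 0 otherwise."""
    f = trial_features.reshape(1, -1)
    c = kmeans.predict(f).reshape(-1, 1)
    return int(clf.predict(np.hstack([f, c]))[0])

print(classify(np.array([320.0, 1.2, 310.0])))  # e.g. 1 ("seen")
```

The ANN variant described in the abstract could replace stage 2 with a small feed-forward network trained on the same features; the comparison against expert gaze-trajectory annotations then plays the role of ground truth for computing accuracy.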
Data Availability
Not applicable.
Code Availability
Not applicable.
Acknowledgements
E.A. Martínez-González acknowledges CONACYT-México for scholarship 712805.
Author information
Contributions
Not applicable.
Ethics declarations
Ethics approval
The study presented in this article was conducted in accordance with the ethical standards of the institution (Universidad Autónoma de San Luis Potosí) and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Consent to participate
Signed informed consent was obtained from all the participants included in the study.
Consent for publication
All participants gave their consent for the data acquired during the experiment to be used in this research, with participant privacy and data confidentiality maintained.
Conflicts of interest/Competing interests
The authors have no conflicts of interest to declare that are relevant to the content of this article.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Martínez-González, E.A., Alba, A., Arce-Santana, E. et al. A novel system for the automatic reconstruction of visual field based on eye tracking and machine learning. Multimed Tools Appl 82, 27193–27215 (2023). https://doi.org/10.1007/s11042-023-14464-4
DOI: https://doi.org/10.1007/s11042-023-14464-4