Abstract
Image-recognition Human Interaction Proof (HIP) schemes are widely used security mechanisms through which service providers determine whether a human user, and not malicious software, is interacting with their system. Motivated by recent research underlining the need for user-centered HIP design, this paper examines, within the framework of an accredited cognitive style theory (Field Dependence-Independence, FD-I), whether human cognitive differences in visual information processing affect users’ visual behavior when interacting with an image-recognition HIP challenge. To do so, we conducted an eye tracking study (n = 46) in which users solved an image-recognition HIP challenge. Analysis of users’ interactions and eye gaze data revealed differences in visual behavior and interaction between Holistic and Analytic users within image-recognition HIP tasks. The findings underpin the added value of considering users’ cognitive processing differences in the design of adaptive and adaptable HIP security schemes.
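As a rough illustration of the kind of eye-gaze analysis such a study involves, the sketch below aggregates per-user fixation logs into simple metrics (fixation count and mean fixation duration) that could then be compared between Holistic and Analytic groups. All data, field names, and thresholds here are hypothetical and not taken from the paper; in FD-I research, fewer and longer fixations are often read as more global (holistic) processing, while many short fixations suggest more local, analytic scanning.

```python
# Hypothetical sketch: summarizing eye-gaze fixation data per user so that
# aggregate metrics can be compared between cognitive-style groups.
# The data and the "duration_ms" field are illustrative assumptions.

from statistics import mean

def summarize(fixations):
    """Return fixation count and mean fixation duration (ms) for one user."""
    durations = [f["duration_ms"] for f in fixations]
    return {"count": len(durations), "mean_dur": mean(durations)}

# Illustrative fixation logs for two users (durations in milliseconds).
holistic_user = [{"duration_ms": d} for d in (180, 210, 450, 390)]
analytic_user = [{"duration_ms": d} for d in (120, 140, 160, 150, 130, 145)]

h = summarize(holistic_user)
a = summarize(analytic_user)

# Fewer, longer fixations vs. many short fixations.
print(h)  # {'count': 4, 'mean_dur': 307.5}
print(a)
```

In a real analysis, such per-user summaries would feed a statistical comparison between the two groups rather than a side-by-side printout.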
Acknowledgements
This research has been partially funded by the EU Horizon 2020 Grant 826278 “Securing Medical Data in Smart Patient-Centric Healthcare Systems” (Serums), and the Research and Innovation Foundation (Project DiversePass: COMPLEMENTARY/0916/0182).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Leonidou, P., Constantinides, A., Belk, M., Fidas, C., Pitsillides, A. (2021). Eye Gaze and Interaction Differences of Holistic Versus Analytic Users in Image-Recognition Human Interaction Proof Schemes. In: Moallem, A. (eds) HCI for Cybersecurity, Privacy and Trust. HCII 2021. Lecture Notes in Computer Science(), vol 12788. Springer, Cham. https://doi.org/10.1007/978-3-030-77392-2_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-77391-5
Online ISBN: 978-3-030-77392-2
eBook Packages: Computer Science, Computer Science (R0)