ABSTRACT
This paper introduces EyamKayo, a first-of-its-kind interactive CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) that uses eye-gaze and facial-expression interactions to better distinguish humans from software bots. Our system generates a sequence of instructions asking the user to follow a controlled sequence of gaze points and produce a controlled sequence of facial expressions. We evaluate user comfort and system usability, and validate the design through usability tests.
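The challenge loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the target labels, the mix of gaze and expression steps, and the verification rule are all assumptions made for the sake of the example.

```python
import random

# Hypothetical instruction vocabularies; the paper's actual targets may differ.
GAZE_TARGETS = ["top-left", "top-right", "bottom-left", "bottom-right", "center"]
EXPRESSIONS = ["smile", "neutral", "surprise", "frown"]

def generate_challenge(n_gaze=3, n_expr=2, seed=None):
    """Build a randomized instruction sequence mixing gaze and expression steps."""
    rng = random.Random(seed)
    steps = [("gaze", rng.choice(GAZE_TARGETS)) for _ in range(n_gaze)]
    steps += [("expression", rng.choice(EXPRESSIONS)) for _ in range(n_expr)]
    rng.shuffle(steps)
    return steps

def verify_response(challenge, observed):
    """Pass only if every observed (kind, value) step matches the challenge in order."""
    return len(observed) == len(challenge) and all(
        o == c for o, c in zip(observed, challenge)
    )
```

In practice the `observed` sequence would come from a gaze tracker and an expression classifier rather than being matched exactly; a real system would compare gaze coordinates and expression labels within tolerances.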