DOI: 10.1145/3030024.3038266
Poster

EyamKayo: Interactive Gaze and Facial Expression Captcha

Published: 07 March 2017

ABSTRACT

This paper introduces EyamKayo, a first-of-its-kind interactive CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) that uses eye-gaze- and facial-expression-based human interactions to better distinguish humans from software robots. Our system generates a sequence of instructions, asking the user to follow a controlled sequence of gaze points and produce a controlled sequence of facial expressions. We evaluate user comfort and system usability, and validate the approach through usability tests.
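The paper itself provides no implementation; purely as an illustration of the challenge-response idea described in the abstract, the sketch below generates a random interleaved sequence of gaze and expression instructions and checks observed user responses against it. All names, label sets, coordinates, and tolerances here are hypothetical assumptions, not the authors' system, which relies on an actual eye tracker and facial-expression recognizer.

```python
# Illustrative sketch only: a hypothetical EyamKayo-style challenge generator
# and verifier. Expression labels, gaze targets, and thresholds are assumed
# for demonstration and do not reproduce the authors' pipeline.
import math
import random
from dataclasses import dataclass

EXPRESSIONS = ["neutral", "smile", "surprise"]            # assumed label set
GAZE_POINTS = [(0.1, 0.1), (0.9, 0.1), (0.5, 0.5),
               (0.1, 0.9), (0.9, 0.9)]                     # normalized screen coordinates


@dataclass
class Step:
    kind: str       # "gaze" or "expression"
    target: object  # (x, y) tuple for gaze, label string for expression


def generate_challenge(n_steps=6, seed=None):
    """Build a random, interleaved sequence of gaze and expression instructions."""
    rng = random.Random(seed)
    steps = []
    for _ in range(n_steps):
        if rng.random() < 0.5:
            steps.append(Step("gaze", rng.choice(GAZE_POINTS)))
        else:
            steps.append(Step("expression", rng.choice(EXPRESSIONS)))
    return steps


def verify(challenge, responses, gaze_tolerance=0.15):
    """Check observed responses against the instructed sequence.

    A gaze response is an (x, y) fixation estimate from the eye tracker;
    an expression response is the recognizer's predicted label.
    """
    if len(responses) != len(challenge):
        return False
    for step, resp in zip(challenge, responses):
        if step.kind == "gaze":
            if math.dist(step.target, resp) > gaze_tolerance:
                return False
        elif resp != step.target:
            return False
    return True


if __name__ == "__main__":
    for i, step in enumerate(generate_challenge(seed=42), 1):
        print(f"Step {i}: {step.kind} -> {step.target}")
```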


    • Published in

      IUI '17 Companion: Companion Proceedings of the 22nd International Conference on Intelligent User Interfaces
      March 2017
      246 pages
      ISBN:9781450348935
      DOI:10.1145/3030024

      Copyright © 2017 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 7 March 2017


      Qualifiers

      • poster

      Acceptance Rates

IUI '17 Companion Paper Acceptance Rate: 63 of 272 submissions, 23%. Overall Acceptance Rate: 746 of 2,811 submissions, 27%.
