DOI: 10.1145/3206343.3206352
Short paper

MovEye: gaze control of video playback

Published: 15 June 2018

Abstract

Several methods of gaze control of video playback have been implemented in the MovEye application. Two versions of MovEye are nearly complete: one for watching online movies from the YouTube service and one for watching movies stored on local drives. We have two goals: a social one, to help people with physical disabilities control and enrich their immediate environment; and a scientific one, to compare the usability of several gaze control methods for video playback for both healthy and disabled users. This paper describes our gaze control applications. Our next step will be conducting accessibility and user experience (UX) tests with both healthy and disabled users. In the long term, this research could lead to the implementation of gaze control in TV sets and other video playback devices.
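The abstract does not detail which gaze control methods MovEye implements, but dwell-time selection is a common baseline for gaze-operated playback controls: a command such as play/pause fires once the gaze has rested inside a button's region for a fixed duration. The sketch below is purely illustrative and not taken from MovEye; the names (`Button`, `dwell_select`) and the 800 ms dwell threshold are assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Button:
    """An on-screen playback control with an axis-aligned hit region (pixels)."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def dwell_select(samples, buttons, dwell_ms=800, sample_ms=50):
    """Fire a button once the gaze stays inside it for dwell_ms.

    samples: iterable of (x, y) gaze points, one every sample_ms.
    Returns the list of button names activated, in order.
    """
    needed = dwell_ms // sample_ms  # consecutive in-region samples required
    events = []
    current, streak = None, 0
    for x, y in samples:
        # Which button, if any, does this gaze sample land on?
        hit = next((b for b in buttons if b.contains(x, y)), None)
        if hit is current:
            streak += 1
        else:
            current, streak = hit, 1
        if hit is not None and streak == needed:
            events.append(hit.name)  # fires exactly once per dwell
    return events
```

With a 50 ms sampling interval, 16 consecutive samples on the same button complete an 800 ms dwell; glancing away resets the counter, which is the usual guard against the "Midas touch" problem of unintended gaze activations.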


Cited By

  • (2024) Usability study of gaze-based control methods in a game with time pressure. Procedia Computer Science 246:C, 473-481. DOI: 10.1016/j.procs.2024.09.427. Online publication date: 1 January 2024.

Published In

COGAIN '18: Proceedings of the Workshop on Communication by Gaze Interaction
June 2018
69 pages
ISBN:9781450357906
DOI:10.1145/3206343
  • General Chairs:
  • Carlos Morimoto,
  • Thies Pfeiffer

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. control of video playback
  2. eye tracking
  3. gaze gestures
  4. gaze-control
  5. motion-disabled people

Qualifiers

  • Short-paper

Conference

ETRA '18
