DOI: 10.1145/2525194.2525304

Eyeboard++: an enhanced eye gaze-based text entry system in Hindi

Published: 24 September 2013

Abstract

Of late, eye gaze has become an important modality for text entry on both large- and small-display digital devices. Despite the many tools that have been developed, issues such as minimizing dwell time and visual search time, improving the accuracy of the composed text, and stabilizing eye-controlled pointer movement remain to be addressed. Moreover, eye-typing interfaces with a large number of keys suffer from problems such as wrong character selection and longer character search times, and certain linguistic characteristics further limit how far dwell time can be reduced in character-by-character eye typing. These issues are especially pronounced for Indian languages because of their script- and language-related complexities. In this paper, we propose EyeBoard++, a gaze-based text entry system for Hindi, an official language of India, which reduces dwell time by introducing word completion and word prediction and, at the same time, reduces visual search time by highlighting the next probable characters. Performance evaluation shows that the proposed interface achieves a text entry rate of 9.63 words per minute on average. By design, the interface fits medium-sized display devices such as tablet PCs and PDAs. The proposed design approach offers a way to deal with the complexity of Indian languages and can be extended to many other languages; the system can also be used by people with motor disabilities.
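The two mechanisms the abstract relies on, word prediction/completion to cut dwell time and next-character highlighting to cut visual search time, together with the words-per-minute measure conventionally used in text-entry studies, can be illustrated with a small prefix-lookup sketch. The Python below is a minimal illustration under assumed simplifications: the toy unigram word-frequency lexicon and the helper names (`suggest_words`, `next_char_candidates`, `words_per_minute`) are ours for illustration and do not reproduce the paper's actual prediction model or interface.

```python
from collections import Counter

# Toy unigram lexicon (word -> corpus frequency); a real system would use a
# much larger Hindi corpus and, typically, a statistical language model.
LEXICON = Counter({"नाम": 150, "नमस्ते": 120, "नदी": 80, "नया": 60, "नमक": 45})

def suggest_words(prefix: str, k: int = 3) -> list:
    """Word completion: the k most frequent lexicon words starting with the prefix."""
    matches = [(w, f) for w, f in LEXICON.items() if w.startswith(prefix)]
    matches.sort(key=lambda wf: wf[1], reverse=True)
    return [w for w, _ in matches[:k]]

def next_char_candidates(prefix: str, k: int = 3) -> list:
    """Next probable characters after the prefix; an eye-typing keyboard
    could highlight these keys to shorten visual search."""
    counts = Counter()
    for word, freq in LEXICON.items():
        if word.startswith(prefix) and len(word) > len(prefix):
            counts[word[len(prefix)]] += freq
    return [c for c, _ in counts.most_common(k)]

def words_per_minute(transcribed: str, seconds: float) -> float:
    """Conventional text-entry rate: (|T| - 1) characters per second,
    scaled to five-character words per minute."""
    return (len(transcribed) - 1) / seconds * (60 / 5)

if __name__ == "__main__":
    print(suggest_words("न"))          # ['नाम', 'नमस्ते', 'नदी']
    print(next_char_candidates("न"))   # ['म', 'ा', 'द']
    print(words_per_minute("नमस्ते दुनिया", 30.0))  # 4.8
```

In an eye-typing layout, the keys returned by `next_char_candidates` would be visually highlighted so the user's gaze can move directly to a likely target instead of scanning the whole keyboard, while the completions from `suggest_words` let a whole word be selected with a single dwell.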


Published In

APCHI '13: Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction
September 2013
420 pages
ISBN: 9781450322539
DOI: 10.1145/2525194
Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 24 September 2013

Qualifiers

  • Research-article

Conference

APCHI '13
Cited By

  • (2025) Could You Hear That? Identifying Marathi Phrases Suitable for Aural Transcription Tasks. Human-Computer Interaction. Design and Research, 10.1007/978-3-031-80829-6_4, 70-103. Online publication date: 14-Feb-2025.
  • (2023) Gaze Tracking for Hands-Free Human Using Deep Reinforcement Learning Approach. Journal of Smart Internet of Things, 10.2478/jsiot-2023-0013, 2023:2, 105-114. Online publication date: 15-Dec-2023.
  • (2020) Multimodal Gaze Interaction for Creative Design. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 10.1145/3313831.3376196, 1-13. Online publication date: 21-Apr-2020.
  • (2019) Design and evaluation of a time adaptive multimodal virtual keyboard. Journal on Multimodal User Interfaces, 10.1007/s12193-019-00293-z, 13:4, 343-361. Online publication date: 8-Feb-2019.
  • (2018) Toward Optimization of Gaze-Controlled Human–Computer Interaction: Application to Hindi Virtual Keyboard for Stroke Patients. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10.1109/TNSRE.2018.2814826, 26:4, 911-922. Online publication date: Apr-2018.
  • (2017) Improving Dwell-Based Gaze Typing with Dynamic, Cascading Dwell Times. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 10.1145/3025453.3025517, 2558-2570. Online publication date: 2-May-2017.
  • (2017) Inclusive Personlization of User Interfaces. Research into Design for Communities, Volume 1, 10.1007/978-981-10-3518-0_26, 295-306. Online publication date: 26-Feb-2017.
  • (2015) A Protocol to Evaluate Virtual Keyboards for Indian Languages. Proceedings of the 7th Indian Conference on Human-Computer Interaction, 10.1145/2835966.2835970, 27-38. Online publication date: 17-Dec-2015.
  • (2014) Eye-gaze Tracking Based Interaction in India. Procedia Computer Science, 10.1016/j.procs.2014.11.010, 39, 59-66. Online publication date: 2014.
  • (2014) New Interfaces. Inclusive Human Machine Interaction for India, 10.1007/978-3-319-06500-7_4, 63-77. Online publication date: 19-Jun-2014.
