
Gaze assisted voice note taking system

Published: 09 September 2019

Abstract

Note-taking, an active learning strategy, has a long history in the educational setting. Learners take notes for a variety of reasons, such as planning their learning activities, extracting useful information from the learning content, and reflecting on their understanding of the learning material [9]. Research in the educational domain supports the view that note-taking helps learners think, understand, and construct their knowledge, thereby improving their learning performance. Although note-taking is crucial for learning, it is a highly complex activity that requires comprehension and selection of information. It therefore triggers resource-demanding cognitive operations that must be coordinated in rapid succession, making note-taking cognitively effortful [12]. Another drawback is the split-attention effect induced when learners divide their attention between the act of writing notes and attending to the learning content [10]. Attention is withdrawn from the global context in favour of writing down selected information, which can increase cognitive load and adversely affect learning performance.

References

[1]
Yomna Abdelrahman, Eduardo Velloso, Tilman Dingler, Albrecht Schmidt, and Frank Vetere. 2017. Cognitive Heat: Exploring the Usage of Thermal Imaging to Unobtrusively Estimate Cognitive Load. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1 (09 2017), 1--20.
[2]
O. Augereau, H. Fujiyoshi, and K. Kise. 2016. Towards an automated estimation of English skill via TOEIC score based on reading analysis. In 2016 23rd International Conference on Pattern Recognition (ICPR). 1285--1290.
[3]
Shiwei Cheng, Zhiqiang Sun, Lingyun Sun, Kirsten Yee, and Anind Dey. 2015. Gaze-Based Annotations for Reading Comprehension. 1569--1572.
[4]
Andreas Dengel, Ralf Biedert, Jörn Hees, and Georg Buscher. 2012. A Robust Realtime Reading-Skimming Classifier. In Eye Tracking Research and Applications Symposium (ETRA).
[5]
Joachim Grabowski. 2005. Speaking, writing, and memory span performance: Replicating the Bourdin and Fayol results on cognitive load in German children and adults. (01 2005).
[6]
Stephanos Ioannou, Sjoerd Ebisch, Tiziana Aureli, Daniela Bafunno, Helene Alexi Ioannides, Daniela Cardone, Barbara Manini, Gian Luca Romani, Vittorio Gallese, and Arcangelo Merla. 2013. The autonomic signature of guilt in children: a thermal infrared imaging study. PloS one 8, 11 (2013), e79440.
[7]
Stephanos Ioannou, Vittorio Gallese, and Arcangelo Merla. 2014. Thermal infrared imaging in psychophysiology: potentialities and limits. Psychophysiology 51, 10 (2014), 951--963.
[8]
Renée S. Jansen, Daniel Lakens, and Wijnand A. IJsselsteijn. 2017. An integrative review of the cognitive costs and benefits of note-taking. Educational Research Review 22 (2017), 223--233.
[9]
Lotfollah Karimi. 2011. Note-Taking in the Mirror of Literature: Theory and Practice. World Applied Sciences Journal 15 (01 2011).
[10]
Rae Lynne Mancilla. [n. d.]. Getting Smart about Split Attention. Computer-Assisted Foreign Language Teaching and Learning ([n. d.]), 210--229.
[11]
Ayano Okoso, Kai Kunze, and Koichi Kise. 2014. Implicit Gaze based Annotations to Support Second Language Learning. UbiComp 2014 - Adjunct Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 283--286.
[12]
Annie Piolat, Thierry Olive, and Ronald T. Kellogg. 2005. Cognitive effort during note taking. Applied Cognitive Psychology 19, 3 (2005), 291--312.
[13]
Dan R. Olsen, Trent Taufer, and Jerry Fails. 2004. ScreenCrayons: annotating anything. 165--174.
[14]
Benjamin Tag, Ryan Mannschreck, Kazunori Sugiura, George Chernyshov, Naohisa Ohta, and Kai Kunze. 2017. Facial Thermography for Attention Tracking on Smart Eyewear: An Initial Study. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '17). ACM, New York, NY, USA, 2959--2966.
[15]
A. Vinciarelli, N. Suditu, and M. Pantic. 2009. Implicit Human-Centered Tagging. 2009 IEEE International Conference on Multimedia and Expo (2009).
[16]
K. Yoshimura, K. Kise, and K. Kunze. 2015. The eye as the window of the language ability: Estimation of English skills by analyzing eye movement while reading documents. In 2015 13th International Conference on Document Analysis and Recognition (ICDAR). 251--255.

Published In

UbiComp/ISWC '19 Adjunct: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers
September 2019
1234 pages
ISBN:9781450368698
DOI:10.1145/3341162

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. eye tracking
  2. implicit tagging
  3. machine learning
  4. note-taking
