
A collaborative multimodal handwriting training environment for visually impaired students

Published: 08 December 2008

Abstract

The spatial motor skills used for handwriting are particularly difficult for visually impaired people to develop. These skills are required to sign an aesthetically pleasing and repeatable signature, which is often needed for documents such as legal agreements and job applications. Our multimodal system, which combines haptic guidance, sonification and tactile feedback, is designed to assist in teaching visually impaired students to form letters and, eventually, a signature. As tactile technologies become commonplace, appearing even in mobile phones, our system may also provide useful insight into the use of nonvisual feedback for a variety of applications.




Published In

OZCHI '08: Proceedings of the 20th Australasian Conference on Computer-Human Interaction: Designing for Habitus and Habitat
December 2008, 366 pages
ISBN: 0980306345
DOI: 10.1145/1517744
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. handwriting
  2. haptic guidance
  3. signature
  4. sonification
  5. tactile
  6. visually-impaired

Qualifiers

  • Research-article

Conference

OZCHI '08

Acceptance Rates

OZCHI '08 paper acceptance rate: 28 of 57 submissions, 49%
Overall acceptance rate: 362 of 729 submissions, 50%

Article Metrics

  • Downloads (last 12 months): 2
  • Downloads (last 6 weeks): 0
Reflects downloads up to 03 Mar 2025


Cited By

  • (2024) A Multilingual Handwriting Learning System for Visually Impaired People. IEEE Access 12, 10521--10534. DOI: 10.1109/ACCESS.2024.3353781. Online publication date: 2024.
  • (2023) The Robot Made Us Hear Each Other. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 13--23. DOI: 10.1145/3568162.3576997. Online publication date: 13-Mar-2023.
  • (2021) Fostering Inclusive Activities in Mixed-visual Abilities Classrooms using Social Robots. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 571--573. DOI: 10.1145/3434074.3446356. Online publication date: 8-Mar-2021.
  • (2021) Community Based Robot Design for Classrooms with Mixed Visual Abilities Children. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1--12. DOI: 10.1145/3411764.3445135. Online publication date: 6-May-2021.
  • (2020) Using tabletop robots to promote inclusive classroom experiences. In Proceedings of the Interaction Design and Children Conference, 281--292. DOI: 10.1145/3392063.3394439. Online publication date: 21-Jun-2020.
  • (2010) Improving end-user GUI customization with transclusion. In Proceedings of the Thirty-Third Australasian Conference on Computer Science - Volume 102, 163--172. DOI: 10.5555/1862199.1862217. Online publication date: 1-Jan-2010.
  • (2010) Computer Aided Calligraphy in Haptic Virtual Environment. In Proceedings of the 2010 International Conference on Computational Science and Its Applications, 103--110. DOI: 10.1109/ICCSA.2010.41. Online publication date: 23-Mar-2010.
