A multimodal communication with a haptic glove: on the fusion of speech and deixis over a raised line drawing

ABSTRACT
Mathematics instruction and discourse typically involve two modes of communication: speech and graphical presentation. For the communication to remain situated, dynamic synchrony must be maintained between the speech and the dynamic focus of attention in the graphics. Sighted individuals use vision for two purposes: access to the graphical material and awareness of embodied behavior. This awareness of embodiment keeps the communication situated with respect to both the visual material and the speech. Our goal is to help blind students access such instruction and communication. We employ the typical approach of sensory replacement for the missing visual sense. Haptic fingertip reading can replace access to the visual material. For the embodied portion of the communication, we want to make the blind student aware of the deictic gestures the teacher performs over the graphic in conjunction with speech. We propose pairing haptic gloves with computer-vision-based tracking to help blind students maintain reading focus on a raised-line representation of a graphical presentation to which the instructor points while speaking. In this initial phase of our research, we conducted three experiments whose results show that: 1) the gloves convey a sense of direction; 2) the gloves do not interfere with fingertip reading; 3) a person can navigate with the help of this system while listening to a story; and 4) it is possible to fuse the information received from both modes. We discuss these findings in this paper.
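The guidance loop described above, in which vision-based tracking steers the reader's fingertip toward the instructor's deictic target via directional glove cues, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the four-actuator (up/down/left/right) layout, and the dead-zone threshold are all assumptions for the sketch; the (x, y) positions are assumed to come from a vision tracker such as a pyramidal Lucas-Kanade feature tracker.

```python
import math

def guidance_cue(fingertip, pointer, deadzone=10.0):
    """Choose which directional actuator on a haptic glove to fire.

    fingertip, pointer: (x, y) image coordinates from the vision tracker
    (image y grows downward). Returns "up", "down", "left", or "right",
    or None once the fingertip is within `deadzone` pixels of the target.
    All names and the four-direction scheme are illustrative assumptions.
    """
    dx = pointer[0] - fingertip[0]
    dy = pointer[1] - fingertip[1]
    if math.hypot(dx, dy) < deadzone:
        return None  # on target: stop vibrating
    # Cue along the dominant axis of the offset vector.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

For example, a pointer 60 px to the right and 10 px below the fingertip yields a "right" cue; as the offset shrinks below the dead zone the cue stops, signaling that reading focus and deixis are synchronized.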