
Establishing Vibration-Based Tactile Line Profiles for Use in Multimodal Graphics

Published: 18 May 2020

Abstract

Vibration plays a significant role in the way users interact with touchscreens. For many users, vibration affords tactile alerts and other enhancements. For eyes-free users and users with visual impairments, vibration can also serve a more primary role in the user interface, such as indicating streets on maps, conveying information about graphs, or even specifying basic graphics. However, vibration is rarely used in current user interfaces beyond basic cuing. Furthermore, designers and developers who do use vibration more extensively are often unable to determine the exact properties of the vibration signals they are implementing, due to out-of-the-box software and hardware limitations. We make two contributions in this work. First, we investigate the contextual properties of touchscreen vibrations and how vibrations can be used to effectively convey traditional, embossed elements, such as dashes and dots. To do so, we developed an open-source, Android-based library to generate vibrations that are perceptually salient and intuitive, improving upon existing vibration libraries. Second, we conducted a user study with 26 blind or visually impaired users to evaluate and categorize the effects with respect to traditional tactile line profiles. We have established a range of vibration effects that can be reliably generated by our haptic library and are perceptible and distinguishable by users.
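To make the library's role concrete, the sketch below shows how a dashed tactile line profile might be approximated with the stock Android Vibrator/VibrationEffect API, the platform layer on which a library like ours is built. This is a minimal illustration, not our library's actual implementation: the class name DashedLineDemo, the method playDashedProfile, and the 60 ms pulse / 40 ms gap durations are hypothetical choices for exposition, not the validated parameters from the user study.

```java
import android.content.Context;
import android.os.VibrationEffect;
import android.os.Vibrator;

// Minimal sketch of a "dashed" tactile line profile using Android's stock
// Vibrator API (API level 26+; requires android.permission.VIBRATE).
// Names and timing values are hypothetical, chosen only to illustrate the
// waveform mechanism, not the profiles evaluated in the study.
public final class DashedLineDemo {

    // Segment durations in milliseconds, paired element-wise with
    // amplitudes (0 = motor off, 1-255 = motor on). Together these render
    // three 60 ms "dashes" separated by 40 ms gaps.
    private static final long[] TIMINGS = {0, 60, 40, 60, 40, 60};
    private static final int[] AMPLITUDES = {0, 255, 0, 255, 0, 255};

    public static void playDashedProfile(Context context) {
        Vibrator vibrator =
                (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        if (vibrator == null || !vibrator.hasVibrator()) {
            return; // No vibration motor on this device.
        }
        // Devices without amplitude control play nonzero amplitudes at a
        // device-default strength: one of the out-of-the-box hardware
        // limitations noted in the abstract.
        VibrationEffect effect =
                VibrationEffect.createWaveform(TIMINGS, AMPLITUDES, -1); // -1: play once
        vibrator.vibrate(effect);
    }
}
```

In an interface of the kind studied here, such a call would plausibly be triggered as the user's finger crosses a rendered line segment, with the waveform repeated (a repeat index of 0 instead of -1) for continuous traversal and cancelled when the finger leaves the line.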



Published in

ACM Transactions on Applied Perception, Volume 17, Issue 2 (April 2020), 82 pages
ISSN: 1544-3558 | EISSN: 1544-3965
DOI: 10.1145/3399405
Copyright © 2020 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 September 2019
• Revised: 1 January 2020
• Accepted: 1 February 2020
• Online AM: 7 May 2020
• Published: 18 May 2020
