Establishing Vibration-Based Tactile Line Profiles for Use in Multimodal Graphics


Abstract

Vibration plays a significant role in the way users interact with touchscreens. For many users, vibration affords tactile alerts and other enhancements. For eyes-free users and users with visual impairments, vibration can also serve a more primary role in the user interface, such as indicating streets on maps, conveying information about graphs, or even specifying basic graphics. However, vibration is rarely used in current user interfaces beyond basic cuing. Furthermore, designers and developers who do use vibration more extensively are often unable to determine the exact properties of the vibration signals they implement, owing to the limitations of out-of-the-box software and hardware. We make two contributions in this work. First, we investigate the contextual properties of touchscreen vibrations and how vibrations can be used to effectively convey traditional, embossed elements, such as dashes and dots. To do so, we developed an open source, Android-based library that generates vibrations that are perceptually salient and intuitive, improving upon existing vibration libraries. Second, we conducted a user study with 26 blind or visually impaired users to evaluate and categorize these vibration effects with respect to traditional tactile line profiles. We have established a range of vibration effects that can be reliably generated by our haptic library and are perceptible and distinguishable by users.
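The abstract describes rendering embossed line profiles, such as dashes and dots, as timed vibration effects. As a minimal illustrative sketch only (not the authors' actual library), the standard Android `Vibrator` API expresses such an effect as an alternating off/on timing array in milliseconds, where the first element is an initial delay. The helper class and method names below are hypothetical:

```java
import java.util.Arrays;

// Hypothetical sketch: build an Android-style vibration timing pattern
// for a dashed tactile line. The array alternates off/on durations (ms),
// starting with an initial delay, matching the convention of
// android.os.Vibrator.vibrate(long[] pattern, int repeat).
public class LineProfile {

    public static long[] dashedPattern(int dashes, long dashMs, long gapMs) {
        long[] pattern = new long[dashes * 2];
        for (int i = 0; i < dashes; i++) {
            pattern[2 * i] = (i == 0) ? 0 : gapMs; // off-time before each dash
            pattern[2 * i + 1] = dashMs;           // on-time for the dash itself
        }
        return pattern;
    }

    public static void main(String[] args) {
        // On a device, this array would be passed to Vibrator.vibrate(pattern, -1)
        // (or wrapped in VibrationEffect.createWaveform on API 26+).
        System.out.println(Arrays.toString(dashedPattern(3, 100, 50)));
        // prints [0, 100, 50, 100, 50, 100]
    }
}
```

A dotted profile would follow the same shape with a shorter on-time; the perceptual question the paper studies is which such durations users can reliably distinguish.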




Published In

ACM Transactions on Applied Perception  Volume 17, Issue 2
April 2020
82 pages
ISSN:1544-3558
EISSN:1544-3965
DOI:10.1145/3399405
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 18 May 2020
Online AM: 07 May 2020
Accepted: 01 February 2020
Revised: 01 January 2020
Received: 01 September 2019
Published in TAP Volume 17, Issue 2


Author Tags

  1. HCI
  2. haptics
  3. perception
  4. touchscreen

Qualifiers

  • Research-article
  • Research
  • Refereed


Cited By

  • (2024) Bridging the Gap of Graphical Information Accessibility in Education With Multimodal Touchscreens Among Students With Blindness and Low Vision. Journal of Visual Impairment & Blindness 117, 6, 453--466. DOI: 10.1177/0145482X231217496. Online publication date: 8-Jan-2024.
  • (2024) What Is the User Experience of Eyes-Free Touch Input with Vibrotactile Feedback Decoupled from the Touchscreen? In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1--8. DOI: 10.1145/3613905.3650804. Online publication date: 11-May-2024.
  • (2024) Spatial Audio-Enhanced Multimodal Graph Rendering for Efficient Data Trend Learning on Touchscreen Devices. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1--12. DOI: 10.1145/3613904.3641959. Online publication date: 11-May-2024.
  • (2024) Automated Translation of UML Class Diagrams. In 2024 International Conference on Artificial Intelligence, Big Data, Computing and Data Communication Systems (icABCD), 1--6. DOI: 10.1109/icABCD62167.2024.10645283. Online publication date: 1-Aug-2024.
  • (2024) The user experience of distal arm-level vibrotactile feedback for interactions with virtual versus physical displays. Virtual Reality 28, 2. DOI: 10.1007/s10055-024-00977-2. Online publication date: 22-Mar-2024.
  • (2023) Wearable Vibration Device to Assist with Ambulation for the Visually Impaired. In Bio-inspired Information and Communications Technologies, 224--236. DOI: 10.1007/978-3-031-43135-7_22. Online publication date: 25-Sep-2023.
  • (2022) Vibration-Based Pattern Password Approach for Visually Impaired People. Computer Systems Science and Engineering 40, 1, 341--356. DOI: 10.32604/csse.2022.018563. Online publication date: 2022.
  • (2022) Tactile Texture Display Combining Vibrotactile and Electrostatic-friction Stimuli: Substantial Effects on Realism and Moderate Effects on Behavioral Responses. ACM Transactions on Applied Perception 19, 4, 1--18. DOI: 10.1145/3539733. Online publication date: 7-Nov-2022.
  • (2022) Measuring the User Experience of Vibrotactile Feedback on the Finger, Wrist, and Forearm for Touch Input on Large Displays. In Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1--7. DOI: 10.1145/3491101.3519704. Online publication date: 27-Apr-2022.
  • (2022) A Comparative Evaluation of Mechanical Vibration and Ultrasonic Vibration on Smartphones in Tactile Code Perception. IEEE Access 10, 41038--41046. DOI: 10.1109/ACCESS.2022.3167526. Online publication date: 2022.
