Establishing Vibration-Based Tactile Line Profiles for Use in Multimodal Graphics

Abstract
Vibration plays a significant role in how users interact with touchscreens. For many users, vibration affords tactile alerts and other enhancements. For eyes-free users and users with visual impairments, vibration can also serve a more primary role in the user interface, such as indicating streets on maps, conveying information about graphs, or even specifying basic graphics. However, vibration is rarely used in current user interfaces beyond basic cuing. Furthermore, designers and developers who do use vibration more extensively are often unable to determine the exact properties of the vibration signals they implement, due to limitations of out-of-the-box software and hardware. We make two contributions in this work. First, we investigate the contextual properties of touchscreen vibrations and how vibrations can effectively convey traditional embossed elements, such as dashes and dots. To do so, we developed an open source, Android-based library that generates vibrations that are perceptually salient and intuitive, improving upon existing vibration libraries. Second, we conducted a user study with 26 blind or visually impaired users to evaluate and categorize the vibration effects with respect to traditional tactile line profiles. We establish a range of vibration effects that can be reliably generated by our haptic library and that are perceptible and distinguishable by users.
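As a rough illustration of the kind of encoding the abstract describes (this is a hypothetical sketch, not the authors' library), a dash/dot line profile can be represented as the alternating off/on millisecond timing array consumed by Android's `Vibrator#vibrate(long[], int)` and `VibrationEffect.createWaveform(long[], int)`. The durations and the `toWaveform` helper below are illustrative assumptions, not values from the paper:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

/**
 * Hypothetical sketch: encode a tactile line profile (dots and dashes,
 * as on embossed graphics) into the alternating off/on millisecond
 * timing array used by Android's vibration APIs. All durations are
 * assumed for illustration only.
 */
public class LineProfile {
    static final long DOT_MS  = 40;   // short pulse for a dot (assumed)
    static final long DASH_MS = 200;  // long pulse for a dash (assumed)
    static final long GAP_MS  = 100;  // silence between elements (assumed)

    /** profile: e.g. ".-." for dot, dash, dot. */
    public static long[] toWaveform(String profile) {
        List<Long> timings = new ArrayList<>();
        timings.add(0L); // waveform format begins with an "off" duration
        for (int i = 0; i < profile.length(); i++) {
            timings.add(profile.charAt(i) == '-' ? DASH_MS : DOT_MS);
            if (i < profile.length() - 1) {
                timings.add(GAP_MS); // off period between pulses
            }
        }
        long[] out = new long[timings.size()];
        for (int i = 0; i < out.length; i++) out[i] = timings.get(i);
        return out;
    }

    public static void main(String[] args) {
        // ".-." -> [0, 40, 100, 200, 100, 40]
        System.out.println(Arrays.toString(toWaveform(".-.")));
    }
}
```

On a device, such an array could be handed to `VibrationEffect.createWaveform(timings, -1)`; the paper's actual library additionally tunes the signals for perceptual salience, which this sketch does not attempt.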