ABSTRACT
We present PneuFetch, a wearable device that uses light haptic cues to help blind and visually impaired (BVI) people fetch nearby objects in unfamiliar environments. Our design generates friendly, non-intrusive, and gentle presses and drags on a BVI user's wrist and forearm to deliver direction and distance cues. As a proof of concept, we describe our PneuFetch wearable prototype, contrast it with prior work, and report a preliminary user study.