ABSTRACT
When playing musical instruments, deaf and hard-of-hearing (DHH) people typically sense the music through the vibrations transmitted by the instrument or through the movements of their own bodies while performing. Sensory substitution devices now exist that convert sound into light and vibration to support DHH people’s musical activities. However, these devices require specialized hardware, and their intended use assumes that standard musical instruments are available. Hence, a significant gap remains between DHH people and the enjoyment of musical performance. To address this gap, this study identifies the gestures end users prefer when using a smartphone to emulate playing a selected musical instrument. Our gesture elicitation study covers 10 instrument types. We present the results together with a new taxonomy of musical instrument gestures. These findings will inform the design of gesture-based instrument interfaces that enable DHH people to enjoy musical performance more directly.
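Gesture elicitation studies such as this one typically quantify consensus with an agreement measure. As a minimal sketch (not the paper's own analysis code), the following computes the agreement rate AR for one referent in the formalization of Vatavu and Wobbrock (CHI 2015); the referent name and gesture labels are hypothetical examples.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for a single referent (Vatavu & Wobbrock, 2015).

    Identical proposals are grouped; AR is the number of agreeing ordered
    pairs, sum(|P_i| * (|P_i| - 1)), divided by all pairs, |P| * (|P| - 1).
    Returns 0.0 when fewer than two proposals exist (no pairs to compare).
    """
    n = len(proposals)
    if n < 2:
        return 0.0
    counts = Counter(proposals)
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Hypothetical example: 10 participants propose gestures for a "strum" referent.
props = ["swipe"] * 6 + ["shake"] * 3 + ["tap"]
print(round(agreement_rate(props), 3))  # (6*5 + 3*2) / (10*9) = 0.4
```

AR ranges from 0 (every participant proposed a different gesture) to 1 (unanimous agreement), which makes consensus comparable across referents and across studies.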
Index Terms: Designing Gestures for Digital Musical Instruments: Gesture Elicitation Study with Deaf and Hard of Hearing People