ABSTRACT
In this study, we propose DualBreath, an input method that provides eight commands using nasal and mouth breathing. DualBreath treats intentional timing shifts in the nasal and mouth breathing rhythms as input commands. Combined with other input methods such as gaze, DualBreath enables hands-free interaction, including graphical user interface operations.
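The abstract does not specify how the eight commands are encoded. Purely as an illustration, the sketch below assumes one plausible scheme: a command is the combination of breathing channel (nasal/mouth), breath phase (inhale/exhale), and the direction of the intentional timing shift (early/late) relative to the user's natural rhythm, giving 2 × 2 × 2 = 8 commands. The event model, `threshold` parameter, and `classify` function are all hypothetical, not the authors' actual method.

```python
# Hypothetical sketch of a DualBreath-style command encoding.
# ASSUMPTION: command = (channel, phase, shift direction) -> 8 commands.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BreathEvent:
    channel: str   # "nasal" or "mouth"
    phase: str     # "inhale" or "exhale"
    onset: float   # onset time of this breath, in seconds

def classify(event: BreathEvent, expected_onset: float,
             threshold: float = 0.3) -> Optional[str]:
    """Map an intentionally early/late breath onset to one of 8 commands.

    `expected_onset` is where the breath would fall in the user's natural
    rhythm; a deviation larger than `threshold` seconds is treated as an
    intentional shift, anything smaller as ordinary breathing.
    """
    shift = event.onset - expected_onset
    if abs(shift) < threshold:
        return None  # natural breathing: no command issued
    direction = "early" if shift < 0 else "late"
    return f"{event.channel}-{event.phase}-{direction}"

# Enumerate the full command set under this assumed encoding:
commands = {
    classify(BreathEvent(c, p, 1.0 + d), expected_onset=1.0)
    for c in ("nasal", "mouth")
    for p in ("inhale", "exhale")
    for d in (-0.5, 0.5)
}
```

Under these assumptions, a breath that lands within the threshold of its expected onset produces no command, so ordinary breathing is ignored and only deliberate shifts trigger input.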