ABSTRACT
We propose LipNotif, a non-contact tactile notification system that presents airborne ultrasound tactile stimuli to the lips. The lips are well suited to non-contact tactile notification: they have tactile sensitivity comparable to that of the palms, are less occupied in daily life, and are constantly exposed outward. LipNotif conveys information intuitively through tactile patterns, allowing users to receive notifications with their lips alone, without using sight, hearing, or hands. We developed a prototype system that automatically recognizes the position of the lips and presents non-contact tactile sensations to them. We conducted two experiments to evaluate the feasibility of LipNotif. In the first, directional information was conveyed to the lips with an average accuracy of ±11.1° over a 120° horizontal range. In the second, significantly different affective responses were elicited by varying the stimulus intensity. These results indicate that LipNotif is practical for conveying directions, emotions, and combinations of the two. LipNotif can be applied for various purposes, such as notifications during work, calling a person in a waiting room, and tactile feedback in automotive user interfaces.
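The prototype's pipeline, as the abstract describes it, is to track the lip position in camera images and steer the ultrasound focal point to it. The following is an illustrative sketch only, not the authors' implementation: the 2D landmark coordinates would in practice come from a face tracker such as dlib or OpenFace, and the pixel-to-workspace mapping assumes a simple pre-calibrated scale. All function names and values here are hypothetical.

```python
# Sketch: turn detected lip landmarks into a focal-point target for an
# ultrasound phased array. Landmarks are mocked; a real system would
# obtain them from a face tracker (e.g. dlib's 68-point model, in which
# points 48-67 outline the lips).

def lip_center(landmarks):
    """Centroid of 2D lip landmark points (x, y) in pixels."""
    xs = [p[0] for p in landmarks]
    ys = [p[1] for p in landmarks]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def pixel_to_workspace(pt, scale_mm_per_px, origin_px):
    """Map a pixel coordinate into millimetres in the array's workspace,
    assuming a fixed calibrated scale and image origin (hypothetical)."""
    return ((pt[0] - origin_px[0]) * scale_mm_per_px,
            (pt[1] - origin_px[1]) * scale_mm_per_px)

# Four mock lip landmarks centred on image coordinate (320, 240)
mock_lips = [(310, 235), (330, 235), (310, 245), (330, 245)]
center = lip_center(mock_lips)                        # (320.0, 240.0)
focus_mm = pixel_to_workspace(center, 0.5, (320, 240))  # (0.0, 0.0)
```

A real controller would also need the lip's depth (e.g. from a depth camera) to place the focal point in 3D, and would update the focus continuously as the head moves.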