DOI: 10.1145/3472749.3474732 · UIST Conference Proceedings · Research article

LipNotif: Use of Lips as a Non-Contact Tactile Notification Interface Based on Ultrasonic Tactile Presentation

Published: 12 October 2021

ABSTRACT

We propose LipNotif, a non-contact tactile notification system that uses airborne ultrasound tactile presentation to lips. Lips are suitable for non-contact tactile notifications because they have high tactile sensitivity comparable to the palms, are less occupied in daily life, and are constantly exposed outward. LipNotif uses tactile patterns to intuitively convey information to users, allowing them to receive notifications using only their lips, without sight, hearing, or hands. We developed a prototype system that automatically recognizes the position of the lips and presents non-contact tactile sensations. Two experiments were conducted to evaluate the feasibility of LipNotif. In the first experiment, we found that directional information can be notified to the lips with an average accuracy of ±11.1° in the 120° horizontal range. In the second experiment, we could elicit significantly different affective responses by changing the stimulus intensity. The experimental results indicated that LipNotif is practical for conveying directions, emotions, and combinations of them. LipNotif can be applied for various purposes, such as notifications during work, calling in the waiting room, and tactile feedback in automotive user interfaces.


Supplemental Material

- p13-video_figure.mp4 (mp4, 75.9 MB)
- p13-video_preview.mp4 (mp4, 43.1 MB)
- p13-talk.mp4 (talk video and captions, mp4, 149.9 MB)

