
RimSense: Enabling Touch-based Interaction on Eyeglass Rim Using Piezoelectric Sensors

Published: 12 January 2024

Abstract

Interaction modes for smart eyewear have attracted significant research attention. While most commercial devices adopt a touch panel on the front of the temple for interaction, this paper identifies a drawback of that placement: the touch panel and the display lie on non-parallel planes, which disrupts the direct mapping between gestures and the manipulated objects on the display. This paper therefore proposes RimSense, a proof-of-concept design for smart eyewear that introduces an alternative interaction space - touch gestures on the eyewear rim. RimSense leverages piezoelectric (PZT) transducers to turn the eyeglass rim into a touch-sensitive surface. When a user touches the rim, the resulting change in the eyeglasses' structural signal manifests as a change in the channel frequency response (CFR), allowing RimSense to recognize the executed touch gesture from the collected CFR patterns. Technically, we employ a buffered chirp as the probe signal to meet the sensing-granularity and noise-resistance requirements. We also present a deep-learning-based gesture recognition framework tailored for fine-grained time-sequence prediction, further integrated with a Finite-State Machine (FSM) algorithm for event-level prediction so that the interaction accommodates gestures of varying durations. We implement a functional eyewear prototype with two commercial PZT transducers. RimSense recognizes eight touch gestures on the eyeglass rim and simultaneously estimates gesture durations, allowing gestures of varying lengths to serve as distinct inputs. We evaluate RimSense on 30 subjects and show that it senses the eight gestures plus an additional negative class with an F1-score of 0.95 and a relative duration-estimation error of 11%. We further run the system in real time and conduct a user study with 14 subjects to assess the practicability of RimSense through interactions with two demo applications. The user study demonstrates RimSense's good recognition performance, high usability, learnability and enjoyability. Interviews with the subjects further provide valuable insights for future eyewear design.
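
To make the sensing pipeline described above concrete, the following is a minimal sketch of how a buffered chirp probe could be turned into a sequence of CFR snapshots. This is not the authors' implementation: the sampling rate, chirp band and frame length are assumed values, and the CFR is estimated by simple spectral division of each received frame by the known probe.

```python
# Hypothetical sketch, not the authors' code: estimating a channel frequency
# response (CFR) from a buffered (back-to-back) chirp probe.
import numpy as np
from scipy.signal import chirp

FS = 48_000                 # assumed sampling rate (Hz)
F0, F1 = 18_000, 22_000     # assumed chirp band (Hz); the paper's band may differ
FRAME_SEC = 0.02            # assumed probe length: a 20 ms chirp repeated continuously

t = np.arange(int(FS * FRAME_SEC)) / FS
probe = chirp(t, f0=F0, f1=F1, t1=FRAME_SEC, method="linear")

def estimate_cfr(rx_frame: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """One CFR snapshot: received spectrum divided by the known probe spectrum."""
    rx_spec = np.fft.rfft(rx_frame, n=len(probe))
    tx_spec = np.fft.rfft(probe)
    return rx_spec / (tx_spec + eps)

def cfr_sequence(rx_stream: np.ndarray) -> np.ndarray:
    """Slice the received PZT signal into chirp-length frames and stack |CFR|.
    A finger on the rim alters the structural path between the transducers,
    so a touch shows up as a deviation across consecutive CFR snapshots."""
    n = len(probe)
    frames = rx_stream[: len(rx_stream) // n * n].reshape(-1, n)
    return np.stack([np.abs(estimate_cfr(f)) for f in frames])
```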

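The event-level step mentioned in the abstract, an FSM on top of frame-level predictions so that gestures of varying durations become distinct inputs, might look roughly like the sketch below. The state names, debounce threshold and negative-class label are illustrative assumptions, not the paper's parameters.

```python
# Hypothetical sketch of the event-level step: a small finite-state machine
# that collapses per-frame gesture predictions into gesture events with durations.
from dataclasses import dataclass

NEGATIVE = "none"    # the additional negative (no-gesture) class
MIN_FRAMES = 3       # assumed debounce length: ignore gestures shorter than this
IDLE, ACTIVE = 0, 1

@dataclass
class GestureEvent:
    label: str
    start: int   # first frame index
    end: int     # one past the last frame index (duration = end - start)

def frames_to_events(frame_labels: list) -> list:
    """Turn a frame-level label sequence into debounced gesture events,
    so gestures of different lengths can serve as distinct inputs."""
    events, state, start, current = [], IDLE, 0, NEGATIVE
    for i, lab in enumerate(list(frame_labels) + [NEGATIVE]):  # sentinel flush
        if state == IDLE and lab != NEGATIVE:
            state, start, current = ACTIVE, i, lab
        elif state == ACTIVE and lab != current:
            if i - start >= MIN_FRAMES:
                events.append(GestureEvent(current, start, i))
            state = IDLE
            if lab != NEGATIVE:
                state, start, current = ACTIVE, i, lab
    return events
```

For example, frames_to_events(['none', 'tap', 'tap', 'tap', 'none']) would yield a single three-frame 'tap' event, while an isolated one-frame prediction would be discarded as noise.
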
Supplementary Material

xie (xie.zip)
Supplemental movie, appendix, image and software files for RimSense: Enabling Touch-based Interaction on Eyeglass Rim Using Piezoelectric Sensors

Cited By

  • (2024) EyeGesener: Eye Gesture Listener for Smart Glasses Interaction Using Acoustic Sensing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8, 3 (2024), 1-28. https://doi.org/10.1145/3678541. Online publication date: 9 Sep 2024.
  • (2024) Functional Now, Wearable Later: Examining the Design Practices of Wearable Technologists. Proceedings of the 2024 ACM International Symposium on Wearable Computers (2024), 71-81. https://doi.org/10.1145/3675095.3676615. Online publication date: 5 Oct 2024.

Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 7, Issue 4
December 2023
1613 pages
EISSN: 2474-9567
DOI: 10.1145/3640795
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 12 January 2024
Published in IMWUT Volume 7, Issue 4

Author Tags

  1. eyewear
  2. interaction
  3. piezoelectric sensor
  4. touch gesture

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Hong Kong RGC
  • Shenzhen Science and Technology Program

Article Metrics

  • Downloads (last 12 months): 516
  • Downloads (last 6 weeks): 32
Reflects downloads up to 20 Jan 2025
