research-article

Acoustic Strength-based Motion Tracking

Published: 18 December 2020

Abstract

Accurate device motion tracking enables many applications, such as Virtual Reality (VR) and Augmented Reality (AR). To bring these applications into people's daily lives, low-cost acoustic motion tracking methods have been proposed. However, existing acoustic methods all rely on distance estimation: they measure the distance between a speaker and a microphone, and with a speaker or microphone array they obtain multiple distance estimates and thereby achieve multidimensional motion tracking. The weakness of distance-based motion tracking is that it requires a large array to produce accurate results; some systems even require an array larger than 1 m. This limitation prevents existing solutions from being adopted in a single device such as a smart speaker. To solve this problem, we propose the Acoustic Strength-based Angle Tracking (ASAT) system and further implement a motion tracking system based on it. ASAT achieves angle tracking by creating a periodically changing sound field. A device with a microphone senses the periodically changing sound strength within this field. When the device moves, the period of the received sound strength changes, from which we can derive the angle change and thus track the angle. The ASAT-based system achieves a localization accuracy of 5 cm when the distance between the speaker and the microphone is within 3 m.
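The angle-from-period idea can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes the received strength pattern completes exactly one cycle per sweep of the sound field, so an observed period T_obs corresponds to an angle change of 2π(T_obs − T0)/T0, where T0 is the nominal sweep period. Function and variable names here are hypothetical.

```python
import numpy as np

def track_angle(strength, fs, T0):
    """Estimate cumulative angle change from the periodically varying
    received sound strength (sketch of the ASAT idea).

    strength : 1-D array of received sound-strength samples
    fs       : sampling rate in Hz
    T0       : nominal sweep period of the sound field in seconds
    Returns (peak_times, cumulative_angle_estimates) in seconds and radians.
    """
    s = np.asarray(strength, dtype=float)
    # Locate local maxima of the strength signal, refining each peak
    # time with parabolic interpolation over its three nearest samples.
    peaks = []
    for i in range(1, len(s) - 1):
        if s[i - 1] < s[i] >= s[i + 1]:
            denom = s[i - 1] - 2 * s[i] + s[i + 1]
            delta = 0.5 * (s[i - 1] - s[i + 1]) / denom if denom != 0 else 0.0
            peaks.append((i + delta) / fs)
    peaks = np.array(peaks)
    # A period deviating from T0 means the device's angle drifted during
    # that sweep: one strength cycle per sweep gives
    # delta_theta = 2*pi*(T_obs - T0)/T0 per observed period.
    periods = np.diff(peaks)
    dtheta = 2 * np.pi * (periods - T0) / T0
    return peaks, np.cumsum(dtheta)
```

As a sanity check, a device rotating at a constant rate r inside a field sweeping at frequency 1/T0 would see strength s(t) = cos(2πt/T0 − rt); feeding such a synthetic signal to `track_angle` should recover a cumulative angle close to r times the elapsed time between the first and last detected peaks.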



Published in

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 4, Issue 4
December 2020
1356 pages
EISSN: 2474-9567
DOI: 10.1145/3444864

          Copyright © 2020 ACM

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States
