DOI: 10.1145/3544794.3558460 · UbiComp/ISWC Conference Proceedings · Short Paper · Public Access

Virtual IMU Data Augmentation by Spring-Joint Model for Motion Exercises Recognition without Using Real Data

Published: 27 December 2022

Abstract

A conventional motion exercise recognition system tracks only designated motion types, so users cannot customize the system to their personal needs. Virtual IMU data offer a new opportunity to reduce the cost of collecting training datasets and to design activity recognition systems flexibly from online resources. To better support a user-customized motion exercise recognition system built on virtual IMU data, this paper proposes a virtual IMU sensor module with a spring-joint model that augments the virtual acceleration signal extracted from a limited set of online 2D videos. The original virtual acceleration signal is extended with data drawn from the different acceleration distributions generated by the spring-joint model and used to train a motion exercise recognition system. The proposed method builds a classifier for three motions from limited video resources, achieving an average accuracy of 85.5% on real motion data from seven individuals.
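The implementation details are not reproduced on this page, but the spring-joint augmentation described above can be illustrated with a minimal sketch. The Python snippet below assumes the spring-joint model acts as a mass-spring-damper coupling a virtual accelerometer to a joint trajectory estimated from 2D video; the names spring_joint_augment and augment_set and the parameters fs, k, c, and m are illustrative assumptions, not values taken from the paper. Sweeping the stiffness and damping produces acceleration signals with different distributions, which is the augmentation effect the abstract describes.

import numpy as np

def spring_joint_augment(joint_pos, fs=30.0, k=200.0, c=5.0, m=1.0):
    # joint_pos: (T, 3) array of joint positions estimated from video (hypothetical input format).
    # fs: pose sampling rate in Hz; k, c, m: spring stiffness, damping, and virtual sensor mass (assumed values).
    # Returns a (T, 3) virtual acceleration signal from a mass-spring-damper ("spring-joint") model.
    dt = 1.0 / fs
    x = joint_pos[0].astype(float)       # virtual sensor position, initialized at the joint
    v = np.zeros(3)                      # virtual sensor velocity
    acc = np.empty((len(joint_pos), 3))
    for t, p in enumerate(joint_pos):
        a = (-k * (x - p) - c * v) / m   # restoring force toward the tracked joint, plus damping
        v = v + a * dt                   # semi-implicit Euler integration
        x = x + v * dt
        acc[t] = a                       # virtual accelerometer sample
    return acc

# Sweeping (k, c) yields signals with different acceleration distributions, which can be
# pooled with the original virtual signal as augmented training data.
def augment_set(joint_pos, params=((150.0, 4.0), (200.0, 5.0), (300.0, 8.0))):
    return [spring_joint_augment(joint_pos, k=k, c=c) for k, c in params]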



    Published In

    ISWC '22: Proceedings of the 2022 ACM International Symposium on Wearable Computers
    September 2022
    141 pages
    ISBN:9781450394246
    DOI:10.1145/3544794


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Data Augmentation
    2. Motion Exercises Recognition
    3. Virtual IMU

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited


    Conference

    UbiComp/ISWC '22

    Acceptance Rates

    Overall Acceptance Rate 38 of 196 submissions, 19%


    Cited By

• (2024) PressInPose: Integrating Pressure and Inertial Sensors for Full-Body Pose Estimation in Activities. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(4), 1–28. DOI: 10.1145/3699773. Online publication date: 21-Nov-2024.
    • (2024) IMUGPT 2.0: Language-Based Cross Modality Transfer for Sensor-Based Human Activity Recognition. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1–32. DOI: 10.1145/3678545. Online publication date: 9-Sep-2024.
    • (2024) More Data for People with Disabilities! Comparing Data Collection Efforts for Wheelchair Transportation Mode Detection. Proceedings of the 2024 ACM International Symposium on Wearable Computers, 82–88. DOI: 10.1145/3675095.3676617. Online publication date: 5-Oct-2024.
    • (2024) ModifyAug: Data Augmentation for Virtual IMU Signal based on 3D Motion Modification Used for Real Activity Recognition. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1–7. DOI: 10.1145/3613905.3650806. Online publication date: 11-May-2024.
    • (2024) Locally Controllable Attitude B-Spline Function. IEEE Access 12, 161155–161163. DOI: 10.1109/ACCESS.2024.3488172. Online publication date: 2024.
    • (2024) Sensor Data Augmentation from Skeleton Pose Sequences for Improving Human Activity Recognition. 2024 International Conference on Activity and Behavior Computing (ABC), 1–8. DOI: 10.1109/ABC61795.2024.10652200. Online publication date: 29-May-2024.
    • (2023) On the Utility of Virtual On-body Acceleration Data for Fine-grained Human Activity Recognition. Proceedings of the 2023 ACM International Symposium on Wearable Computers, 55–59. DOI: 10.1145/3594738.3611364. Online publication date: 8-Oct-2023.
    • (2023) Generating Virtual On-body Accelerometer Data from Virtual Textual Descriptions for Human Activity Recognition. Proceedings of the 2023 ACM International Symposium on Wearable Computers, 39–43. DOI: 10.1145/3594738.3611361. Online publication date: 8-Oct-2023.
    • (2023) If only we had more data!: Sensor-Based Human Activity Recognition in Challenging Scenarios. 2023 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 565–570. DOI: 10.1109/PerComWorkshops56833.2023.10150267. Online publication date: 13-Mar-2023.
