
Deep Ensemble Learning for Human Activity Recognition Using Wearable Sensors via Filter Activation

Published: 29 October 2022

Abstract

During the past decade, human activity recognition (HAR) using wearable sensors has become a new research hotspot due to its extensive use in application domains such as healthcare, fitness, smart homes, and eldercare. Deep neural networks, especially convolutional neural networks (CNNs), have gained considerable attention in HAR scenarios. Despite their exceptional performance, CNNs with heavy computational overhead are not the best option for HAR tasks because of the limited computing resources on embedded devices. As far as we know, CNNs contain many invalid filters that contribute very little to the output. Simply pruning these invalid filters could effectively accelerate CNNs, but it inevitably hurts performance. In this article, we first propose a novel CNN for HAR that uses filter activation. In contrast with filter pruning, which is motivated by efficiency considerations, filter activation aims to reactivate these invalid filters from an accuracy-boosting perspective. We perform extensive experiments on several public HAR datasets, namely UCI-HAR (UCI), OPPORTUNITY (OPPO), UniMiB-SHAR (Uni), PAMAP2 (PAM2), WISDM (WIS), and USC-HAD (USC), which show the superiority of the proposed method over existing state-of-the-art (SOTA) approaches. Ablation studies are conducted to analyze its internal mechanism. Finally, inference speed and power consumption are evaluated on an embedded Raspberry Pi 3 Model B+ platform.
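The "invalid filter" notion from the abstract can be illustrated with a minimal sketch. This is not the paper's actual criterion; it assumes, for illustration only, the common pruning heuristic of scoring each convolutional filter by the L1 norm of its weights and flagging the weakest ones as candidates for pruning (or, in the filter-activation view, for reactivation):

```python
import numpy as np

def find_invalid_filters(weights: np.ndarray, keep_ratio: float = 0.8) -> np.ndarray:
    """Return indices of the lowest-importance filters.

    `weights` has shape (out_channels, in_channels, kh, kw).
    Importance here is the per-filter L1 norm, a common pruning
    criterion (an illustrative assumption, not this paper's method).
    """
    l1 = np.abs(weights).sum(axis=(1, 2, 3))       # one score per filter
    n_keep = int(round(keep_ratio * len(l1)))
    order = np.argsort(l1)                         # ascending: weakest first
    return np.sort(order[: len(l1) - n_keep])      # indices of "invalid" filters

rng = np.random.default_rng(0)
w = rng.normal(size=(10, 3, 3, 3))                 # a toy bank of 10 filters
w[2] *= 1e-3                                       # make filters 2 and 7
w[7] *= 1e-3                                       # nearly inactive
invalid = find_invalid_filters(w, keep_ratio=0.8)
print(invalid)                                     # the two near-zero filters
```

A pruning method would delete these filters to save computation; the filter-activation idea described in the abstract instead keeps them and tries to make them contribute again, trading no extra inference cost for an accuracy gain.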




    Published In

    ACM Transactions on Embedded Computing Systems  Volume 22, Issue 1
    January 2023
    512 pages
    ISSN:1539-9087
    EISSN:1558-3465
    DOI:10.1145/3567467
    • Editor:
    • Tulika Mitra

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Publication History

    Published: 29 October 2022
    Online AM: 26 July 2022
    Accepted: 22 July 2022
    Revised: 23 June 2022
    Received: 10 January 2022
    Published in TECS Volume 22, Issue 1


    Author Tags

    1. Sensor
    2. convolutional neural network
    3. human activity recognition
    4. deep learning
    5. filter activation

    Qualifiers

    • Research-article

    Funding Sources

    • Natural Science Foundation of Jiangsu Province
    • National Nature Science Foundation of China
    • Industry Academia Cooperation Innovation Fund Project of Jiangsu Province

    Article Metrics

    • Downloads (Last 12 months)241
    • Downloads (Last 6 weeks)27
    Reflects downloads up to 20 Jan 2025


    Cited By

    • (2024) Modeling of Hyperparameter Tuned Fuzzy Deep Neural Network–Based Human Activity Recognition for Disabled People. Journal of Mathematics 2024, 1. DOI: 10.1155/2024/5551009. Online publication date: 6-Nov-2024
    • (2024) Optimization-Free Test-Time Adaptation for Cross-Person Activity Recognition. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 4, 1–27. DOI: 10.1145/3631450. Online publication date: 12-Jan-2024
    • (2024) Human activity recognition: A comprehensive review. Expert Systems 41, 11. DOI: 10.1111/exsy.13680. Online publication date: 27-Jul-2024
    • (2024) DiTMoS: Delving into Diverse Tiny-Model Selection on Microcontrollers. 2024 IEEE International Conference on Pervasive Computing and Communications (PerCom), 69–79. DOI: 10.1109/PerCom59722.2024.10494422. Online publication date: 11-Mar-2024
    • (2024) Efficient Channel-Temporal Attention for Boosting RF Fingerprinting. IEEE Open Journal of Signal Processing 5, 478–492. DOI: 10.1109/OJSP.2024.3362695. Online publication date: 2024
    • (2024) Real-Time Fall Recognition Using a Lightweight Convolution Neural Network Based on Millimeter-Wave Radar. IEEE Sensors Journal 24, 5, 7185–7195. DOI: 10.1109/JSEN.2024.3352425. Online publication date: 1-Mar-2024
    • (2024) HARWE. Pattern Recognition Letters 184, C, 126–132. DOI: 10.1016/j.patrec.2024.06.017. Online publication date: 18-Oct-2024
    • (2024) A wearing orientation-independent electromagnetic self-powered sensor for human activity recognition based on biomechanical energy scavenging. Measurement 224, 113926. DOI: 10.1016/j.measurement.2023.113926. Online publication date: Jan-2024
    • (2024) Improving hedonic housing price models by integrating optimal accessibility indices into regression and random forest analyses. Expert Systems with Applications 235, 121059. DOI: 10.1016/j.eswa.2023.121059. Online publication date: Jan-2024
    • (2024) Cross comparison representation learning for semi-supervised segmentation of cellular nuclei in immunofluorescence staining. Computers in Biology and Medicine 171, 108102. DOI: 10.1016/j.compbiomed.2024.108102. Online publication date: Mar-2024
