
Enhancing Representation of Deep Features for Sensor-Based Activity Recognition

Mobile Networks and Applications

Abstract

Sensor-based activity recognition (AR) depends on effective feature representation and classification. However, many recent studies focus on recognition methods while largely ignoring feature representation. Benefiting from the success of Convolutional Neural Networks (CNNs) in feature extraction, we propose to improve the feature representation of activities. Specifically, we use a reversed CNN to generate significant data from the learned features and combine this significant data with the raw training data to obtain enhanced training data. The proposed method not only trains better feature extractors but also helps in understanding the abstract features of sensor-based activity data. To demonstrate its effectiveness, we conduct comparative experiments with a CNN classifier and a CNN-LSTM classifier on five public datasets: UCIHAR, UniMiB SHAR, OPPORTUNITY, WISDM, and PAMAP2. In addition, we compare our method with traditional methods such as Decision Tree, Multi-layer Perceptron, Extremely Randomized Trees, Random Forest, and k-Nearest Neighbour on the WISDM dataset. The results show that our method consistently outperforms the state-of-the-art methods.
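As a rough illustration of this idea (a minimal sketch, not the authors' implementation), the following PyTorch code encodes a raw sensor window into feature maps with a small 1-D CNN, maps those features back to the input space with a reversed CNN (transposed convolutions) to produce "significant data", and then combines the raw windows with the significant data to form an enhanced training set. The window length, channel count, layer sizes, and the choice of combining along the batch axis are all illustrative assumptions.

```python
# Illustrative sketch only: layer sizes, window length, and the way the raw and
# significant data are combined are assumptions, not the paper's exact design.
import torch
import torch.nn as nn

WINDOW_LEN = 128   # samples per sliding window (assumed)
CHANNELS = 3       # e.g. a tri-axial accelerometer (assumed)

class Encoder(nn.Module):
    """Forward CNN: raw sensor window -> abstract feature maps."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
        )

    def forward(self, x):              # x: (batch, CHANNELS, WINDOW_LEN)
        return self.net(x)             # (batch, 64, WINDOW_LEN // 4)

class ReversedCNN(nn.Module):
    """Reversed CNN: feature maps -> input-space 'significant data'."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose1d(64, 32, kernel_size=5, stride=2,
                               padding=2, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(32, CHANNELS, kernel_size=5, stride=2,
                               padding=2, output_padding=1),
        )

    def forward(self, feats):
        return self.net(feats)         # (batch, CHANNELS, WINDOW_LEN)

encoder, reverser = Encoder(), ReversedCNN()
raw = torch.randn(8, CHANNELS, WINDOW_LEN)        # a batch of raw windows
significant = reverser(encoder(raw))              # generated significant data
enhanced = torch.cat([raw, significant], dim=0)   # enhanced training data
print(enhanced.shape)                             # torch.Size([16, 3, 128])
```

In the pipeline described in the abstract, the enhanced training data would then be used to train the downstream CNN or CNN-LSTM classifier.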





Author information

Corresponding author

Correspondence to Xue Li.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Li, X., Nie, L., Si, X. et al. Enhancing Representation of Deep Features for Sensor-Based Activity Recognition. Mobile Netw Appl 26, 130–145 (2021). https://doi.org/10.1007/s11036-020-01689-y
