
A Survey of Deep Learning Based Models for Human Activity Recognition


Abstract

Human Activity Recognition (HAR) is the process of automatically recognizing human activities from streaming data obtained from various sensors, such as inertial, physiological, location, camera, and many other environmental sensors. HAR has proven beneficial in many fields of study, especially healthcare, aged care, ambient assisted living, personal care, social science, and rehabilitation engineering. Owing to recent advances in computing power, deep learning-based algorithms have become the most effective and efficient choice for solving HAR problems. In this survey, we categorize recent research work with respect to various factors and measures to investigate current trends in HAR using deep learning algorithms. The articles are analyzed from several perspectives, including the HAR task itself, time series analysis, machine learning models, methods of dataset creation, and the use of newer techniques such as transfer learning and active learning.
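
The following minimal sketch is not part of the original article; it only illustrates the kind of deep learning pipeline the survey covers: fixed-length windows of raw tri-axial accelerometer samples are fed to a small 1D convolutional network that outputs activity class scores. All names and parameters (SimpleHARNet, a 128-sample window, six activity classes) are illustrative assumptions rather than any specific method surveyed here.

import torch
import torch.nn as nn

class SimpleHARNet(nn.Module):
    # Illustrative 1D-CNN classifier for fixed-length windows of tri-axial accelerometer data.
    def __init__(self, n_channels=3, n_classes=6, window_len=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),  # (N, 32, 128)
            nn.ReLU(),
            nn.MaxPool1d(2),                                      # (N, 32, 64)
            nn.Conv1d(32, 64, kernel_size=5, padding=2),          # (N, 64, 64)
            nn.ReLU(),
            nn.MaxPool1d(2),                                      # (N, 64, 32)
        )
        self.classifier = nn.Linear(64 * (window_len // 4), n_classes)

    def forward(self, x):
        # x: (batch, channels, time), e.g. a window of accelerometer readings (N, 3, 128)
        h = self.features(x)
        return self.classifier(h.flatten(1))  # unnormalized class scores, shape (N, n_classes)

# A batch of 8 windows, 3 accelerometer axes, 128 samples each (about 2.5 s at a 50 Hz sampling rate)
x = torch.randn(8, 3, 128)
logits = SimpleHARNet()(x)  # shape: (8, 6)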



Author information

Correspondence to Nida Saddaf Khan.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Khan, N.S., Ghani, M.S. A Survey of Deep Learning Based Models for Human Activity Recognition. Wireless Pers Commun 120, 1593–1635 (2021). https://doi.org/10.1007/s11277-021-08525-w

