
Activity Recognition from Inertial Sensors with Convolutional Neural Networks

  • Conference paper
Future Data and Security Engineering (FDSE 2017)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10646)

Abstract

Human Activity Recognition is an attractive topic for building smart interactive environments in which computing systems can understand human activities in their natural context. Besides traditional approaches based on visual data, inertial sensors in wearable devices provide a promising alternative for human activity recognition. In this paper, we propose novel methods to recognize human activities from raw inertial sensor data using convolutional neural networks with either 2D or 3D filters. We also combine hand-crafted features with the features learned by the convolution-pooling blocks to further improve recognition accuracy. Experiments on the UCI Human Activity Recognition dataset with six different activities demonstrate that our method achieves 96.95% accuracy, higher than existing methods.
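
The abstract describes two ingredients: a convolutional network with 2D (or 3D) filters applied to fixed-length windows of raw inertial signals, and a fusion of the learned features with hand-crafted features before classification. Below is a minimal sketch of the 2D-filter variant only, not the authors' exact architecture: the layer sizes are illustrative assumptions, and the 9 signal channels, 128-sample windows, and 561 hand-crafted features are borrowed from the layout of the UCI HAR dataset.

```python
# Minimal sketch (assumed architecture, not the paper's exact one): a CNN with
# 2D filters over windowed raw inertial signals whose flattened output is
# concatenated with hand-crafted features before the classifier.
import torch
import torch.nn as nn

class InertialCNN(nn.Module):
    def __init__(self, n_channels=9, window=128, n_handcrafted=561, n_classes=6):
        super().__init__()
        # A window of raw signals is treated as a 1-channel 2D "image"
        # of shape (n_channels, window).
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=(3, 5), padding=(1, 2)),
            nn.ReLU(),
            nn.MaxPool2d((1, 2)),              # pool along the time axis only
            nn.Conv2d(32, 64, kernel_size=(3, 5), padding=(1, 2)),
            nn.ReLU(),
            nn.MaxPool2d((1, 2)),
        )
        conv_dim = 64 * n_channels * (window // 4)
        # Fuse learned features with hand-crafted features (e.g. the 561
        # statistical features distributed with the UCI HAR dataset).
        self.classifier = nn.Sequential(
            nn.Linear(conv_dim + n_handcrafted, 128),
            nn.ReLU(),
            nn.Dropout(0.5),                   # dropout for regularization
            nn.Linear(128, n_classes),
        )

    def forward(self, raw, handcrafted):
        # raw: (batch, 1, n_channels, window); handcrafted: (batch, n_handcrafted)
        z = self.conv(raw).flatten(1)
        return self.classifier(torch.cat([z, handcrafted], dim=1))

# Example: a batch of 4 windows, each with 9 channels of 128 samples.
model = InertialCNN()
logits = model(torch.randn(4, 1, 9, 128), torch.randn(4, 561))
print(logits.shape)  # torch.Size([4, 6])
```

The 3D-filter variant mentioned in the abstract would instead arrange the signals into a volume so that 3D convolutions can be applied; that arrangement is not shown here.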



Acknowledgement

This research is funded by Vietnam National University Ho Chi Minh City (VNU-HCM) under grant number B2015-18-01.

Author information

Corresponding author

Correspondence to Minh-Triet Tran.


Copyright information

© 2017 Springer International Publishing AG

About this paper


Cite this paper

Ha, Q.D., Tran, M.T. (2017). Activity Recognition from Inertial Sensors with Convolutional Neural Networks. In: Dang, T., Wagner, R., Küng, J., Thoai, N., Takizawa, M., Neuhold, E. (eds) Future Data and Security Engineering. FDSE 2017. Lecture Notes in Computer Science, vol 10646. Springer, Cham. https://doi.org/10.1007/978-3-319-70004-5_20

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-70004-5_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-70003-8

  • Online ISBN: 978-3-319-70004-5

  • eBook Packages: Computer Science, Computer Science (R0)
