Abstract
In the field of ubiquitous computing, machines need to be aware of the current context to enable anticipatory communication with humans. This enables human-centric applications whose primary objective is to improve the Quality of Life (QoL) of their users. One important type of context information for these applications is the user's current activity, which can be derived from environmental and wearable sensors. Owing to its processing capability and the number of embedded sensors, the smartphone is the most promising among existing platforms for human activity recognition (HAR) research. While machine-learning-based solutions have been successful in past HAR studies, several of their design challenges can be resolved more easily with deep learning. In this paper, we investigated Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks for addressing common challenges in smartphone-based HAR, such as device-location dependency, subject dependency, and manual feature extraction. We showed that the CNN model accomplished location- and subject-independent recognition with overall accuracies of 98.38% and 90.61%, respectively. The LSTM model also performed location-independent recognition with an accuracy of 97.17%, but achieved a subject-independent recognition accuracy of only 80.02%. Finally, optimal network performance was achieved by tuning the design hyperparameters through Bayesian optimization using Gaussian processes.
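The abstract describes CNNs operating on windows of multi-sensor smartphone data, but does not give the architecture or preprocessing details. The following is a minimal NumPy sketch of the core pipeline such a model implies: segmenting a raw sensor stream into overlapping fixed-length windows, then extracting features with a 1-D convolution followed by ReLU and global max pooling. The window length, stride, kernel size, and filter count here are illustrative assumptions, not the paper's values.

```python
import numpy as np

def sliding_windows(signal, window_size, step):
    """Segment a (T, C) multi-sensor stream into overlapping (W, C) windows."""
    starts = range(0, len(signal) - window_size + 1, step)
    return np.stack([signal[s:s + window_size] for s in starts])

def conv1d_features(window, kernels):
    """Valid-mode 1-D convolution of a (W, C) window with (K, C, F) kernels,
    followed by ReLU and global max pooling -> an (F,) feature vector."""
    W, _ = window.shape
    K, _, F = kernels.shape
    out = np.empty((W - K + 1, F))
    for t in range(W - K + 1):
        # each output position: dot product of the (K, C) patch with every kernel
        out[t] = np.tensordot(window[t:t + K], kernels, axes=([0, 1], [0, 1]))
    return np.maximum(out, 0.0).max(axis=0)  # ReLU, then global max pool

# Toy example: a 6-channel stream (tri-axial accelerometer + gyroscope)
rng = np.random.default_rng(0)
stream = rng.standard_normal((500, 6))
windows = sliding_windows(stream, window_size=128, step=64)
kernels = rng.standard_normal((9, 6, 16)) * 0.1  # 16 filters of length 9
features = np.stack([conv1d_features(w, kernels) for w in windows])
print(windows.shape, features.shape)  # (6, 128, 6) (6, 16)
```

In a trained network the kernels are learned from labeled activity windows and the pooled feature vectors feed a softmax classifier; this sketch only shows the data flow that replaces manual feature extraction.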
Acknowledgement
The authors acknowledge the financial support of the University of the Philippines and the Department of Science and Technology through the Engineering Research and Development for Technology (ERDT) Program.
© 2019 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
San Buenaventura, C.V., Tiglao, N.M.C., Atienza, R.O. (2019). Deep Learning for Smartphone-Based Human Activity Recognition Using Multi-sensor Fusion. In: Chen, JL., Pang, AC., Deng, DJ., Lin, CC. (eds) Wireless Internet. WICON 2018. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 264. Springer, Cham. https://doi.org/10.1007/978-3-030-06158-6_7
Print ISBN: 978-3-030-06157-9
Online ISBN: 978-3-030-06158-6