Abstract
In this paper, we present a personalized, real-time prototyping solution for activity recognition on smart glasses. Our work analyzes data from the sensors bundled with wearable glasses to study the wearer's motions and activities. The software system collects sensor data, trains on it, and builds a model for fast classification, with an emphasis on how specific features of head-mounted behavior are annotated and extracted. With our feature selection algorithm, the system achieves high accuracy at low computational cost in our experiments. In contrast to previous work on sensor data mining for smartphones and smart glasses, and to related work on smartphone-based activity recognition, our results reach an accuracy of 87% with a response time of under 3 s. The proposed system can provide more insightful and powerful services to glass wearers, and we expect it to enable more user-centric and context-aware wearable applications in the future.
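The abstract describes the pipeline only at a high level: collect motion data from the glasses' sensors, extract and select features, and classify activities quickly. As a rough illustration of how such a pipeline fits together, the sketch below uses Python with NumPy and scikit-learn to compute sliding-window statistics over 3-axis motion data, keep the most discriminative features, and train an SVM. The window length, feature set, ANOVA-based selector, and SVM classifier are assumptions made for illustration only, not the method reported in the paper.

# A minimal, hypothetical sketch of the kind of pipeline the abstract describes:
# sliding-window feature extraction from head-mounted motion sensors, simple
# feature selection, and a fast classifier. Window size, feature set, selector,
# and the SVM classifier are assumptions, not the authors' exact method.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def window_features(samples, window=128, step=64):
    """Compute per-axis statistics over sliding windows of raw sensor data.

    samples: (n_samples, n_axes) array, e.g. 3-axis accelerometer readings.
    Returns an (n_windows, n_axes * 4) feature matrix.
    """
    feats = []
    for start in range(0, len(samples) - window + 1, step):
        w = samples[start:start + window]
        feats.append(np.hstack([
            w.mean(axis=0),            # average value per axis
            w.std(axis=0),             # variability per axis
            np.abs(w).max(axis=0),     # peak magnitude per axis
            (w ** 2).mean(axis=0),     # signal energy per axis
        ]))
    return np.asarray(feats)


# X would come from labelled recordings of the wearer's activities, with one
# label per window (e.g. walking, reading, climbing stairs). Placeholder data here.
rng = np.random.default_rng(0)
X = window_features(rng.normal(size=(10_000, 3)))
y = rng.integers(0, 4, size=len(X))

# Keep only the most discriminative features (here scored by ANOVA F-value),
# then train an SVM. A smaller feature set keeps per-window classification cheap.
model = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8), SVC(kernel="rbf"))
model.fit(X, y)
print(model.predict(X[:5]))            # classify the first few windows

On the device, the same feature vector would be computed for each incoming window and fed to the trained model, which is what would keep per-window classification responsive.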
Copyright information
© 2016 Springer International Publishing Switzerland
About this paper
Cite this paper
Ho, J., Wang, C.-M. (2016). User-Centric and Real-Time Activity Recognition Using Smart Glasses. In: Huang, X., Xiang, Y., Li, K.-C. (eds.) Green, Pervasive, and Cloud Computing. Lecture Notes in Computer Science, vol. 9663. Springer, Cham. https://doi.org/10.1007/978-3-319-39077-2_13
DOI: https://doi.org/10.1007/978-3-319-39077-2_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-39076-5
Online ISBN: 978-3-319-39077-2