Abstract
People perform various activities, such as walking and running, with different motions. For example, walking can be performed with or without swinging the shoulders, and with staggering or arm swinging. We assume that such differences arise from the physical and mental characteristics of individuals. To analyze the relations between motions and these characteristics or conditions, it is useful to group people according to such differences. In a previous work, we proposed a method that successfully grouped people by analyzing accelerometer data recorded on their bodies during a specific activity with fixed timing and duration. In this study, we tackle the problem of grouping people in generic, variable-length activities, such as walking and running. We propose a method that detects identical motions in the accelerometer data using sliding windows and merges consecutive identical motions into a single motion, which makes it robust to differences in the timing and duration of the motion. In our experiments, the proposed method classified people into the same groups as the previous method on the same data, but without assuming the fixed timing and fixed duration that the previous method requires. The proposed method is also robust against temporally noised data generated from the original data.
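The pipeline sketched in the abstract, symbolizing accelerometer values as a character string, detecting matching fixed-length windows between two recordings, and merging consecutive matches into longer motions, can be illustrated as follows. This is a minimal sketch, not the authors' implementation: the bin thresholds, alphabet, window width, and helper names (`symbolize`, `matching_windows`, `merge_runs`) are assumptions for illustration.

```python
# Illustrative sketch of sliding-window motion matching over a string
# representation of accelerometer values. All thresholds and names are
# assumptions, not the published method.

def symbolize(values, bins=(-0.5, 0.5)):
    """Map each accelerometer value to a character by amplitude bin."""
    alphabet = "abc"  # one symbol per bin interval (bin count is an assumption)
    return "".join(alphabet[sum(v > b for b in bins)] for v in values)

def matching_windows(s1, s2, width=3):
    """Start indices in s1 whose width-length window also occurs in s2."""
    return [i for i in range(len(s1) - width + 1) if s1[i:i + width] in s2]

def merge_runs(starts, width=3):
    """Merge overlapping/consecutive window matches into (start, end) motions."""
    motions = []
    for i in starts:
        if motions and i <= motions[-1][1]:
            motions[-1] = (motions[-1][0], i + width)  # extend current motion
        else:
            motions.append((i, i + width))             # begin a new motion
    return motions

s1 = symbolize([0.1, 0.9, 1.2, 0.0, -0.8, 0.2])  # -> "bccbab"
hits = matching_windows(s1, "abccba", width=3)    # -> [0, 1, 2]
motions = merge_runs(hits, width=3)               # -> [(0, 5)]
```

Because matching operates on windows rather than whole sequences, a shared motion is found regardless of where it starts, and the merge step recovers its full duration, which is the robustness to timing and duration the abstract describes.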



References
Shima K, Moriyama K, Mutoh A, Inuzuka N (2017) Grouping people by differences in motions based on string representation of accelerometer values (in Japanese). IPSJ Trans Math Model Appl (TOM) 10(2):51–58
Kulić D, Kragic D, Krüger V (2011) Learning action primitives. In: Moeslund TB, Hilton A, Krüger V, Sigal L (eds) Visual analysis of humans: looking at people. Springer, London, pp 333–353
Husz ZL, Wallace AM, Green PR (2007) Human activity recognition with action primitives. In: Proceedings of the 2007 IEEE conference on advanced video and signal based surveillance, pp 330–335
Pham NH, Yoshimi T (2016) A proposal of extracting of motion primitives by analyzing tracked data of hand motion from human demonstration. In: Proceedings of ISR 2016: 47th international symposium on robotics, pp 1–6
Ueura S, Iwai Y, Yachida M (2008) Extracting action primitives by semi-supervised clustering (in Japanese). IPSJ SIG Notes. CVIM 163:29–36
Kikuchi N, Matsuda K (2018) A proposal of analysis method of difference in quality in morioka sansa odori using histogram (in Japanese). JSAI SIG Notes. SKL 25(06):29–35
Torigoe Y, Takata M, Nakamura Y, Fujimoto M, Arakawa Y, Yasumoto K (2019) Strikes-thrusts activity recognition using IMUs towards the realization of kendo skill support system (in Japanese). IPSJ SIG Notes. UBI 61(37):1–7
Iwata A, Kawashima H, Okoshi T, Nakazawa J (2020) Strikes-thrusts activity recognition using IMUs towards the realization of kendo skill support system (in Japanese). IPSJ SIG Notes. UBI 67(27):1–8
Itoh Y, Kojima K, Chiba K, Hayashi K (2020) Key points for predicting kick direction in penalty kick and improving the prediction accuracy of goal keepers (in Japanese). IPSJ Trans Digital Practice (TDP) 1(1):1–7
Zhang P, Xie W, Ou J, Zhang J, Liu K, Wang G (2019) Research on human micro-motion feature extraction technology. In: Proceedings of the 2019 IEEE advanced information management, communicates, electronic and automation control conference
This work was presented in part at the 25th International Symposium on Artificial Life and Robotics (Beppu, Oita, January 22-24, 2020).
Shima, K., Mutoh, A., Moriyama, K. et al. Human motion analysis using expressions of non-separated accelerometer values as character strings. Artif Life Robotics 26, 202–209 (2021). https://doi.org/10.1007/s10015-020-00668-6