DOI: 10.1145/3341162.3345591

Reduction of marker-body matching work in activity recognition using motion capture

Published: 09 September 2019

Abstract

In this paper, activity recognition is performed using an optical motion capture system that measures the three-dimensional positions of reflective markers attached to the body. Each marker detected by the system is automatically associated with the body part it is attached to. However, markers can be occluded by obstacles or other body parts, or be misplaced, so that they fall into a camera blind spot; as a result, a marker is frequently associated with the wrong body part. These errors are usually corrected manually after measurement, but this work is time-consuming, cumbersome, and requires some skill. In this research, we hypothesize that activity recognition using motion capture remains possible even when this post-measurement correction of the marker-body correspondence is omitted. Because activity recognition extracts feature quantities from the activity data, an error in part of the marker data has only a small effect: the correct feature quantities are selected, and data from the other markers can compensate for the error. In addition, we propose a method that recognizes activities even when the human-body template preparation normally required before motion-capture measurement, which is another part of the marker-body matching work, is omitted. Verification showed that activities could still be recognized with high accuracy even when the marker-body matching work was omitted.
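The abstract's robustness argument can be made concrete with a small sketch. The following illustrative Python example is not the paper's implementation; the function `window_features` and the toy data are hypothetical. It shows that when per-window statistical features are computed marker by marker and concatenated, an error confined to one marker only perturbs that marker's slice of the feature vector, leaving the features of the remaining markers intact to compensate.

```python
# Illustrative sketch (hypothetical names; not the paper's code):
# per-marker, per-axis mean and standard deviation over a window of frames.
import math

def window_features(frames):
    """frames: list of frames; each frame is a list of (x, y, z) tuples,
    one tuple per marker. Returns the per-marker, per-axis mean and
    standard deviation, concatenated into one flat feature vector."""
    n_markers = len(frames[0])
    feats = []
    for m in range(n_markers):
        for axis in range(3):
            vals = [frame[m][axis] for frame in frames]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            feats.extend([mean, math.sqrt(var)])
    return feats

# Two markers observed over four frames: marker 0 at the origin,
# marker 1 at (1, 1, 1).
clean = [[(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)] for _ in range(4)]
# Corrupt marker 0 only (e.g. a wrong marker-body association).
corrupt = [[(9.0, 9.0, 9.0), (1.0, 1.0, 1.0)] for _ in range(4)]

f_clean = window_features(clean)
f_corrupt = window_features(corrupt)
# Marker 1's features (indices 6..11) are untouched by marker 0's error.
print(f_clean[6:] == f_corrupt[6:])  # True
```

With many markers, a classifier trained on such pooled features sees only a small fraction of its inputs disturbed by any single mis-association, which is the intuition behind omitting the manual correction step.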


Cited By

  • (2020) Improvement of Human Action Recognition Using 3D Pose Estimation. Activity and Behavior Computing, pp. 21-37. DOI: 10.1007/978-981-15-8944-7_2. Online publication date: 24 December 2020.

Published In

UbiComp/ISWC '19 Adjunct: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers
September 2019
1234 pages
ISBN:9781450368698
DOI:10.1145/3341162
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. activity recognition
  2. machine learning
  3. motion capture

Qualifiers

  • Research-article

Conference

UbiComp '19

Acceptance Rates

Overall Acceptance Rate 764 of 2,912 submissions, 26%


Article Metrics

  • Downloads (last 12 months): 11
  • Downloads (last 6 weeks): 0
Reflects downloads up to 15 Feb 2025
