Pattern Recognition Letters

Volume 118, 1 February 2019, Pages 51-60

A new paradigm for autonomous human motion description and evaluation: Application to the Get Up & Go test use case

https://doi.org/10.1016/j.patrec.2018.02.003

Highlights

  • A Human Motion Analysis system able to be mounted on a social robot is proposed.

  • Motion is segmented into actions and evaluated using a parametric approach.

  • A completely autonomous evaluation of the Get Up & Go test is achieved.

  • Obtained results have been validated through a quantitative evaluation.

  • Results show that the system may be used for screening and monitoring purposes.

Abstract

Human Motion Analysis is receiving growing attention in the field of assistive technologies. Portable systems, which can be carried home or mounted on socially assistive robots, can help in monitoring and evaluating the autonomy level of elderly people in the upcoming silver society. This paper presents a new paradigm to describe and evaluate human motion that can be used in these scenarios. The proposal is based on parametric segmentation and evaluation of action primitives. These actions can be combined in different sequences or even evaluated in parallel, providing a modular solution that can easily adapt to the analysis of new behaviours or motion tests. The particular use case of the Get Up & Go test has been used to study the validity of the proposal. Autonomous evaluation of the gaits of different performers has been achieved using data captured by a Kinect 2.0 device mounted on a social robot. Experiments have also involved gait data captured with a precise marker-based Vicon Nexus system, to compare with previous results and characterize the capture errors of the Kinect device. Results show that the proposed system is adequate for use in these scenarios.

Introduction

In the last twenty years, the applications and research interest in the Human Motion Analysis (HMA) field have grown significantly. It has become a focus topic for researchers in virtual modelling, rehabilitation processes, ergonomics, gait analysis, robotics or surveillance applications, amongst others [3], [14], [21]. These applications are usually classified depending on the required level of precision, this level being inversely proportional to the degree of invasiveness and imposed environmental constraints.

However, new and increasingly demanding application fields are arising for HMA. Among them, one is particularly interesting due to the current evolution of the world population. According to United Nations estimations, by 2050 one out of every five people in the world will be over 60 years old [6]. For this silver society, it is necessary to design and implement models that help elderly people age healthily and maintain their autonomy and well-being. These models imply multidisciplinary approaches, in which social, medical and engineering dimensions have to be considered. Thus, the Active Ageing approach requires the development of autonomous systems to monitor the status and activities of a person without interfering with them. Motion evaluation becomes an important feature for these systems: falls (or risk of falling), manipulation issues and motion impairments are among the key causes of autonomy loss among the elderly population. Consequently, motion tests are a key part of Comprehensive Geriatric Assessment (CGA) procedures, designed to capture data on the medical, psychosocial and functional capabilities and limitations of elderly people [13].

Autonomous tools that evaluate human motion, both in daily life activities and while performing clinical motion tests, will have to be precise, yet avoid imposing any constraints on the patient: no special environments, markers or garments should be used. Ideally, the point of view from which motion sequences are captured should not be imposed either, so that these evaluations can be performed from any monitoring camera in the person’s environment (e.g. her house). In practice, however, the problem has to be made tractable. Portable devices appear as an interesting solution to facilitate home rehabilitation, as the person can place them in a proper spot to capture her motion [11]. Socially assistive robots [7], among other important characteristics, are also able to carry sensors, identify where the person is, and capture her motion from a certain perspective. They can also be equipped with the processors required to execute HMA algorithms, which are often computationally expensive [14].

This paper proposes a new paradigm for autonomous human motion description and evaluation, which has been designed as a general system, able to adapt to different data inputs and scenarios. The proposed HMA system detects a set of different actions in a certain motion, and combines the evaluation of these actions to provide an integrated score for the complete motion. Expert knowledge is used to perform the motion segmentation and evaluation processes. Actions are stored in a library, allowing the medical specialist to (i) use them as components to create new motion exercises; or (ii) autonomously search for particular actions in a perceived motion. It has been developed within the framework of the CLARC EU project.1

The rest of the paper is organized as follows: Section 2 describes recent research works in the field of HMA related to the proposed paradigm. The paradigm itself is described in Section 3. In order to provide a practical evaluation framework for the proposal, it has been used to evaluate human gait in the Get Up & Go test, commonly employed in CGA processes. Section 4 introduces the test and details the splitting and evaluation criteria for the actions that compose it. Section 5 describes the experiments performed to test the validity of the approach, using two different Motion Capture (MoCap) systems: a Vicon Nexus 1.8.5 MoCap system, and a Kinect 2.0 device mounted on CLARC, a socially assistive robot for CGA processes developed in the CLARC project. Section 6 discusses the obtained results and concludes the paper.

Section snippets

State of the art

The HMA system proposed in this paper is intended to be integrated in portable devices or socially assistive robots that capture human motion in daily life environments. Traditionally, MoCap systems that meet these requirements are vision-based systems that employ data provided by a single camera or a pair of stereo cameras. The survey of Moeslund et al. [14] describes these systems, which can be basically divided into model-free approaches, that directly map visual perception to pose space, and …

Proposed Human Motion Analysis system

The proposed HMA approach divides a complete motion, G, into a set of discrete actions, ai, to be evaluated. Known actions are stored in an action library, called ActionsLib. Therefore, a complete motion, G, can be defined as an ordered combination of actions stored in this library, C({ai}). Fig. 1 shows an example of a motion divided into actions. As depicted, different actions can be executed sequentially, but they can also overlap (e.g. the action ‘wave hand to say hello’ may be simultaneous …
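The decomposition into library actions can be sketched as follows. This is a minimal illustration with hypothetical names (Action, Motion, the scoring rules) and an arbitrary averaging rule standing in for the combination C({ai}); it is not the authors' implementation:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Action:
    """A primitive action a_i with its own evaluation function."""
    name: str
    evaluate: Callable[[dict], float]  # maps captured features to a score in [0, 1]

@dataclass
class Motion:
    """A complete motion G defined as a combination C({a_i}) of library actions."""
    actions: List[Action]

    def evaluate(self, features: Dict[str, dict]) -> float:
        # Illustrative combination rule: average of per-action scores.
        scores = [a.evaluate(features[a.name]) for a in self.actions]
        return sum(scores) / len(scores)

# Populate a small library and compose a Get Up & Go-like motion from it.
# All thresholds below are invented for the example.
ActionsLib = {
    "stand_up": Action("stand_up", lambda f: 1.0 if f["duration_s"] < 2.0 else 0.5),
    "walk":     Action("walk",     lambda f: min(f["speed_mps"] / 1.2, 1.0)),
    "turn":     Action("turn",     lambda f: 1.0 if f["steps"] <= 3 else 0.5),
}

get_up_and_go = Motion([ActionsLib["stand_up"], ActionsLib["walk"], ActionsLib["turn"]])
score = get_up_and_go.evaluate({
    "stand_up": {"duration_s": 1.5},
    "walk": {"speed_mps": 1.0},
    "turn": {"steps": 3},
})
```

The library-based design is what makes the approach modular: a new motion test is defined by composing existing actions rather than by writing a new monolithic evaluator.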

The Get Up And Go test: use case

Gait/balance disorders are the second leading cause of falls in elderly adults [18] and are a major public health issue. The Get Up And Go test [12] is designed to detect these disorders. In this test, the patient is asked to stand up from a chair, walk in a straight line for around three meters, turn back, return to the chair and sit down. The goal is to measure balance, detecting deviations from a confident, normal performance. Different factors influence this measure, including symmetry, bending or …

Experiments

The proposed HMA system has been analyzed through three sets of experiments, in which it processes the gait of different people performing the Get Up & Go test. The dataset for all the experiments presented in the paper is available at the web page of the CLARC EU Project.3

In all experiments, human motion has been captured using a Kinect 2.0 device mounted on a socially assistive robot. The robot and the person were located …
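Comparing Kinect captures against a marker-based Vicon reference requires expressing both in a common coordinate frame. Below is a minimal sketch of such a rigid alignment on synthetic joint data, using the SVD-based Kabsch solution; the paper cites Horn's equivalent closed-form quaternion formulation (see the references), and none of this code is from the paper itself:

```python
import numpy as np

def rigid_align(source: np.ndarray, target: np.ndarray):
    """Find rotation R and translation t minimizing ||R @ p_i + t - q_i||."""
    cs, ct = source.mean(axis=0), target.mean(axis=0)
    H = (source - cs).T @ (target - ct)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cs
    return R, t

# Synthetic check: rotate and translate a toy "skeleton" of 25 joints
# (the Kinect 2.0 skeleton has 25 joints) and recover the transform.
rng = np.random.default_rng(0)
joints_vicon = rng.normal(size=(25, 3))          # reference positions, metres
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
joints_kinect = joints_vicon @ R_true.T + np.array([0.1, -0.2, 0.05])

R_est, t_est = rigid_align(joints_vicon, joints_kinect)
residual = joints_vicon @ R_est.T + t_est - joints_kinect
rmse = np.sqrt(np.mean(np.sum(residual ** 2, axis=1)))
```

With noiseless synthetic data the transform is recovered exactly; with real captures the residual RMSE after alignment is one way to characterize the capture error of the Kinect device against the Vicon reference.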

Conclusions

Results show that the proposed HMA system is able to correctly evaluate human motion. The system requires the complete gait to be perceived before evaluating it, but once the gait is captured the analytic nature of the algorithm allows producing fast responses. The algorithm is autonomous and does not impose any constraints on either the performer or the environment. Experiments have involved successful autonomous evaluation of human gait in the Get Up & Go test. These results have been validated …

Acknowledgements

This work has been partially funded by the European Union ECHORD++ project (FP7-ICT-601116) and the TIN2015-65686-C5-4-R Spanish Ministerio de Economía y Competitividad project and FEDER funds. The authors warmly thank the members of the “Amis du Living Lab” community, and the patients and medical staff at Hospital Civil de Málaga, for their participation in this research, as well as the interns Marion Olivier and Daniel Saadeddine.

References (22)

  • L. Chen et al.

    A survey of human motion analysis using depth imagery

    Pattern Recognit. Lett.

    (2013)
  • T.B. Moeslund et al.

    A survey of advances in vision-based human motion capture and analysis

    Comput. Vision Image Understanding

    (2006)
  • J.P. Bandera

    Vision-Based Gesture Recognition in a Robot Learning by Imitation Framework

    (2010)
  • R.W. Bohannon

    Comfortable and maximum walking speed of adults aged 20–79 years: reference values and determinants

    Age Ageing

    (1997)
  • E. Cippitelli et al.

    Kinect as a tool for gait analysis: validation of a real-time joint extraction algorithm working in side view

    Sensors

    (2015)
  • P.D.L. Pinto

    Calibration of Kinect for Xbox One and comparison between the two generations of Microsoft sensors, ...
  • DG-ECFIN et al.

    The 2015 Ageing Report: Underlying Assumptions and Projection Methodologies

    Technical report

    (2014)
  • D. Feil-Seifer et al.

    Defining socially assistive robotics

    Proceedings of the 2005 IEEE 9th International Conference on Rehabilitation Robotics

    (2005)
  • E. Ghorbel et al.

    3D real-time human action recognition using a spline interpolation approach

    Proceedings of the 2015 International Conference on Image Processing Theory, Tools and Applications

    (2015)
  • B.K.P. Horn

    Closed-form solution of absolute orientation using unit quaternions

    J. Opt. Soc. Am. A.

    (1987)
  • A.H. Kargar et al.

    Automatic measurement of physical mobility in Get-Up-and-Go test using Kinect sensor

    Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society

    (2014)