1 Introduction

The number of people with impairments or paralysis affecting daily living motion is increasing year by year. One reason is the rising survival rate of stroke patients: in Japan, the survival rate has doubled over the past five decades [1, 2]. As a result, more patients live with paralysis as an after-effect of stroke, and this paralysis takes many forms. Daily motion consists of recognition, judgment, and reaction. Usually, only the reaction, which relates to motor function, is targeted by rehabilitation; the recognition function is often ignored in grasping motion. The purpose of this study is to reveal the relationship between recognition and reaction. In particular, this study focuses on the response times of recognition and reaction.

The target motion is grasping objects. Grasping is an important motion in daily life; therefore, many researchers have reported various kinds of rehabilitation for grasping [3,4,5]. However, studies on recognition ability are rare [6, 7]. In a grasping motion, we first recognize the target and then grasp it. If the reaction ability is low, the success rate of grasping will naturally be low. However, we assume that recognition ability also affects the success rate of grasping. Therefore, we propose an evaluation system for recognition and motor functions in grasping motion.

2 Method

Human motion consists of recognition and motor functions: humans first recognize the environment and then move their body according to it. However, human motor ability decreases with aging, a condition known as sarcopenia. Generally, sarcopenia is associated with motor function, and its relation to the recognition function is unclear. This study aims to reveal this relationship using a virtual reality system, focusing on the recognition and motion times in grasping.

Unity (Unity 5.5.1f1 (64 bit), Unity Technologies Inc.) is used to construct a virtual 3D environment, as shown in Fig. 1. An eye tracker (SteelSeries Sentry Gaming Eye Tracker 69041) is used to measure eye motion, as shown in Fig. 2. A Leap Motion (Leap Motion Controller, Leap Motion Inc.) is used to measure finger motion, as shown in Fig. 3.

Fig. 1. Unity

Fig. 2. Eye tracker

Fig. 3. Leap Motion

In the experiment, the subject first calibrates the eye tracker. After that, the gaze position obtained from the eye tracker and the hand obtained from the Leap Motion are displayed in Unity, and both are checked to be correctly reflected there. Measurement starts from the state in which the subject gazes at the center point of the Unity display. The object (Fig. 4) appears three seconds after the start of measurement. The subject moves the line of sight and the hand so that the virtual hand in Unity grasps the object; after grasping, the subject lowers the hand and gazes at the center point again. The x-y coordinates (Fig. 5) of the eye motion and the bending angle of the finger are measured from the start to the end of the experiment.
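As a rough illustration of this procedure, the following Python sketch shows a minimal logging loop for one trial. The device callbacks (get_gaze_xy, get_bend_angle) and the sampling rate are hypothetical placeholders, not the actual Unity implementation.

```python
import time

OBJECT_DELAY_S = 3.0  # the object appears 3 s after measurement starts

def run_trial(get_gaze_xy, get_bend_angle, trial_length_s=10.0, rate_hz=60.0):
    """Log gaze coordinates and finger bend angle for one trial.

    get_gaze_xy() -> (x, y) in mm wraps the eye tracker, and
    get_bend_angle() -> degrees wraps the Leap Motion (both assumed).
    """
    log = []
    for i in range(int(trial_length_s * rate_hz)):
        t = i / rate_hz
        object_visible = t >= OBJECT_DELAY_S  # object is shown after 3 s
        log.append((t, object_visible, get_gaze_xy(), get_bend_angle()))
        time.sleep(1.0 / rate_hz)
    return log
```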

Fig. 4. Target object

Fig. 5. Coordinate system

Since it is necessary for the subject to recognize the object and perform the grasping action naturally, the object is hidden before measurement starts. The appearance position is changed every trial, and the object is presented both at its original angle and rotated by 90° around the z axis, as shown in Fig. 6. The target object is shaped like an iron dumbbell, which looks heavy and easy to hold, because the process by which the subject estimates the center of gravity and grasps it is important. The x-y coordinates of the measured line of sight take the center point as (0, 0), with the upper right positive and the lower left negative. The coordinates of the center of gravity of the object are A (−100, 100), B (100, 100), C (−100, −100), and D (100, −100) [mm].
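For reference, the eight appearance conditions (four positions × two orientations, presented in random order as described in Sect. 3.1) can be written down directly. This is an illustrative Python snippet, not the experimental code itself.

```python
import itertools
import random

# Center-of-gravity positions in the display coordinate system
# (center = (0, 0), upper right positive), in millimeters.
POSITIONS_MM = {
    "A": (-100, 100),   # upper left
    "B": (100, 100),    # upper right
    "C": (-100, -100),  # lower left
    "D": (100, -100),   # lower right
}
ORIENTATIONS_DEG = [0, 90]  # rotation around the z axis (Fig. 6)

# 4 positions x 2 orientations = 8 trials per subject, shuffled
conditions = list(itertools.product(POSITIONS_MM, ORIENTATIONS_DEG))
random.shuffle(conditions)
print(conditions)
```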

Fig. 6. Iron dumbbell direction

The virtual hand is displayed on the monitor in the virtual environment developed with Unity, as shown in Fig. 7. The virtual hand follows the user's real hand, whose motion is captured with the Leap Motion. The Leap Motion provides the 3D position of each joint of the hand, and the bending angle during the grasping motion is computed from this joint information. The bending angle of the finger is obtained as the angle between the proximal phalanx of the middle finger and the metacarpal bone, as shown in Fig. 8. In this way, time-series data during grasping of the objects are obtained.
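The bending angle in Fig. 8 is the angle between two bone segments. The following minimal Python sketch computes it from three 3D joint positions; the joint coordinates in the example are made-up values, whereas in the real system they come from the Leap Motion.

```python
import numpy as np

def bend_angle_deg(metacarpal_base, knuckle, phalanx_tip):
    """Angle between the metacarpal bone and the proximal phalanx
    of the middle finger, given three 3D joint positions."""
    v1 = np.asarray(knuckle) - np.asarray(metacarpal_base)  # metacarpal direction
    v2 = np.asarray(phalanx_tip) - np.asarray(knuckle)      # phalanx direction
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example with made-up joint positions [mm]: a partly bent finger
print(bend_angle_deg([0, 0, 0], [60, 0, 0], [95, -35, 0]))  # ~45 deg
```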

Fig. 7. Virtual hand

Fig. 8. Bending angle

3 Experiments

3.1 Subjects

Experiments were conducted with five healthy volunteers (23 ± 1 years old, right-handed). Each subject performed eight trials in total: four positions of the 3D object (upper left, lower left, upper right, and lower right on the display screen) combined with two orientations (horizontal and vertical). The order of measurement was randomized to minimize learning effects.

3.2 Measurement Environment

The measurement environment is shown in Fig. 9, and each parameter is tabulated in Table 1. The Leap Motion is placed far from the eye tracker so that the user's hand does not interfere with the tracking of the eye motion.

Fig. 9. Measurement environment

Table 1. Measurement parameters

3.3 Measurement Items

This study performs two experiments. The first checks the characteristics of eye motion without a target; in this experiment, the subjects are asked to look at the center of the display. The second tracks the eye and grasping motions.

4 Results

4.1 Eye Motion Without Target

The result is shown in Table 2. The values show the distance of the gaze from the center of the display. No directional dependency on the x and y axes was observed. The maximum and minimum distances are 34.359 mm and 12.191 mm, respectively.

Table 2. The result of eye motion without target
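The values in Table 2 are Euclidean distances of the gaze point from the display center; as a small illustrative helper (an assumption about the metric, consistent with the coordinate system in Fig. 5):

```python
import math

def gaze_offset_mm(x_mm, y_mm):
    """Distance of a gaze sample (x, y) from the display center (0, 0)."""
    return math.hypot(x_mm, y_mm)

print(gaze_offset_mm(24.3, 24.3))  # about 34.4 mm, near Table 2's maximum
```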

4.2 Eye Motion and Finger Motion

The average time (t1) from the appearance of the object to fixation and the average time (t2) from fixation to grasp were calculated for each subject, as shown in Fig. 10. The average errors between the x and y coordinates of the first fixation and those of the center of gravity of the object were also calculated, together with t1, for each coordinate. The response times of all the data are tabulated in Table 3, and Table 4 shows the difference between the target directions. Figures 11 and 12 show the fixation distributions for the 0° and 90° orientations, respectively.
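A minimal sketch of how t1 and t2 could be extracted from the logged time series, assuming a simple threshold rule for fixation and grasp detection (the thresholds and the detection rule are illustrative assumptions, not the paper's exact method):

```python
def response_times(log, target_xy, fix_radius_mm=20.0, grasp_angle_deg=40.0,
                   t_appear=3.0):
    """Return (t1, t2) from samples of (time, gaze (x, y) [mm], bend [deg]):
    t1 = object appearance -> first fixation near the target,
    t2 = first fixation -> grasp (bend angle exceeds a threshold)."""
    t_fix = None
    for t, (x, y), bend in log:
        near = (x - target_xy[0]) ** 2 + (y - target_xy[1]) ** 2 <= fix_radius_mm ** 2
        if t_fix is None and t >= t_appear and near:
            t_fix = t  # first fixation on the object
        elif t_fix is not None and bend >= grasp_angle_deg:
            return t_fix - t_appear, t - t_fix  # grasp detected
    return None, None

# Tiny synthetic log: fixation at t = 3.4 s, grasp at t = 4.1 s
log = [(3.0, (0, 0), 5), (3.4, (95, 98), 5), (4.1, (98, 102), 45)]
print(response_times(log, target_xy=(100, 100)))  # -> about (0.4, 0.7)
```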

Fig. 10. An example of the motion data

Table 3. The results of response time
Table 4. The difference between target direction
Fig. 11. The distribution of fixation in 0°

Fig. 12. The distribution of fixation in 90°

Regarding the errors between the coordinates of fixation and the coordinates of the center of gravity of the object, we can see that the first fixation concentrates on the grip position of the object regardless of its orientation. To examine this, the object to be grasped was changed to a spherical object, which has no concept of orientation, and the same procedure was carried out with the subjects; in this case, the fixations concentrated on the center of the object. From these results, it can be inferred that when a person tries to grasp an object, he or she first looks at the center of the target, then moves the line of sight to obtain information such as the size and the center of gravity of the object, and finally moves to the grasping action. The time during which this information is obtained corresponds to t2. Moreover, since t1 for the vertical orientation is longer than t1 for the horizontal orientation, the result is consistent with the view that horizontally extended objects enter the human field of view more easily than vertically extended ones.

5 Conclusion

In the eye gaze measurement, the eye tracker was calibrated for each subject at the beginning of the experiment, but a deviation of several millimeters was confirmed when checking the operation in Unity. As future work, the experimental equipment should be improved to increase the accuracy. In addition, although this study examined the gaze point coordinates while grasping the object, it is also necessary to measure the eye motion when the subject does not grasp, and to compare the gaze motion for more distinctive objects, such as bilaterally asymmetric ones, with the results of this experiment.