
Gait Measurement System for the Multi-Target Stepping Task Using a Laser Range Sensor

1. School of Science for Open and Environmental Systems, Graduate School of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama 223-8522, Japan
2. Department of Physical Therapy, Human Health Sciences, Graduate School of Medicine, Kyoto University, 53 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto 606-8507, Japan
3. Graduate School of Comprehensive Human Sciences, University of Tsukuba, 3-29-1 Otsuka, Bunkyo-ku, Tokyo 112-0012, Japan
4. Research & Development Division, Murata Machinery, Ltd., 136 Takeda-Mukaishiro-cho, Fushimi-ku, Kyoto 612-8686, Japan
5. Department of System Design Engineering, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama 223-8522, Japan
* Author to whom correspondence should be addressed.
Academic Editor: Oliver Amft
Sensors 2015, 15(5), 11151-11168; https://doi.org/10.3390/s150511151
Received: 9 March 2015 / Revised: 2 May 2015 / Accepted: 8 May 2015 / Published: 13 May 2015
(This article belongs to the Special Issue Sensor Systems for Motion Capture and Interpretation)

Abstract

For the prevention of falling in the elderly, gait training has been proposed using tasks such as the multi-target stepping task (MTST), in which participants step on assigned colored targets. This study presents a gait measurement system using a laser range sensor for the MTST to evaluate the risk of falling. The system tracks both legs and measures general walking parameters such as stride length and walking speed. Additionally, it judges whether the participant steps on the assigned colored targets and detects cross steps to evaluate cognitive function. However, situations in which one leg is hidden from the sensor or the legs are close occur and are likely to lead to losing track of the legs or false tracking. To solve these problems, we propose a novel leg detection method with five observed leg patterns and global nearest neighbor-based data association with a variable validation region based on the state of each leg. In addition, methods to judge target steps and detect cross steps based on leg trajectory are proposed. From the experimental results with the elderly, it is confirmed that the proposed system can improve leg-tracking performance, judge target steps and detect cross steps with high accuracy.
Keywords: gait measurement; laser range sensor; Kalman filter; data association

1. Introduction

Falling is a leading cause of unintentional injury and death in the elderly [1,2] and can also result in impaired mobility, disability, fear of falling and reduced quality of life [3,4,5]. Unsurprisingly, the prevention of falls in the elderly is a public health priority in many countries across the world [6,7,8]. Falling is a common problem in the growing elderly population and there is a need for effective and convenient fall risk assessment tools that can be used in community-based fall prevention programs. Falling occurs in various situations of daily life and generally results from an interaction of multiple and diverse risk factors [1,2,9,10]. Recently, it has been reported that elderly people at high risk of falling show decreases in dual-task performance, i.e., in performing motor and cognitive tasks simultaneously [11,12,13,14]. To prevent falling in the elderly, gait training tasks have been proposed that enhance both motor and cognitive function. One example is the multi-target stepping task (MTST), shown in Figure 1, in which participants step on assigned colored targets arranged randomly on a mat [15]. The MTST evaluates motor function based on the stride length of each leg and the walking speed. Additionally, the MTST judges whether the participant steps on the assigned colored targets (target step judgment) and detects any cross steps (cross step detection) to evaluate cognitive function. The cross step is a behavior where the swinging leg crosses against the supporting leg as shown in Figure 1b. It has been reported that the proportion of missteps on the assigned colored target of high-risk elderly is higher than that of low-risk elderly in the MTST. Moreover, it has been confirmed that cross steps are likely to be seen during a turn when high-risk elderly people perform the MTST [16]. 
This gait training task requires a gait measurement system to quantitatively measure these parameters for the evaluation of the participant’s dual-task performance capability. To measure these walking parameters and evaluate the risk of falling using the MTST, a measurement system that can measure the foot contact time and position across several meters is required. Furthermore, it is desirable to measure not only the foot contact positions but also the trajectory of both legs during the swing phase.
In many cases, force plates [12,17] or three-dimensional motion measuring devices [18,19] have been used to measure walking parameters such as stride length and walking speed with high reliability. Force plates can assess dynamic balance function and foot contact time and position. However, to measure walking parameters in a several-meter walking test such as the MTST, the measurement system must be configured with several force plates, which is expensive. Three-dimensional motion measuring devices such as the VICON system can capture and analyze the motion of participants with high accuracy. However, the scale of the whole system becomes larger than the measurement area because of the sensing range of the IR cameras. In addition, it is necessary to attach markers to the participants to capture and analyze their gait. In actual community health centers [20], a non-contact measurement system is desirable because many participants must be assessed in a short time.
Figure 1. (a) An appearance of the multi-target stepping task and proposed gait measurement system; (b) Cross step.
In terms of cost, scale and convenience, these devices are difficult to install in community health centers. In practice, therefore, the effects of this training are assessed by observation in actual community health centers, making it difficult to evaluate the capability of the participants quantitatively.
To overcome these problems, an ultrasonic sensor, a laser range sensor (LRS) [21] or an RGB-Depth sensor such as the Microsoft Kinect [22] can be used. These devices are comparatively small and inexpensive. Several methods of tracking people’s center of gravity using these devices have been proposed [23,24,25,26,27,28,29]. To measure the walking parameters, the system has to track both legs and obtain their positions. A method to track both legs and measure walking parameters based on the two-dimensional distance data from an LRS has been proposed and verified in straight walking tests [30,31]. Several methods to obtain the posture of a pedestrian based on RGB-Depth data have also been proposed [32,33,34]. However, in gait training, to reduce the risk of falling during the MTST, a nursing attendant walks alongside the participant, and participants who normally use a stick walk with it. Additionally, both legs can be close to each other because of a narrow stride, or one leg can be hidden from the sensor owing to the increased number of cross steps in the high-risk elderly. These situations are likely to lead to false tracking or to losing track of the legs entirely. A method to detect and track the legs based on RGB-Depth data even in cluttered environments has been proposed [35]. To measure walking parameters in several-meter walking tests such as the MTST, the sensor must provide high-accuracy distance data over a wide range. Moreover, to assess the fall risk of elderly people during the MTST, methods to judge target steps and detect cross steps are required.
In this study, we develop a gait measurement system using a laser range sensor (LRS) [21] as shown in Figure 1a. The LRS is a comparatively small and inexpensive device and can obtain high accuracy two-dimensional distance data over a wide range. To reduce the number of occurrences of lost tracking of legs and of false tracking, we propose a novel leg detection method with five observed leg patterns and global nearest neighbor (GNN)-based [36,37] data association with a variable validation region based on the state of each leg. In addition, we propose methods to judge target steps and detect cross steps based on the trajectory of the legs. Comparing the experimental results of the MTST with the video analysis, we confirmed that the proposed system can improve leg-tracking performance in the elderly, judge target steps and detect cross steps. We also confirmed the validity of walking parameters such as foot contact time and position obtained by the proposed system from the results of the target step judgment.

2. Gait Measurement System

2.1. Configuration

As shown in Figure 1a, the system consists of an LRS, a personal computer, and two calibration poles. In the system, the LRS is installed at shin height (0.27 m in our system) and captures distance data by scanning a single laser beam in a horizontal plane. The personal computer acquires data from the LRS and calculates the leg positions.

2.2. Algorithm

As shown in Figure 2, the system has two main processes. The first process is leg detection and tracking. The positions of the legs are calculated based on the proposed leg patterns from LRS scan data. In the proposed system, tracking of the legs is carried out based on a Kalman filter. In addition, the data association (one-to-one matching of a tracked leg and an observed position with an LRS) has been implemented for reliable tracking [37]. In the data association, a validation region is used for eliminating unlikely observation-to-track associations [23]. A validation region is constructed around the predicted position. In this study, GNN-based [36,37] data association with a variable validation region based on the state of each leg is proposed. The second process is extraction of the walking performance parameters of the MTST (foot contact time and position, target step judgment and cross step detection) based on the trajectory of the legs.
Figure 2. Algorithm of the gait measurement system using an LRS.
Before walking measurement, the system measures the participant’s leg width wl at shin height, as shown in Figure 3, and aligns the mat and LRS using the two poles, following the procedure in reference [38].
Figure 3. Leg detection using five observed leg patterns; (a) SL pattern; (b) LT pattern; (c) FS_O pattern; (d) FS_U pattern; (e) UO pattern.

2.3. Leg Detection

This study presents a novel leg detection method to calculate observed leg positions based on the leg width $w_l$ and five observed leg patterns. To calculate the leg positions, the system searches for edges $e_m^h$ ($m = 1, \dots, M_k$) in the LRS scan data using the following condition:

$$ \left| l_i - l_{i+1} \right| > w_l / 2 $$

where $l_i$ is the $i$-th laser-scanned distance datum, counted from the right of the LRS. The detected edges are labelled $e_m^B = i$, $e_{m+1}^F = i + 1$ when $l_i > l_{i+1}$, and $e_m^F = i$, $e_{m+1}^B = i + 1$ when $l_i < l_{i+1}$ ($h = F, B$, where $F$ and $B$ indicate the forward and backward edges, respectively). $M_k$ is the total number of edges detected at time step $k$. As shown in Figure 3, the system calculates the observed leg positions $y_k^j$ ($j = 1, \dots, J$) by considering five observed leg patterns, based on the spatial relationship of the edges and the width $w_e$ between them. The five observed leg patterns are SL (Single Leg), LT (Legs Together), FS_O (Forward Straddle Observable), FS_U (Forward Straddle Unobservable) and UO (Unobservable).
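As a concrete illustration, the edge search above can be sketched in Python as follows (a minimal sketch; the function name and the list-based scan representation are our own):

```python
def detect_edges(scan, w_l):
    """Search one LRS scan for edges where adjacent distance readings
    differ by more than w_l / 2, labelling them forward (F) or backward (B).
    scan: distances l_i indexed from the right of the sensor; w_l: leg width (m)."""
    edges = []
    for i in range(len(scan) - 1):
        if abs(scan[i] - scan[i + 1]) > w_l / 2:
            if scan[i] > scan[i + 1]:
                edges.append(('B', i))       # e_m^B = i
                edges.append(('F', i + 1))   # e_{m+1}^F = i + 1
            else:
                edges.append(('F', i))       # e_m^F = i
                edges.append(('B', i + 1))   # e_{m+1}^B = i + 1
    return edges
```

For example, a leg at 1 m in front of a 2 m background produces a B/F pair on one side of the leg and an F/B pair on the other.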
SL is a pattern in which one leg alone is fully observable by the sensor. It is detected as an edge sequence $\{e_n^B, e_{n+1}^F, e_{n+2}^F, e_{n+3}^B\}$ with a width condition of $0.2 w_l < w_e \le 1.5 w_l$. As shown in Figure 3a, the observed position of the leg is calculated based on $w_l$.
LT is a pattern in which two legs are fully observable side by side. It is detected as an edge sequence $\{e_n^B, e_{n+1}^F, e_{n+2}^F, e_{n+3}^B\}$, $\{e_n^F, e_{n+1}^B, e_{n+2}^F, e_{n+3}^B\}$ or $\{e_n^B, e_{n+1}^F, e_{n+2}^B, e_{n+3}^F\}$ with a width condition of $1.5 w_l < w_e < 3.0 w_l$. As shown in Figure 3b, the observed positions are calculated assuming that the two legs are side by side.
FS_O is a pattern in which one leg is observed as a stepped shape owing to the influence of the other leg or a stick. It is detected as an edge sequence $\{e_n^F, e_{n+1}^B, e_{n+2}^F, e_{n+3}^B\}$ or $\{e_n^B, e_{n+1}^F, e_{n+2}^B, e_{n+3}^F\}$ with a width condition of $0.5 w_l \le w_e < 1.5 w_l$. As shown in Figure 3c, the observed position is calculated in the same way as in the SL pattern.
FS_U is a pattern arising in a similar situation to FS_O, detected as an edge sequence $\{e_n^F, e_{n+1}^B, e_{n+2}^F, e_{n+3}^B\}$ or $\{e_n^B, e_{n+1}^F, e_{n+2}^B, e_{n+3}^F\}$ with a width condition of $0.2 w_l < w_e < 0.5 w_l$. However, the position of the leg cannot be calculated directly. Thus, as shown in Figure 3d, the observed position is calculated virtually based on the leg width $w_l$.
UO is a pattern in which one leg is unobservable because of occlusion. In particular, even when the leg is not fully observable by the sensor, calculating the position of the tracked leg with the FS_U pattern can be expected to improve estimation accuracy and tracking performance.
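The edge-sequence and width conditions above can be combined into a single classifier. The following Python sketch (function name ours) returns the pattern label for one candidate group of four consecutive edges:

```python
def classify_pattern(seq, w_e, w_l):
    """Classify a group of four consecutive edge types (tuple of 'F'/'B')
    with observed width w_e into one of the five leg patterns."""
    stepped = {('F', 'B', 'F', 'B'), ('B', 'F', 'B', 'F')}
    if seq == ('B', 'F', 'F', 'B') and 0.2 * w_l < w_e <= 1.5 * w_l:
        return 'SL'    # one leg fully observable
    if (seq == ('B', 'F', 'F', 'B') or seq in stepped) and 1.5 * w_l < w_e < 3.0 * w_l:
        return 'LT'    # two legs side by side
    if seq in stepped and 0.5 * w_l <= w_e < 1.5 * w_l:
        return 'FS_O'  # stepped shape, position still computable
    if seq in stepped and 0.2 * w_l < w_e < 0.5 * w_l:
        return 'FS_U'  # stepped shape, position computed virtually
    return 'UO'        # no valid pattern: leg unobservable
```

Note that FS_O and FS_U share the same edge sequences and are distinguished only by the width $w_e$, while SL and LT are separated by both the sequence and the width band.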

2.4. Leg Tracking

This study presents a novel leg tracking method using a Kalman filter and GNN-based data association with a variable validation region based on the state of each leg. If the sampling time ∆t (0.05 s in our system) is sufficiently shorter than the gait cycle time, we assume that the change in velocity at the next time step is not very large. The discrete time model of leg motion is given as follows:
$$ x_k^f = A x_{k-1}^f + B \Delta x_{k-1}^f \quad (f = L, R) $$

where

$$ A = \begin{bmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad B = \begin{bmatrix} \Delta t^2/2 & 0 \\ 0 & \Delta t^2/2 \\ \Delta t & 0 \\ 0 & \Delta t \end{bmatrix}, $$

and $x_k^f = [x_k^f \; y_k^f \; \dot{x}_k^f \; \dot{y}_k^f]^T$. $(x_k^f, y_k^f) =: p_k^f$ is the estimated position and $(\dot{x}_k^f, \dot{y}_k^f) =: v_k^f$ is the estimated velocity of the leg ($f = L, R$, where $L$ and $R$ indicate the left and right legs, respectively). $\Delta x_k^f = [n_{\dot{x}_k} \; n_{\dot{y}_k}]^T$ is the acceleration disturbance vector, assumed to be a zero-mean white noise sequence with variance $Q$. We set the variance as $Q = \mathrm{diag}[(5.0)^2, (5.0)^2]$, considering that the leg accelerates and decelerates between 0.0 and 2.5 m/s during the swing phase (about 1.0 s) in the experiments. The LRS obtains the leg position $y_k^f = [x_k^f \; y_k^f]^T$. The measurement model is as follows:

$$ y_k^f = C x_k^f + w_k $$

where $C = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}$ and $w_k = [n_{x_k} \; n_{y_k}]^T$ is the measurement noise, assumed to be a zero-mean white noise sequence with variance $R$. In our experiments, we set the variance as $R = \mathrm{diag}[(w_l/2)^2, (w_l/2)^2]$, considering that the LRS measures the distance within this error and that the observed leg position is calculated from the leg width $w_l$.
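For concreteness, the discrete-time leg-motion model above can be exercised with a small pure-Python sketch (list-based matrices; the names are our own):

```python
DT = 0.05  # sampling time Δt (s), as used in the system

# State [x, y, x_dot, y_dot]; disturbance input [n_xdot, n_ydot].
A = [[1, 0, DT, 0],
     [0, 1, 0, DT],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]
B = [[DT ** 2 / 2, 0],
     [0, DT ** 2 / 2],
     [DT, 0],
     [0, DT]]

def step(x, dx):
    """One step of the leg-motion model: x_k = A x_{k-1} + B Δx_{k-1}."""
    return [sum(A[i][j] * x[j] for j in range(4)) +
            sum(B[i][j] * dx[j] for j in range(2)) for i in range(4)]
```

Under zero disturbance, a leg moving at 1 m/s advances 0.05 m per step, matching the constant-velocity assumption between samples.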

2.4.1. Prediction

As shown in Figure 4a, based on the model of leg motion, the system predicts the position of the tracked leg by:
$$ \hat{y}_{k/k-1}^f = C \hat{x}_{k/k-1}^f = C A \hat{x}_{k-1/k-1}^f $$

where $\hat{x}_{k/k-1}^f$ is the a priori state estimate at time step $k$ and $\hat{x}_{k-1/k-1}^f$ is the a posteriori state estimate at time step $k-1$.
Figure 4. Leg tracking using validation regions considering the state of each leg; (a) Prediction; (b) Data association; (c) Correction.

2.4.2. Data Association

As shown in Figure 4b, a validation region is constructed around the predicted position to eliminate unlikely observation-to-track associations. The $j$-th ($j = 1, \dots, J$) observed position $y_k^j$ is included in the validation region of the predicted position $\hat{y}_{k/k-1}^f$ of the tracked leg when:

$$ \left\| y_k^j - \hat{y}_{k/k-1}^f \right\| < r_{val}^f $$

where $r_{val}^f$ ($f = L, R$) is the radius of the validation region. The measurement accuracy changes with the velocity of the leg and with whether the leg is moving while hidden; in these situations, losing track of the leg or falsely tracking the other leg or a stick is likely to occur. To solve these problems, the radius of the validation region is designed according to the state of each leg: the gait phase (whether the leg is in the stance phase or swing phase), its speed, and the number of time steps for which it has been unobservable. The resulting radii are shown in Table 1, where $H_k^f$ is the number of consecutive time steps up to $k$ at which no observed position fell inside the validation region (observed leg pattern UO). $v_{sw}$ and $v_{st}$ are the assumed speeds of the leg in the swing and stance phases while it is hidden. In our experiments, $v_{sw}$ and $v_{st}$ were set to 1.1 m/s and 0.55 m/s, respectively, considering that the average human walking speed is about 1.1 m/s.
Table 1. Setting of the radius of validation region r val f considering the state of each leg.
| Gait Phase at $k-1$ | Observable | Unobservable |
| --- | --- | --- |
| Stance phase | $\frac{3}{4} w_l + \Delta t \, v_{k-1}^f$ | $\frac{3}{4} w_l + \Delta t \, H_k^f v_{st}$ |
| Swing phase | $w_l + \Delta t \, v_{k-1}^f$ | $w_l + \Delta t \, v_{k-1}^f + \Delta t \, H_k^f v_{sw}$ |
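Table 1 can be read directly as a small decision function. The sketch below (names ours, default constants taken from the text) returns $r_{val}^f$ given the state of the leg at $k-1$:

```python
def validation_radius(phase, observable, v_prev, h_k,
                      w_l=0.1, dt=0.05, v_st=0.55, v_sw=1.1):
    """Radius of the validation region per Table 1.
    phase: 'stance' or 'swing' at k-1; observable: whether the leg was
    observed; v_prev: estimated leg speed at k-1 (m/s); h_k: H_k^f, the
    number of consecutive steps with no observation inside the region."""
    if phase == 'stance':
        # Stance: small base radius; grow with hidden time if unobservable.
        return 0.75 * w_l + (dt * v_prev if observable else dt * h_k * v_st)
    # Swing: larger base radius; add a hidden-motion term if unobservable.
    r = w_l + dt * v_prev
    return r if observable else r + dt * h_k * v_sw
```

The stance-phase radius stays tight so that the other leg is unlikely to fall inside it, while the swing-phase radius expands with the leg speed and with how long the leg has been hidden.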
Then, the following cost matrix D is defined for observation-to-track associations:
$$ D = \begin{bmatrix} d_{L,1} & d_{L,2} & \cdots & d_{L,J} \\ d_{R,1} & d_{R,2} & \cdots & d_{R,J} \end{bmatrix} $$

The element $d_{f,j}$ of the cost matrix is the matching cost between the predicted position $\hat{y}_{k/k-1}^f$ of the tracked leg and the $j$-th observed position $y_k^j$:

$$ d_{f,j} = \begin{cases} \lambda_{f,j} & \text{if } y_k^j \text{ is in the validation region of } \hat{y}_{k/k-1}^f \\ \infty & \text{otherwise} \end{cases} $$

$\lambda_{f,j}$ is the Mahalanobis distance, calculated as follows:

$$ \lambda_{f,j} = \left( y_k^j - \hat{y}_{k/k-1}^f \right)^T \left( S_k^f \right)^{-1} \left( y_k^j - \hat{y}_{k/k-1}^f \right) $$

where $S_k^f$ is the covariance of the innovation $\left( y_k^j - \hat{y}_{k/k-1}^f \right)$. The data association is chosen so that the summed cost over $D$ is minimized [36].
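Because only two tracks exist (the left and right leg), the cost-minimizing assignment can be found by exhaustive search over observation pairs. The Python sketch below illustrates the idea; the large miss cost for leaving a track unassigned is our own hypothetical device, not a value from the paper:

```python
MISS = 1e6  # hypothetical cost for leaving a track without an observation

def gnn_associate(D):
    """Global nearest neighbor for the two leg tracks (rows: L, R).
    D[f][j] is the matching cost, or float('inf') if observation j lies
    outside track f's validation region. Returns (j_L, j_R), where None
    means the track keeps its prediction (UO case)."""
    J = len(D[0])
    best_cost, best = float('inf'), (None, None)
    for jl in [None] + list(range(J)):
        cl = MISS if jl is None else D[0][jl]
        for jr in [None] + list(range(J)):
            if jl is not None and jr == jl:
                continue  # enforce one-to-one matching
            cr = MISS if jr is None else D[1][jr]
            if cl + cr < best_cost:
                best_cost, best = cl + cr, (jl, jr)
    return best
```

Gated-out pairs carry infinite cost, so a track whose whole row is infinite is left unassigned rather than forced onto a distant observation.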

2.4.3. Correction

Finally, as shown in Figure 4c, based on the result of the data association, the state estimation vector is updated using the Kalman filter. If there are no corresponding observed positions in the validation region, the predicted position y ^ k/k1 f is used as an observed position and the observed leg pattern is assumed to be UO.

2.4.4. Gait Phase Identification

From validation against a force plate [38], the gait phase (stance phase or swing phase) can be identified from the speeds of both legs during walking. The right leg is judged to be in the stance phase when:

$$ v_k^R < v_k^L \ \text{ and } \ v_k^R < v_{st\_th} $$

and in the swing phase when:

$$ v_k^R > v_k^L \ \text{ and } \ v_k^R > v_{sw\_th} $$

where $v_{st\_th}$ and $v_{sw\_th}$ are the thresholds on the maximum speed in the stance phase and the minimum speed in the swing phase, respectively. In our experiments, $v_{st\_th}$ and $v_{sw\_th}$ were set to one sixth (0.18 m/s) and one third (0.37 m/s) of the average human walking speed (1.1 m/s). The gait phase of the left leg is identified in the same way.
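The two threshold rules can be sketched as follows. This is a minimal sketch: keeping the previous phase when neither condition fires is our own assumption, not stated in the text.

```python
V_AVG = 1.1           # average human walking speed (m/s)
V_ST_TH = V_AVG / 6   # stance threshold, ≈ 0.18 m/s
V_SW_TH = V_AVG / 3   # swing threshold, ≈ 0.37 m/s

def gait_phase(v_self, v_other, prev_phase):
    """Identify 'stance' or 'swing' for one leg from both leg speeds.
    If neither condition holds, the previous phase is kept (assumption)."""
    if v_self < v_other and v_self < V_ST_TH:
        return 'stance'
    if v_self > v_other and v_self > V_SW_TH:
        return 'swing'
    return prev_phase
```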
With the proposed data association method, the chances of losing a tracked leg are expected to be reduced even if the velocity of the leg changes suddenly. The chances of falsely tracking other observed objects, such as the other leg or a stick, are also expected to be reduced, because such objects are unlikely to fall inside the smaller validation region. In addition, the variable validation region remains effective even when the leg is moving while hidden from the sensor.

2.5. Walking Parameters Extraction

2.5.1. Foot Contact Position Extraction

In this study, the foot contact time is defined as the time when the bottom of the foot is in contact with the floor and the leg is perpendicular to the floor. As shown in Figure 5, the speed of the leg at shin height, as scanned by the LRS, reaches its minimum during the stance phase. Therefore, the foot contact time is extracted as the time when the leg speed is at its minimum in the stance phase. The foot contact position can then be acquired as the estimated position at shin height at the foot contact time, because the leg is almost perpendicular to the floor at that moment.
Figure 5. Image of the gait speed diagram during walking.
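A minimal sketch of this extraction, assuming per-sample times, leg speeds and already-identified gait phases are available (names ours):

```python
def foot_contact_times(times, speeds, phases):
    """Return the time of minimum leg speed within each stance interval,
    taken as the foot contact time for that interval."""
    contacts, start = [], None
    # Append a 'swing' sentinel so a trailing stance run is also closed.
    for k, ph in enumerate(list(phases) + ['swing']):
        if ph == 'stance' and start is None:
            start = k                       # stance interval begins
        elif ph != 'stance' and start is not None:
            k_min = min(range(start, k), key=lambda i: speeds[i])
            contacts.append(times[k_min])   # minimum-speed sample = contact
            start = None
    return contacts
```

The corresponding foot contact position would then be read from the estimated leg trajectory at each returned time.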

2.5.2. Target Step Judgment

Figure 6a shows examples of the observed leg position when the participant stepped around the target (target size: 0.160 m × 0.165 m). Based on these experimental results and the leg model built from average physical data shown in Figure 6b, the region for target step judgment was designed as shown in Figure 6c. The system judges that the participant stepped on the assigned colored target if the foot contact position is included in this region.
Figure 6. Target step judgment; (a) Examples of the results of observed leg position; (b) Leg model; (c) Region of the target step judgment.
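As an illustration, if the judgment region of Figure 6c is approximated by a rectangle centred on the target, the check reduces to the sketch below. The half-widths are hypothetical placeholders, not the values actually designed in the paper:

```python
HALF_W, HALF_H = 0.12, 0.12  # hypothetical region half-sizes (m)

def on_target(foot_xy, target_xy):
    """Judge a target step: the foot contact position lies inside a
    rectangular judgment region centred on the assigned target."""
    return (abs(foot_xy[0] - target_xy[0]) <= HALF_W and
            abs(foot_xy[1] - target_xy[1]) <= HALF_H)
```

The actual region shape was derived from the observed leg positions and the leg model, so in practice it is somewhat larger than the printed target itself.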

2.5.3. Cross Step Detection

As shown in Figure 7, from preliminary experimental data with the elderly, the characteristic relationship between the trajectory of the swinging leg and the foot contact position of the supporting leg was confirmed when the participant performed a cross step. This study presents a method of detecting cross steps based on this relationship.
Figure 7. Cross step detection.
As shown in Figure 7, an $x$–$y$ coordinate system was defined whose origin is the previous foot contact position of the swinging leg (the right leg in this case). In this coordinate system, the foot contact position of the supporting leg (the left leg in this case) is $p_{st}^L = [x^L \; y^L]^T$, and the $k_{sw}$-th ($k_{sw} = 1, \dots, K_{sw}$) position of the swinging leg after its previous foot contact is $p_{k_{sw}}^R = [x_{k_{sw}}^R \; y_{k_{sw}}^R]^T$, where $K_{sw}$ is the number of samples in the swing phase. The system detects a cross step if these parameters satisfy the following condition:

$$ y^L < w_c \ \text{ and } \ y_{k_{sw}}^R < w_c \quad (k_{sw} = 1, \dots, K_{sw}) $$

where $w_c$ is the threshold of cross step detection; we determined $w_c = w_l/2$ from the experimental results. Cross step detection for the left leg is performed in the same way. The system performs this check at every foot contact and records the number and positions of the detected cross steps.
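In code, the condition amounts to checking the supporting-leg contact and every swing-phase sample against the threshold. A minimal sketch (names ours; coordinates as defined above, with the origin at the swinging leg's previous contact):

```python
W_L = 0.1        # leg width (m)
W_C = W_L / 2    # cross step detection threshold w_c

def is_cross_step(y_support, swing_ys):
    """Cross step detected if the supporting leg's contact y and every
    swing-phase y of the swinging leg satisfy y < w_c."""
    return y_support < W_C and all(y < W_C for y in swing_ys)
```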

3. Experiments

3.1. Participants and Environment

Sixteen elderly volunteers (eleven men, five women, mean age 78.1 ± 8.7 years), including two elderly people using a stick, were recruited as participants for this study. None of them had any of the following symptoms: serious visual impairment, inability to ambulate independently, symptomatic cardiovascular disease, or severe arthritis. Informed consent was obtained from all volunteers prior to participation, in accordance with the guidelines approved by the Kyoto University Graduate School of Medicine (approval number E-880) and the 1975 Declaration of Helsinki.
Table 2 shows the specifications of the LRS (UTM-30LX, Hokuyo Automatic Co., Ltd., Osaka, Japan [21]). The sampling time of the system ∆t was set to 0.05 s. The MTST mat was 5.85 m long by 1.15 m wide, and targets of three colors (red, blue and white; 0.160 m × 0.165 m) were arranged randomly on it. As shown in Figure 1a, participants walked from the start position to the goal position stepping on the assigned colored targets, performing three trials (one for each color). To avoid the risk of falling during the MTST, a nursing attendant walked alongside the participant.
Table 2. Specifications of the UTM-30LX LRS ([21]).
| Detection Range | 0.1–30 m (max. 60 m) |
| --- | --- |
| Scanning Angle | 270° |
| Measurement Accuracy | 0.1–10 m: ±0.03 m; 10–30 m: ±0.05 m |
| Angular Resolution | 0.25° (360°/1440) |

3.2. Verification of Leg Tracking

To verify the effectiveness of the proposed leg tracking method, it was compared with three conventional methods, labelled 1 to 3 (see Table 3 for definitions). Method 1 used a conventional leg detection method excluding the FS_U pattern [27]; Methods 2 and 3 used the proposed leg detection method including the FS_U pattern. For each comparison method, we set a fixed validation region sized from the observation error and the distance the leg moves in one sampling period, without prediction. Methods 1 and 2 used a large fixed validation region ($r_{val}^f = \frac{3}{2} w_l$); the average leg width $w_l$ at shin height is about 0.1 m. Here we assumed an observation error of $w_l/2$ and a moving distance of $w_l$, considering that the leg speed in the swing phase is about twice the average human walking speed (1.1 m/s) and that the sampling time is 0.05 s in this system. Method 3 used a small fixed validation region ($r_{val}^f = w_l$), assuming an observation error of $w_l/2$ and a moving distance of $w_l/2$, considering that the leg speed in the stance phase is about the same as the average human walking speed.
Figure 8 and Figure 9 show example leg-tracking results in those situations that are likely to lead to false tracking or losing track of the legs. In addition, Table 3 shows all 48 gait measurement results of 16 elderly people walking.
Figure 8 shows an example of the LRS data and gait measurement results in a situation where the right leg was temporarily hidden. As shown in Figure 8, the right leg was hidden by the left leg at time t = 23.80 s. In Method 1, which excluded the FS_U pattern from leg detection, the estimated position deviated significantly at t = 23.85 s because an accurate observed position could not be obtained at t = 23.80 s; the system therefore lost track of the right leg. In Method 2, which used the FS_U pattern for leg detection, even though the leg was not fully observable at t = 23.80 s, calculating the position of the tracked leg with the FS_U pattern allowed the system to obtain an accurate estimated position at t = 23.85 s. The system could therefore keep track of the right leg.
Figure 8. Example of leg tracking results in a situation where the right leg of the participant was temporarily hidden; (a) Method 1: conventional leg detection excluding the FS_U pattern; (b) Method 2: the proposed leg detection using the FS_U pattern.
Figure 9 shows an example of the LRS data and gait measurement results in a situation in which both legs were close together. As shown in Figure 9, in the data association of both Method 2 and the proposed method, the observed position of the right leg fell outside the right leg's validation region at time t = 7.55 s.
In Method 2, with its large fixed validation region ($r_{val}^f = \frac{3}{2} w_l$), the right validation region included the observed position of the left leg, and the left validation region included the observed positions of both legs. In this situation, false tracking occurred with the GNN algorithm, switching the left and right legs. To avoid such switching, the validation region should be set smaller. However, as shown in Table 3, in Method 3 with a small fixed validation region ($r_{val}^f = w_l$), losing track of the leg is likely when the velocity of the leg changes suddenly or the leg moves while hidden from the sensor. In the proposed method, with variable validation regions based on the state of each leg, the observed position of the right leg was excluded from the small validation region of the left leg because the left leg was in the stance phase at t = 7.55 s. The right validation region was then expanded because no corresponding observed position existed within it, and the system re-detected the right leg at t = 7.65 s.
Figure 9. Example of leg tracking results in a situation where both of the participant’s legs were close together; (a) Method 2: the radius of the large fixed validation region was used; (b) Proposed method: the radius of the validation region was changed depending on the state of each leg.
As shown in Table 3, it was confirmed that the proposed leg detection and data association method can reduce the number of occurrences of lost tracking of legs and of false tracking.
Table 3. Leg tracking results for each method.
Participants not using a stick: 14 people, 42 trials; participants using a stick: 2 people, 6 trials.

| Method | Five Observed Leg Patterns | Radius of Validation Region $r_{val}^f$ | Lost Tracks (No Stick) | False Tracks (No Stick) | Lost Tracks (Stick) | False Tracks (Stick) | Success Rate |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Method 1 | No | $\frac{3}{2} w_l$ | 6 | 5 | 3 | 0 | 70.8% (34/48) |
| Method 2 | Yes | $\frac{3}{2} w_l$ | 1 | 1 | 0 | 2 | 91.7% (44/48) |
| Method 3 | Yes | $w_l$ | 14 | 0 | 1 | 2 | 64.6% (31/48) |
| Proposed | Yes | Variable | 0 | 0 | 0 | 2 | 95.8% (46/48) |

3.3. Verification of Walking Parameters Extraction

To verify the validity of the target step judgment and cross step detection of the proposed system, we recorded performance on the MTST using video cameras and compared our results with those obtained using video analysis. Additionally, we verified the validity of the foot contact time and position using the results of the target step judgment.
Table 4 shows the results of target step judgment and cross step detection from the 46 successfully tracked data series, compared with video analysis. Figure 10 shows an example of the leg trajectory results. In Figure 10, a large “O” symbol is displayed at the foot contact position where the system judged a target step, and a large “+” symbol where it detected a cross step. As shown in Figure 10 and Table 4, the proposed system judged target steps with very high accuracy (success rate: 99.0%), even including the participants using a stick. The high accuracy of the target step judgment also confirms the validity of the foot contact time and position obtained by the proposed system. We further confirmed that the proposed system could detect cross steps with high accuracy (success rate: 78.9%).
Table 4. Results of target step judgment and cross step detection (participants not using a stick: 381 steps, 16 cross steps; participants using a stick: 33 steps, 3 cross steps; total: 414 steps, 19 cross steps).

| | Non-Judgments / Non-Detections (No Stick) | Misjudgments / False Detections (No Stick) | Non-Judgments / Non-Detections (Stick) | Misjudgments / False Detections (Stick) | Success Rate | Misjudgment / False Detection Rate |
|---|---|---|---|---|---|---|
| Target step judgment | 4 | 3 | 0 | 3 | 99.0% (410/414) | 1.4% (6/414) |
| Cross step detection | 2 | 0 | 2 | 0 | 78.9% (15/19) | 0.0% (0/15) |
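The core of a target step judgment of this kind can be sketched as a proximity test between the estimated foot-contact position and the centers of the assigned colored targets. The function name and the 0.10 m acceptance radius below are hypothetical, not values from the paper:

```python
import math

def judge_target_step(contact_pos, assigned_targets, tolerance=0.10):
    """Return the index of the assigned target the foot landed on, or None.

    contact_pos: (x, y) estimated foot-contact position [m]
    assigned_targets: list of (x, y) centers of the colored targets [m]
    tolerance: hypothetical acceptance radius around each target center [m]
    """
    for i, center in enumerate(assigned_targets):
        # A contact within the tolerance of a target center counts as a hit.
        if math.dist(contact_pos, center) <= tolerance:
            return i
    return None
```

In practice the judged index would then be checked against the color sequence the participant was instructed to step on; a contact matching no assigned target is counted as a non-judgment or misjudgment.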
Figure 10. Example of gait measurement results.

4. Conclusions

This study presents a gait measurement system using a LRS for the MTST to evaluate the risk of falling. The system is advantageous over current systems for the MTST in terms of cost, scale and convenience of use. When elderly people at high risk of falling perform the MTST, one leg is often hidden from the sensor or the two legs come close together, which tends to cause lost tracks or false tracking. To solve these problems, we proposed a novel leg detection method with five observed leg patterns and GNN-based data association with a validation region whose radius varies with the state of each leg. In addition, based on the trajectories of both legs, we proposed methods to judge whether the participant steps on the assigned colored targets (target step judgment) and to detect the swinging leg crossing over the supporting leg (cross step detection).
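The cross-step criterion described above can be sketched as a simple lateral-crossing test on the swing-phase trajectory. This is an illustrative simplification, not the paper's detector: it assumes the lateral x axis increases toward the participant's left, so a left swing leg crossing to the right of the supporting leg satisfies x < support_x (and symmetrically for the right leg).

```python
def detect_cross_step(swing_traj_x, support_x, swing_is_left):
    """Flag a cross step: the swinging leg passes the supporting leg laterally.

    swing_traj_x: lateral (x) coordinates of the swinging leg during one swing phase
    support_x: lateral (x) coordinate of the supporting leg's foot contact
    swing_is_left: True if the swinging leg is the left leg
    Assumes x increases toward the participant's left.
    """
    if swing_is_left:
        # Left leg crossing to the right of the support leg.
        return any(x < support_x for x in swing_traj_x)
    # Right leg crossing to the left of the support leg.
    return any(x > support_x for x in swing_traj_x)
```

A real detector would also need a margin to ignore near-touching steps and a reliable swing/stance segmentation, which the tracked leg trajectories provide.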
To verify the validity of the proposed gait measurement system, we carried out the MTST with 16 elderly people, including two elderly people using a stick. Comparing the experimental results with video analysis, we confirmed that the proposed system could improve leg-tracking performance, judge target steps and detect cross steps. We also confirmed the validity of the foot contact time and position obtained by the proposed system from the results of the target step judgment. This gait measurement system may be helpful in assessing fall risk indicators in the elderly in community health centers.

Acknowledgments

This work was supported by Grant-in-Aid for Japan Society for the Promotion of Science (JSPS) Fellows Grant Number 25-5707 and JSPS KAKENHI Grant Number 25709015. We would like to thank the members of the Department of Human Health Sciences at Kyoto University for their help with data collection.

Author Contributions

A.Y. was involved in the design and development of the measurement system, data acquisition, analysis and interpretation, and drafted the manuscript. S.N. and M.Y. conducted the tests and were involved in experimental planning, data acquisition and analysis and revision of the manuscript. T.A. was involved in the concept, design and coordination of the study and revision of the manuscript. T.M. was involved in the concept, design and coordination of the study, data acquisition and revision of the manuscript. M.T. was involved in the concept, design and coordination of the study, development of the measurement system, data acquisition and revision of the manuscript. All the authors have read and approved the final version of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.
