
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license.

The motion of both feet is estimated for gait analysis. An inertial sensor is attached to each shoe, and an inertial navigation algorithm is used to estimate the movement of both feet. To correct the inter-shoe position error, a camera is installed on the right shoe and infrared LEDs are installed on the left shoe. The proposed system provides key gait analysis parameters such as step length, stride length, foot angle and walking speed. It also provides three-dimensional trajectories of both feet for gait analysis.

Gait analysis is the systematic study of human walking motion [

There are mainly two kinds of systems for gait analysis: outside observation systems and wearable sensor systems. In outside observation systems, a camera [

Various wearable sensors [

Recently, inertial sensors have received considerable attention as wearable sensors for gait analysis. There are two types of inertial sensor-based systems. In [

Inertial navigation algorithm-based foot motion analysis is in most cases [

In [

In this paper, we propose an inertial-sensor-based tracking system for the motion of both feet for gait analysis. An inertial sensor unit is installed on each shoe. The position and attitude between the two shoes are estimated using a camera on one shoe and infrared LEDs on the other shoe. Using the proposed system, the motion (position, velocity and attitude) of both feet can be estimated. We note that only the inter-shoe distance (a scalar quantity) is measured in [

The picture of the proposed system is given in

As we can see in

Five coordinate systems are used in the paper (see

A vector r ∈ ℝ³ expressed in the “A” coordinate system is sometimes denoted by [r]_{A}, with components [r]_{A} = [r_{1}, r_{2}, r_{3}]′.

Let [r_{1}]_{n} ∈ ℝ³ and [r_{2}]_{n} ∈ ℝ³ be the origins of the body 1 and body 2 coordinate systems, respectively, expressed in the navigation coordinate system. [r_{1}]_{n} and [r_{2}]_{n} represent the positions of the right and left feet, and the goal of the system is to estimate both [r_{1}]_{n} and [r_{2}]_{n}.

In the figure, [p_{c}]_{b1} ∈ ℝ³ denotes the origin of the camera coordinate system in the body 1 coordinate system, and [p_{l}]_{b2} denotes the origin of the LED coordinate system in the body 2 coordinate system. Note that [p_{c}]_{b1} and [p_{l}]_{b2} are constant vectors since the sensors are rigidly attached to the shoes.

[p_{l}]_{b1} and [p_{l}]_{c} denote the origin of the LED coordinate system expressed in the body 1 and camera coordinate systems, respectively, where [p_{l}]_{c} is estimated by the vision system.

From the vector relationship in

The origin of the LED coordinate system can be expressed in the navigation coordinate system as follows:

Inserting

In this section, an inertial navigation algorithm to estimate two feet motion is given. In Sections 3.1 and 3.2, a basic inertial navigation algorithm using an indirect Kalman filter is given. In Sections 3.3 and 3.4, measurement equations for the Kalman filter are given. In Section 3.5, an implementation issue of the proposed algorithm is discussed.

Let r̂_{1} and r̂_{2} be estimates of r_{1} and r_{2}. In this paper, we use the inertial navigation algorithm in [ ]. We only describe the algorithm for r̂_{1} since the algorithm for r̂_{2} is exactly the same.

Let v_{1} ∈ ℝ³ be the velocity of the right foot and q_{1} ∈ ℝ⁴ be the quaternion representing the rotation between the navigation and body 1 coordinate systems. It is standard [ ] that r_{1}, v_{1} and q_{1} satisfy the following:
where ω_{b1} ∈ ℝ³ is the angular rate of the body 1 coordinate system with respect to the navigation coordinate system and a_{b1} is the external acceleration acting on IMU 1. For a vector ω = [ω_{x} ω_{y} ω_{z}]′ ∈ ℝ³, Ω(ω) is defined by

The angular rate ω_{b1} and the external acceleration a_{b1} are measured using the gyroscopes and accelerometers in IMU 1. Let y_{g,1} ∈ ℝ³ and y_{a,1} ∈ ℝ³ be the gyroscope and accelerometer outputs of IMU 1; then y_{g,1} and y_{a,1} are given by
where g̃ ∈ ℝ³ is the Earth's gravitational vector and b_{g,1} ∈ ℝ³ is the gyroscope bias. The measurement noises v_{g,1} ∈ ℝ³ and v_{a,1} ∈ ℝ³ are assumed to be white Gaussian noises whose covariances are given by R_{g,1} and R_{a,1}, respectively.

Inserting the sensor outputs into the kinematic equations gives the navigation equations, where the rotation matrix C(q) for q = [q_{0} q_{1} q_{2} q_{3}]′ is defined by

Variables for the left foot (r_{2}, v_{2}, ω_{b2}, a_{b2}, y_{g,2}, y_{a,2}, b_{g,2}, v_{g,2} and v_{a,2}) are defined in the same way as for the right foot. q_{2} can be computed using
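As a concrete illustration, the strapdown integration described above can be sketched as follows. This is a minimal sketch, not the paper's implementation: the scalar-first quaternion convention, the z-up gravity vector, the function names and the simple Euler integration are our assumptions.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # navigation-frame gravity (assumed z-up)

def omega_matrix(w):
    """4x4 matrix Omega(w) in the quaternion kinematic equation q' = 0.5*Omega(w)*q
    (scalar-first quaternion convention, an assumption on our part)."""
    wx, wy, wz = w
    return np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0],
    ])

def quat_to_rotmat(q):
    """Rotation matrix (body -> navigation) from a scalar-first unit quaternion."""
    q0, q1, q2, q3 = q
    return np.array([
        [1 - 2*(q2**2 + q3**2), 2*(q1*q2 - q0*q3),     2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),     1 - 2*(q1**2 + q3**2), 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),     2*(q2*q3 + q0*q1),     1 - 2*(q1**2 + q2**2)],
    ])

def strapdown_step(r, v, q, y_g, y_a, b_g, dt):
    """One Euler-integration step of the strapdown equations: the bias-corrected
    gyro output drives the quaternion, and the accelerometer output is rotated
    to the navigation frame and gravity-compensated before integration."""
    w = y_g - b_g                           # bias-corrected angular rate
    q = q + 0.5 * omega_matrix(w) @ q * dt  # quaternion kinematics
    q = q / np.linalg.norm(q)               # renormalize to a unit quaternion
    C = quat_to_rotmat(q)
    a_n = C @ y_a + GRAVITY                 # external acceleration in nav frame
    v = v + a_n * dt
    r = r + v * dt
    return r, v, q
```

For a stationary, level IMU the accelerometer reads the reaction to gravity, so the gravity compensation cancels it and position and velocity remain unchanged.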

Mainly due to measurement noises, r̂_{1}, v̂_{1} and q̂_{1} (the position, velocity and attitude estimates of the right foot) contain errors. These errors are estimated using a Kalman filter. This kind of Kalman filter is called an indirect Kalman filter, since the errors in r̂_{1}, v̂_{1} and q̂_{1} are estimated instead of r_{1}, v_{1} and q_{1} directly.

Let r_{e,1}, v_{e,1}, q_{e,1} and b_{e,1} be the errors in r̂_{1}, v̂_{1}, q̂_{1} and b̂_{g,1}, which are defined by
Since the attitude error is small, q_{e,1} can be approximated as follows:
where the small-angle vector represents the attitude error q_{e,1}.

The multiplicative attitude error term _{e,}_{1} in

For the left foot, r_{e,2}, v_{e,2}, q_{e,2} and b_{e,2} can be defined similarly. If we combine the left and right foot variables, the state of the Kalman filter is defined by

The state space equation for one foot is that of a standard inertial navigation algorithm and is given in [ ], where w_{b,1} and w_{b,2} are introduced to represent a slow change in the bias terms. In the definition of the system matrix, the skew-symmetric matrix [a ×] for a = [a_{1} a_{2} a_{3}]′ ∈ ℝ³ is defined by

There are two measurement equations for the state

This section explains how the vision data is used in the Kalman filter.

There are eight infrared LEDs on the left foot as in

The typical infrared LED images during walking are given in

Let the coordinates of the LEDs in the LED coordinate system be [led_{i}]_{l} ∈ ℝ³ (1 ≤ i ≤ 8) and let (u_{i}, υ_{i}) ∈ ℝ² be the image coordinates of the eight LEDs on the normalized image plane, which are obtained by applying the camera calibration parameters [ ]. Each [led_{i}]_{l} is related to (u_{i}, υ_{i}) through the projection of [led_{i}]_{l}, expressed in the camera coordinate system, onto the normalized image plane.
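The relation between known LED coordinates and their normalized image coordinates is the standard pinhole projection, and the camera-to-LED pose can be found by minimizing the reprojection error. The sketch below illustrates the projection model and the residual; the LED layout and all names are hypothetical, for illustration only, not the paper's configuration.

```python
import numpy as np

# Hypothetical LED layout in the LED coordinate system (the paper uses eight
# LEDs on the left shoe; these coordinates in meters are made up for illustration).
LEDS_L = np.array([
    [0.00, 0.00, 0.00], [0.04, 0.00, 0.00], [0.08, 0.00, 0.00], [0.00, 0.03, 0.00],
    [0.04, 0.03, 0.01], [0.08, 0.03, 0.00], [0.02, 0.06, 0.01], [0.06, 0.06, 0.00],
])

def project(R_cl, p_l_c, leds_l):
    """Project LED positions into normalized image coordinates (u_i, v_i).
    R_cl: rotation from the LED frame to the camera frame;
    p_l_c: origin of the LED frame expressed in the camera frame."""
    X = leds_l @ R_cl.T + p_l_c   # LED positions in the camera frame
    return X[:, :2] / X[:, 2:3]   # pinhole projection on the normalized plane

def reprojection_residual(R_cl, p_l_c, leds_l, uv_measured):
    """Stacked reprojection error; the pose minimizing its norm is the
    vision-based estimate of the camera-to-LED transform."""
    return (project(R_cl, p_l_c, leds_l) - uv_measured).ravel()
```

In practice this minimization is a standard perspective-n-point (PnP) problem, solvable with a nonlinear least-squares iteration over the six pose parameters.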

To use

Let the estimated value of [p_{l}]_{c}, computed from the vision data, be denoted by y_{vision}.

Inserting

Assuming _{e,}_{1} and _{vision}

The left-hand side is denoted by y_{vision} ∈ ℝ³ and is used as a measurement equation in the Kalman filter:
where n_{vision} is the vision measurement noise.

Whenever the camera on the right foot captures the LEDs on the left foot,

During normal walking, a foot touches the floor almost periodically for a short interval. During this short interval, the velocity of the foot is zero; this interval is called a “zero velocity interval”.

The zero velocity interval is detected using accelerometers and gyroscopes [
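A common way to detect zero velocity intervals from accelerometer and gyroscope outputs is to require, over a short window of samples, that the acceleration magnitude stay close to gravity and the angular rate stay small. The sketch below illustrates this idea; the thresholds and window size are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

G = 9.81        # gravity magnitude (m/s^2)
ACC_TOL = 0.3   # accelerometer magnitude tolerance (assumed threshold)
GYRO_TOL = 0.15 # angular-rate threshold in rad/s (assumed)
WINDOW = 5      # number of consecutive samples required

def zero_velocity_interval(y_a, y_g):
    """Return a boolean array marking samples inside a zero velocity interval.
    A sample qualifies when, over a short window, the accelerometer magnitude
    stays close to gravity and the angular rate stays small."""
    acc_ok = np.abs(np.linalg.norm(y_a, axis=1) - G) < ACC_TOL
    gyro_ok = np.linalg.norm(y_g, axis=1) < GYRO_TOL
    ok = acc_ok & gyro_ok
    zv = np.zeros(len(ok), dtype=bool)
    # mark only samples covered by a full window of quiet measurements
    for i in range(len(ok) - WINDOW + 1):
        if ok[i:i + WINDOW].all():
            zv[i:i + WINDOW] = True
    return zv
```

During the detected interval, the zero-velocity measurement (and the known floor height) can be fed to the Kalman filter as a pseudo-measurement.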

We assume that the person is walking on a flat floor. Thus, during a zero velocity interval, the height of the right foot is a known constant z_{1,floor}.

The measurement equation for the zero velocity interval of the left foot is given by
where z_{2,floor} is defined in the same way as z_{1,floor}.

Here the implementation of the indirect Kalman filter is briefly explained. A detailed explanation for a similar problem can be found in [ ]. Let r_{1,k} denote the value of r_{1} at the discrete time index k.

The procedure to estimate r_{1,k}, v_{1,k}, q_{1,k}, r_{2,k}, v_{2,k} and q_{2,k} is as follows.

First, r̂_{1,k}, v̂_{1,k}, q̂_{1,k}, r̂_{2,k}, v̂_{2,k} and q̂_{2,k} are computed using the inertial navigation algorithm.

The time update step [

The measurement update step using

Using the estimated error state, r̂_{1,k}, v̂_{1,k}, q̂_{1,k} and b̂_{g,1} are updated as follows:

Similarly, r̂_{2,k}, v̂_{2,k}, q̂_{2,k} and b̂_{g,2} are updated.
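In an indirect Kalman filter, the estimated error state is folded back into the full state after each measurement update, with the attitude corrected multiplicatively by a small-angle error quaternion. The following sketch shows one common way to apply such a correction; the state ordering and the sign conventions depend on how the errors are defined, so treat this as an assumed illustration rather than the paper's exact update.

```python
import numpy as np

def quat_multiply(p, q):
    """Hamilton product of two scalar-first quaternions."""
    p0, p1, p2, p3 = p
    q0, q1, q2, q3 = q
    return np.array([
        p0*q0 - p1*q1 - p2*q2 - p3*q3,
        p0*q1 + p1*q0 + p2*q3 - p3*q2,
        p0*q2 - p1*q3 + p2*q0 + p3*q1,
        p0*q3 + p1*q2 - p2*q1 + p3*q0,
    ])

def apply_error_correction(r, v, q, b_g, dx):
    """Fold the estimated error state back into the full state.
    dx = [position error (3), velocity error (3), small-angle attitude
    error (3), gyro bias error (3)]; the attitude error is applied
    multiplicatively as a small error quaternion."""
    r_e, v_e, phi, b_e = dx[0:3], dx[3:6], dx[6:9], dx[9:12]
    r = r - r_e
    v = v - v_e
    dq = np.concatenate(([1.0], 0.5 * phi))  # small-angle error quaternion
    q = quat_multiply(dq, q)
    q = q / np.linalg.norm(q)                # keep the quaternion unit norm
    b_g = b_g + b_e
    return r, v, q, b_g
```

After the correction is applied, the error state is reset to zero and the filter proceeds to the next time step.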

After the update,

The discrete time index

In

For the left foot, measurement data are available in the area around (a) (zero velocity update) and (c)–(d) (vision data update). When measurement data are not available, the motion estimate depends on double integration of the acceleration, whose error tends to grow quickly even over a short time. To obtain a smooth motion trajectory, a forward-backward smoother (Section 8.5 in [

The smoother algorithm is applied to each walking step separately for the left and right foot movements. For example, consider the left foot movement between (a) and (e). After computing the forward Kalman filter (that is, the filter in Section 3.2) up to the point (e), the backward Kalman filter is computed from (e) to (a), with the final value of the forward Kalman filter as its initial value. Since the final value of the forward filter is used in the backward filter, the forward and backward filters become correlated, and thus the smoother is not optimal. However, we found that the smoothed output is good enough for our application.

Note that the backward filter provides an estimate r̂_{2,b,k} in addition to the forward filter estimate r̂_{2,f,k}. Consider one walking step from (a) to (e): the smoothed estimate r̂_{2,s,k} is computed as a weighted combination of the forward estimate r̂_{2,f,k} and the backward estimate r̂_{2,b,k}.

A smoother algorithm can be applied to the velocity and attitude similarly.
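One common way to combine the forward and backward filter outputs at a given time step is information-weighted (inverse-covariance) averaging, which assumes the two estimates are independent; as noted above, they are correlated here, so the result is suboptimal but adequate in practice. A minimal sketch:

```python
import numpy as np

def smooth_combine(x_f, P_f, x_b, P_b):
    """Combine forward and backward filter estimates at the same time step.
    x_f, x_b: forward/backward state estimates; P_f, P_b: their covariances.
    The smoothed covariance is the inverse of the summed information matrices,
    and the smoothed state weights each estimate by its information."""
    I_f = np.linalg.inv(P_f)
    I_b = np.linalg.inv(P_b)
    P_s = np.linalg.inv(I_f + I_b)
    x_s = P_s @ (I_f @ x_f + I_b @ x_b)
    return x_s, P_s
```

With equal covariances the result is the simple average; a more certain filter (smaller covariance) pulls the smoothed estimate toward its own value.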

To verify the proposed system, a person walked on the floor and the motion of both feet was estimated using the proposed algorithm. The estimated trajectories of the two feet on the

In the time domain, the relationship between zero velocity intervals and vision data available intervals is given in

Three dimensional trajectories are given in

In addition to trajectories, attitude and velocity are also available from the inertial navigation algorithm. For example, estimated attitude (in Euler angles) of the left foot is given in

Thus we can obtain key gait analysis parameters such as step length, stride length, foot angle and walking speed using the proposed system.
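Given the foot positions at successive zero velocity intervals, these parameters follow from simple geometry. The sketch below assumes the contact points have already been extracted as planar coordinates and that left and right contacts alternate starting with the left foot; the definitions used (stride = distance between consecutive same-foot contacts, step = distance between consecutive opposite-foot contacts) follow common usage, and the function name is ours.

```python
import numpy as np

def gait_parameters(left_contacts, right_contacts, duration_s):
    """Compute basic gait parameters from estimated foot positions at
    successive zero velocity intervals (N x 2 arrays of x-y contact points).
    Assumes contacts alternate in time: left, right, left, right, ..."""
    # stride length: consecutive contacts of the same (left) foot
    stride_lengths = np.linalg.norm(np.diff(left_contacts, axis=0), axis=1)
    # step length: distance between consecutive opposite-foot contacts
    n = min(len(left_contacts), len(right_contacts))
    steps = []
    for i in range(n):
        steps.append(np.linalg.norm(right_contacts[i] - left_contacts[i]))
        if i + 1 < len(left_contacts):
            steps.append(np.linalg.norm(left_contacts[i + 1] - right_contacts[i]))
    # walking speed: distance covered by the left foot over the elapsed time
    total_distance = stride_lengths.sum()
    walking_speed = total_distance / duration_s
    return stride_lengths, np.array(steps), walking_speed
```

Foot angle can be read directly from the estimated attitude (Euler angles) during the same zero velocity intervals.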

Now the accuracy of the proposed system is evaluated. First, we test the accuracy of the vision-based position estimation, which is used to estimate the vector between the two feet. The left shoe is placed at different positions on a grid while the right shoe remains at a fixed position. The estimated left shoe position with respect to the right shoe is compared with the true value, which is obtained from the grid. The result is given in

The next task is to evaluate the accuracy of the trajectories. A person walked on a long sheet of white paper with a marker pen attached to each shoe. The marker pens are attached to the shoes so that dots are marked on the paper whenever a foot touches the floor. The marked dot positions are measured with a ruler and are taken as the true values. The estimated positions during zero velocity intervals (when a foot is on the floor) are compared with the marked dots. The result for one step is given in

A person walked 33 steps, and the errors between the estimated positions and the marked positions are given in

Using inertial sensors on the shoes, the motion of both feet is estimated with an inertial navigation algorithm. When the motion of both feet is estimated, it is necessary to measure the relative position between the two feet. In the proposed system, a vision system is used to measure the relative position and attitude between the two feet.

Using the proposed system, we can obtain quantitative gait analysis parameters such as step length, stride length, foot angle and walking speed. We can also obtain three-dimensional trajectories of the two feet, which provide qualitative information for gait analysis.

The accuracy of the proposed system is evaluated by measuring the position of a foot when it touches the floor. The mean position error is 1.2–2.5 cm and the maximum position error is 5.4 cm. For gait analysis, we believe this error is within an acceptable range.

The main contribution of the proposed system is that the motion of both feet can be observed at any place as long as the floor is flat. Commercial camera-based motion tracking systems such as Vicon require a dedicated experiment space. Thus we believe natural walking patterns can be observed using the proposed system.

This work was supported by the 2013 Research Fund of University of Ulsan.

The authors declare no conflict of interest.

Picture of the proposed system.

Five coordinate systems (indicated coordinate axes are top-view).

Eight infrared LED configuration.

Infrared LED images during walking.

Typical two feet movement in the navigation coordinate system.

Estimated two feet trajectories on the

Zero velocity intervals and vision data available intervals.

Estimated trajectories in the three dimensional space.

Estimated attitude of the left foot (Euler angles).

Vision-based position estimation accuracy experiment results in the body 1 coordinate system.

One walking step estimation accuracy.

Step length estimation error.

Estimated step length.