Article

User Local Coordinate-Based Accompanying Robot for Human Natural Movement of Daily Life

1. Department of Physical Therapy and Assistive Technology, National Yang Ming Chiao Tung University, Taipei 11221, Taiwan
2. Center for General Education, National Yang Ming Chiao Tung University, Taipei 11221, Taiwan
3. Department of Physical Medicine and Rehabilitation, Taipei Veterans General Hospital, Taipei 11217, Taiwan
4. School of Medicine, National Yang-Ming University, Taipei 11221, Taiwan
*
Author to whom correspondence should be addressed.
Sensors 2021, 21(11), 3889; https://doi.org/10.3390/s21113889
Submission received: 5 May 2021 / Revised: 29 May 2021 / Accepted: 1 June 2021 / Published: 4 June 2021
(This article belongs to the Special Issue Assistive Robots for Healthcare and Human-Robot Interaction)

Abstract
Considering the trend of aging societies, accompanying technology can help frail, elderly individuals participate in daily activities. The ideal accompanying robot should accompany the user in a proper position according to the activity scenarios and context; the prerequisite is that the accompanying robot should quickly move to a designated position and closely maintain it regardless of the direction in which the user moves. This paper proposes a user local coordinate-based strategy to satisfy this need. As a proof of concept, a novel “string-pot” approach was utilized to measure the position difference between the robot and the target. We implemented the control strategy and assessed its performance in our gait lab. The results showed that the robot can follow the user in the designated position while the user performs forward, backward, and lateral movements, turning, and walking along a curve.

1. Introduction

Many accompanying robots have been developed to help people perform predefined marine, overground, and aeronautical tasks [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20]. Considering the trend of aging societies, we aim to extend the accompanying technology to help frail, elderly individuals participate in daily activities.
To reduce the physical burden on elderly people, shopping carts that can automatically follow behind the user have been developed, e.g., the cart proposed in [2]. Other devices, such as smart luggage [3,4] and golf carts [5], are available on the market to facilitate specific activities. However, the “following-behind” approach sometimes imposes a psychological burden on users because they cannot see the device and must frequently check whether it has been lost [6,10,13].
To solve this psychosocial problem, researchers have begun to develop robots that are located in front of or to the side of the person while accompanying him or her [6,7,8,9,10]. Front-accompanying robots enable the user to check the robot’s location and access the robot with ease. However, an accompanying robot that is always situated in front would interfere with some of the user’s activities, such as fetching items from a shelf in a supermarket, opening doors, and interacting with friends. Therefore, a robot that can accompany the user in different positions is desirable for daily living activities.
In previous studies [2,7,8,9,14,18,19,20,21] on autonomous following, there was limited discussion of the user walking sideways or turning sharply [6,22], both of which are essential for daily activities. Moreover, it appears that when the user turns, the robot does not move around the user; instead, the user moves around the robot. The reasons include the limited degrees of freedom (DOFs) of the moving platform and the strategy used to track the user's movement. As Hu [7] noted, a user implicitly and smoothly adjusts his or her movement to cooperate with the accompanying robot. In other words, the user gives up accustomed sideways movements or sharp turns to prevent the robot from losing track, which reduces the usability of the accompanying robot in daily activities.
The ideal goal of an accompanying robot for daily living is that the robot can accompany the user in a proper position according to the activity scenarios and context and that the user can move naturally without worry. To achieve this goal, there are two levels of tasks. First, the position at which the robot should accompany the user must be determined. This step can be accomplished by explicit commands from the user [11,14], and the robot can be helped by other techniques such as obstacle avoidance, path planning [10,12,13], or artificial intelligence [15,16,17] in the future. The second and fundamental task is that the robot should quickly move to the designated position and closely maintain it regardless of the direction in which the user moves. This second task is the focus of this study.
The first prerequisite to closely following the user’s movements is that the robot must be as agile as the human, i.e., it should at least be able to move on a plane with 3 DOFs. Many designs, e.g., Mecanum wheels [23], have been proposed and are outside the scope of this study. The other important issue in closely following the user is the tracking method. For example, if the robot tracks the course of the user’s movement, it may not be able to determine whether the user is moving backward or turning around and subsequently moving forward. To date, a single method for a robot to follow a user in any designated position has not been developed. This paper proposes a user local coordinate-based method to satisfy this demand. To prove its concept, we implemented a system and assessed its performance in our gait lab.

2. Methods

2.1. Concept of the User Local Coordinate-Based Accompanying Method

The concept of the proposed accompanying method is that the robot should move to the specified position with respect to the user's local coordinate system ($LCS_U$). As shown in Figure 1a, the objective is to make $R_C$ (the current position and orientation of the robot relative to the user) approach $R_T$ (the target position and orientation relative to the user), i.e., to make the position difference $\Delta R_M$ approach the zero vector, as given in Equation (1). The underlying tasks include determining the target position, obtaining the robot's position, calculating the position difference between the target and the accompanying robot in $LCS_U$, and moving the robot in the robot's coordinate system ($LCS_R$) to reduce the position difference. By minimizing the position difference, the robot can closely accompany the user at the designated position while the user moves naturally.
The pseudocode of this algorithm is listed as follows; a minimal code sketch of the loop is given after the list:
  1. Initialize the system
  2. Set up and reset the user's coordinate system
  3. Set up and reset the robot's coordinate system
  4. Set the target position Pt of the accompanying robot w.r.t. the user
  5. Measure the position Pc of the accompanying robot w.r.t. the user
  6. Calculate the error Eu between Pc and Pt
  7. Convert Eu to Er in the accompanying robot's local coordinate system
  8. Move the robot to reduce the error Er according to a control algorithm
  9. Repeat steps 4–8 until the system stops following
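The following Python sketch shows one way this loop could look in code. The sensor and motor interfaces (`read_relative_pose`, `send_velocity`) are hypothetical placeholders, and the proportional gains are illustrative only; the actual implementation described in Section 2.3 runs on a PSoC microcontroller with a full PID controller.

```python
import math
import time

# Hypothetical hardware interface: returns the robot pose (x, y, heading)
# measured in the user's local coordinate system LCS_U.
def read_relative_pose():
    return (0.10, 0.45, math.radians(-5.0))   # placeholder measurement

# Hypothetical hardware interface: velocity command (vx, vy, wz) in LCS_R.
def send_velocity(vx, vy, wz):
    print(f"vx={vx:.3f} m/s, vy={vy:.3f} m/s, wz={wz:.3f} rad/s")

# Target pose of the robot w.r.t. the user (step 4 of the pseudocode).
TARGET = (0.10, 0.45, 0.0)        # x (m), y (m), orientation (rad) in LCS_U
KP = (1.5, 1.5, 2.0)              # illustrative proportional gains only

def follow_once():
    """One pass of steps 5-8: measure, compute error, convert, command."""
    xc, yc, gc = read_relative_pose()            # step 5: Pc in LCS_U
    ex, ey, eg = (TARGET[0] - xc,                # step 6: error Eu in LCS_U
                  TARGET[1] - yc,
                  TARGET[2] - gc)
    # Step 7: rotate the planar error by the robot's orientation gc
    # to express it in the robot's own frame LCS_R.
    ex_r = ex * math.cos(gc) + ey * math.sin(gc)
    ey_r = -ex * math.sin(gc) + ey * math.cos(gc)
    # Step 8: simple proportional control (the study uses PID).
    send_velocity(KP[0] * ex_r, KP[1] * ey_r, KP[2] * eg)

if __name__ == "__main__":
    for _ in range(3):                           # step 9: repeat until stop
        follow_once()
        time.sleep(0.02)                         # 50 Hz loop as in Section 2.3
```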
Because both local coordinate systems, that of the user ($O_U$) and that of the robot ($O_R$), are moving and rotating, the coordinate transformation of the position difference between $LCS_U$ and $LCS_R$ must be clarified. In this study, the target position was assumed to be predetermined; thus, obtaining the required position difference rests on determining the current position of the robot in $LCS_U$. The transformation equation is derived as follows:

$$R_T = \overrightarrow{O_U O_{RT}} = \begin{bmatrix} X_T & Y_T & 1 \end{bmatrix}, \qquad R_C = \overrightarrow{O_U O_{RC}} = \begin{bmatrix} X_C & Y_C & 1 \end{bmatrix}$$

Objective: move $O_{RC}$ to $O_{RT}$, i.e.,

$$\Delta R_M = R_T - R_C = \begin{bmatrix} 0 & 0 & 1 \end{bmatrix} \quad (1)$$

where
  • $O_W$: world coordinate system ($WCS$)
  • $O_U$: user's local coordinate system ($LCS_U$)
  • $O_R$: robot's local coordinate system ($LCS_R$)
  • $R_T$: robot's target position relative to the user
  • $R_C$: robot's current position relative to the user
  • $\Delta R_M$: position difference
For convenience, homogeneous coordinates are used. In general, the user is moving, so $LCS_U$, and therefore $R_C$ expressed in $LCS_U$, will change even when the robot is stationary. Three moving scenarios are discussed below.
The first scenario is that the user performs a translation without rotation, as shown in Figure 1b. Assume the user translates from $O_{UT0}$ to $O_{UT1}$, let $\overrightarrow{O_{UT0} O_{UT1}} = \begin{bmatrix} \Delta X_t & \Delta Y_t \end{bmatrix}$ in the world coordinate system ($WCS$), and view the movement from the perspective of $LCS_U$:

$$R_{C1} = \overrightarrow{O_{UT1} O_{RC0}} = R_{C0} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ -\Delta X_t & -\Delta Y_t & 1 \end{bmatrix} = \begin{bmatrix} X_{C0} & Y_{C0} & 1 \end{bmatrix} - \begin{bmatrix} \Delta X_t & \Delta Y_t & 1 \end{bmatrix} \quad (2)$$
The second scenario is that the user performs a rotation in place without translation, as shown in Figure 1c. $LCS_U$ rotates by $\alpha_r$ from $O_{UT0}$ to $O_{UT1}$ in the $WCS$, and the movement is viewed from $LCS_U$:

$$R_{C1} = \overrightarrow{O_{UT1} O_{RC1}} = \overrightarrow{O_{UT0} O_{RC0}} \text{ rotated by } \alpha_r = R_{C0} \begin{bmatrix} \cos\alpha_r & -\sin\alpha_r & 0 \\ \sin\alpha_r & \cos\alpha_r & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} X_{C0}\cos\alpha_r + Y_{C0}\sin\alpha_r & -X_{C0}\sin\alpha_r + Y_{C0}\cos\alpha_r & 1 \end{bmatrix} \quad (3)$$

$LCS_R$ is also rotated by $\alpha_r$.
The third scenario is that the user translates and rotates, as shown in Figure 1d; then,

$$R_{C1} = \left( R_{C0} - \begin{bmatrix} \Delta X_t & \Delta Y_t & 1 \end{bmatrix} \right) \begin{bmatrix} \cos\alpha_r & -\sin\alpha_r & 0 \\ \sin\alpha_r & \cos\alpha_r & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} X_{C0} & Y_{C0} & 1 \end{bmatrix} \begin{bmatrix} \cos\alpha_r & -\sin\alpha_r & 0 \\ \sin\alpha_r & \cos\alpha_r & 0 \\ 0 & 0 & 1 \end{bmatrix} - \begin{bmatrix} \Delta X_t & \Delta Y_t & 1 \end{bmatrix} \begin{bmatrix} \cos\alpha_r & -\sin\alpha_r & 0 \\ \sin\alpha_r & \cos\alpha_r & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (4)$$
The third scenario is the general case. Combining it with Equation (1) leads to $\Delta R_{MU}$, the position difference expressed in $LCS_U$:

$$\Delta R_{MU} = \begin{bmatrix} \Delta X_U & \Delta Y_U & 1 \end{bmatrix} = R_T - R_{C1} = \begin{bmatrix} X_T & Y_T & 1 \end{bmatrix} - \begin{bmatrix} X_{C0} & Y_{C0} & 1 \end{bmatrix} \begin{bmatrix} \cos\alpha_r & -\sin\alpha_r & 0 \\ \sin\alpha_r & \cos\alpha_r & 0 \\ 0 & 0 & 1 \end{bmatrix} + \begin{bmatrix} \Delta X_t & \Delta Y_t & 1 \end{bmatrix} \begin{bmatrix} \cos\alpha_r & -\sin\alpha_r & 0 \\ \sin\alpha_r & \cos\alpha_r & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (5)$$
In the implementation, the robot moves in $LCS_R$; therefore, $\Delta R_{MU}$ must be further rotated by $\gamma_{C0}$, the robot's orientation in $LCS_U$, which yields $\Delta R_{MR}$, the position difference expressed in $LCS_R$:

$$\Delta R_{MR} = \begin{bmatrix} \Delta X_R & \Delta Y_R & 1 \end{bmatrix} = \begin{bmatrix} \Delta X_U & \Delta Y_U & 1 \end{bmatrix} \begin{bmatrix} \cos\gamma_{C0} & -\sin\gamma_{C0} & 0 \\ \sin\gamma_{C0} & \cos\gamma_{C0} & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} X_T & Y_T & 1 \end{bmatrix} \begin{bmatrix} \cos\gamma_{C0} & -\sin\gamma_{C0} & 0 \\ \sin\gamma_{C0} & \cos\gamma_{C0} & 0 \\ 0 & 0 & 1 \end{bmatrix} - \begin{bmatrix} X_{C0} & Y_{C0} & 1 \end{bmatrix} \begin{bmatrix} \cos(\alpha_r + \gamma_{C0}) & -\sin(\alpha_r + \gamma_{C0}) & 0 \\ \sin(\alpha_r + \gamma_{C0}) & \cos(\alpha_r + \gamma_{C0}) & 0 \\ 0 & 0 & 1 \end{bmatrix} + \begin{bmatrix} \Delta X_t & \Delta Y_t & 1 \end{bmatrix} \begin{bmatrix} \cos(\alpha_r + \gamma_{C0}) & -\sin(\alpha_r + \gamma_{C0}) & 0 \\ \sin(\alpha_r + \gamma_{C0}) & \cos(\alpha_r + \gamma_{C0}) & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (6)$$
With the error between the robot's position and the target position given by Equation (6), and by applying a control law, e.g., PID, to drive $\Delta R_{MR} \to 0$, the robot can accompany the user at any designated position, including in front, for any movement direction, including forward, backward, sideways, and rotating in place. To maintain its orientation in $LCS_U$, the robot should also rotate by $\alpha_r$.
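To make the transformation concrete, the following Python sketch evaluates Equation (6) under the row-vector convention used in the reconstruction above. The function name and example numbers are illustrative, and the sign conventions for $\alpha_r$ and $\gamma_{C0}$ are assumptions that should be checked against the actual encoder polarity.

```python
import numpy as np

def rot_h(theta):
    """Homogeneous planar rotation matrix of the form used in Equations (3)-(6)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def position_error_in_robot_frame(r_t, r_c0, d_t, alpha_r, gamma_c0):
    """Equation (6): target error expressed in the robot frame LCS_R.

    r_t      -- target position [X_T, Y_T, 1] in LCS_U
    r_c0     -- previous robot position [X_C0, Y_C0, 1] in LCS_U
    d_t      -- user translation [dX_t, dY_t, 1] during the last step
    alpha_r  -- user rotation during the last step (rad)
    gamma_c0 -- robot orientation w.r.t. LCS_U (rad)
    """
    r_t, r_c0, d_t = map(np.asarray, (r_t, r_c0, d_t))
    return (r_t @ rot_h(gamma_c0)
            - r_c0 @ rot_h(alpha_r + gamma_c0)
            + d_t @ rot_h(alpha_r + gamma_c0))

# Example: target 0.5 m ahead of the user, robot slightly off target.
if __name__ == "__main__":
    err = position_error_in_robot_frame(
        r_t=[0.0, 0.5, 1.0], r_c0=[0.1, 0.45, 1.0],
        d_t=[0.02, 0.03, 1.0], alpha_r=np.radians(5), gamma_c0=np.radians(-3))
    print(err)   # first two components are [dX_R, dY_R] fed to the controller
```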

2.2. Embodiment of the Accompanying Robot with the $LCS_U$ Viewpoint

To measure the position difference between the robot and the target, a novel “string-pot” approach was utilized. It is composed of two rotary encoders, one distance sensor, and a retractable conducting wire, as shown in Figure 2. The shafts of the two rotary encoders are parallel to each other and fixed on the user and the robot, respectively. The distance sensor is attached to the housing of the robot's encoder. The retractable wire connects the housings of the encoders, pulls them to face each other, and ensures that the distance sensor always points at the user. In addition, the retractable conducting wire transmits the user-side encoder's signal to the robot. This arrangement can easily measure the orientations of $LCS_U$ and $LCS_R$ with respect to each other and the distance between them. With this measurement system, $R_{C1}$ is directly measured w.r.t. $LCS_U$, and Equations (4) and (6) become Equations (7) and (8), respectively.
$$R_{C1} = \begin{bmatrix} d\cos\theta_U & d\sin\theta_U & 1 \end{bmatrix} \quad (7)$$
$$\Delta R_{MR} = \begin{bmatrix} \Delta X_R & \Delta Y_R & 1 \end{bmatrix} = \begin{bmatrix} \Delta X_U & \Delta Y_U & 1 \end{bmatrix} \begin{bmatrix} \cos(\theta_U + \theta_R) & -\sin(\theta_U + \theta_R) & 0 \\ \sin(\theta_U + \theta_R) & \cos(\theta_U + \theta_R) & 0 \\ 0 & 0 & 1 \end{bmatrix} = \left( \begin{bmatrix} X_T & Y_T & 1 \end{bmatrix} - \begin{bmatrix} d\cos\theta_U & d\sin\theta_U & 1 \end{bmatrix} \right) \begin{bmatrix} \cos(\theta_U + \theta_R) & -\sin(\theta_U + \theta_R) & 0 \\ \sin(\theta_U + \theta_R) & \cos(\theta_U + \theta_R) & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (8)$$
where
  • $d$: distance between the user and the robot
  • $\theta_U$: encoder value on the user side
  • $\theta_R$: encoder value on the robot side
The measured encoder values are negative with respect to the user's orientation change; i.e.,

$$\Delta R_{MR\theta} = \theta_U + \theta_R$$
where $\Delta R_{MR\theta}$ is the rotation angle of $LCS_R$ required to maintain its orientation w.r.t. $LCS_U$.
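A small Python sketch of Equations (7) and (8), together with the orientation-keeping rotation above, is given below. The function name is hypothetical, and the sign convention for combining $\theta_U$ and $\theta_R$ is an assumption to be verified against the mounted encoders.

```python
import math

def stringpot_error(d, theta_u, theta_r, x_t, y_t):
    """Position error in LCS_R from the string-pot readings (Equations (7)-(8)).

    d        -- measured distance between user and robot (m)
    theta_u  -- user-side encoder angle (rad)
    theta_r  -- robot-side encoder angle (rad)
    x_t, y_t -- target position of the robot in LCS_U (m)
    """
    # Equation (7): current robot position in LCS_U.
    xc = d * math.cos(theta_u)
    yc = d * math.sin(theta_u)
    # Error in LCS_U, then rotated by (theta_u + theta_r) into LCS_R,
    # as in Equation (8); the sign convention here is an assumption.
    ex_u, ey_u = x_t - xc, y_t - yc
    ang = theta_u + theta_r
    ex_r = ex_u * math.cos(ang) + ey_u * math.sin(ang)
    ey_r = -ex_u * math.sin(ang) + ey_u * math.cos(ang)
    # Rotation needed to keep the robot's orientation fixed in LCS_U.
    e_theta = theta_u + theta_r
    return ex_r, ey_r, e_theta

# Example: robot 0.5 m away, slightly off the target point (0, 0.45).
print(stringpot_error(0.5, math.radians(80), math.radians(-75), 0.0, 0.45))
```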
As mentioned, while performing daily activities, humans utilize all 3 DOFs, including moving forward, backward, and sideways and turning in place. To simplify the movement control, we utilized Mecanum wheels [23] for the 3-DOF moving platform.
The control system structure is shown in Figure 3. A PID controller was implemented, and $\Delta R_{MR}$ was used as its error input. The output of the PID controller was the velocity command $\begin{bmatrix} V_x & V_y & \omega_z \end{bmatrix}$ for the robot to execute, as shown in Equation (9).
$$\Delta R_{MR} + \Delta R_{MR\theta} = \begin{bmatrix} \Delta x_R & \Delta y_R & \Delta\theta_R \end{bmatrix} \xrightarrow{\ \mathrm{PID}\ } U = \begin{bmatrix} V_x & V_y & \omega_z \end{bmatrix} \quad (9)$$
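The article does not report the PID gains; the sketch below only illustrates the structure implied by Equation (9), with one independent PID channel per error component. The gain values, the per-axis decoupling, and the 50 Hz update period are assumptions for illustration.

```python
class PID:
    """Minimal discrete PID applied to one component of the error vector."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per component of [dx_R, dy_R, dtheta_R]; output is [Vx, Vy, wz].
dt = 0.02                                   # assumed 50 Hz loop (Section 2.3)
pid_x = PID(1.2, 0.1, 0.05, dt)             # illustrative gains only
pid_y = PID(1.2, 0.1, 0.05, dt)
pid_w = PID(2.0, 0.0, 0.10, dt)

def velocity_command(dx_r, dy_r, dtheta_r):
    """Map the tracking error to the body velocity command of Equation (9)."""
    return pid_x.update(dx_r), pid_y.update(dy_r), pid_w.update(dtheta_r)
```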
Since four Mecanum wheels were used, the individual rotational speed of each wheel could be calculated for the intended movement, and the corresponding pulse-width modulation (PWM) signals for the motor drivers could be obtained using the pulse-width (PW) mapper shown in Equation (10) [24]; a code sketch of this mapping is given after the symbol list below.
$$\text{PW mapper:} \quad \begin{bmatrix} PWM_1 \\ PWM_2 \\ PWM_3 \\ PWM_4 \end{bmatrix} = k \begin{bmatrix} \omega_1 \\ \omega_2 \\ \omega_3 \\ \omega_4 \end{bmatrix} = \frac{k}{R} \begin{bmatrix} 1 & -1 & -(L_1 + L_2) \\ 1 & 1 & (L_1 + L_2) \\ 1 & 1 & -(L_1 + L_2) \\ 1 & -1 & (L_1 + L_2) \end{bmatrix} \begin{bmatrix} V_x \\ V_y \\ \omega_z \end{bmatrix} \quad (10)$$
where
  • $V_x$, $V_y$, $\omega_z$: velocity of the robot
  • $\omega_1$, $\omega_2$, $\omega_3$, $\omega_4$: angular velocities of the four Mecanum wheels
  • $R$: radius of the Mecanum wheels
  • $L_1$, $L_2$: longitudinal and lateral distances from the robot's center to the wheel axes [24]
  • $k$: motor constant
  • $PWM$: pulse-width modulation value sent to each motor driver
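As a rough illustration of Equation (10), the sketch below maps a body-velocity command to four wheel speeds and PWM values. The wheel numbering and sign pattern follow the generic kinematic model in [24], and the geometric parameters and motor constant are placeholder values rather than the robot's actual dimensions.

```python
def pw_mapper(vx, vy, wz, R=0.05, L1=0.20, L2=0.15, k=100.0):
    """Equation (10): map the body velocity command to four wheel PWM values.

    R, L1, L2 (wheel radius, longitudinal and lateral wheel offsets) and the
    motor constant k are placeholder values, not the robot's measured data.
    Wheel numbering follows the kinematic model of [24].
    """
    l = L1 + L2
    w1 = (vx - vy - l * wz) / R   # front-left wheel speed (rad/s)
    w2 = (vx + vy + l * wz) / R   # front-right
    w3 = (vx + vy - l * wz) / R   # rear-left
    w4 = (vx - vy + l * wz) / R   # rear-right
    return [k * w for w in (w1, w2, w3, w4)]

# Example: pure sideways motion spins diagonal wheel pairs in opposite senses.
print(pw_mapper(0.0, 0.3, 0.0))
```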

2.3. Detailed System Hardware

The string-pot system in Figure 2 was realized with an infrared distance sensor (SHARP GP2Y0A02YK0F) and two absolute rotary encoders (P3015 series Hall rotary encoders). Their signals were acquired by a programmable system-on-chip (PSoC) microcontroller (Cypress CY8CKIT-059 PSoC 5 LP prototyping kit) through its built-in 12-bit analog-to-digital converter (ADC) at a sampling rate of 2000 samples per second (S/s). The data were filtered with a median filter and subsequently downsampled to 50 S/s to calculate the position difference $\Delta R_{MR}$. The position difference was then converted to pulse widths through the PID controller and the PW mapper. Through the built-in PWM module of the PSoC, the pulse widths were transmitted to the motor drivers (DC 5–12 V, 0–30 A dual-channel H-bridge motor driver) to drive the DC geared motors (GW4058-31ZY DC worm gear motor) and the Mecanum wheels. The system structure block diagram is shown in Figure 4, and the assembled accompanying robot is shown in Figure 5.
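As an illustration of the signal chain described above, the following sketch median-filters a raw sample stream and decimates it from 2000 S/s to 50 S/s. The window length of five samples and the synthetic input are assumptions for demonstration; the firmware's actual filter parameters are not reported.

```python
from statistics import median

def median_downsample(samples, raw_rate=2000, out_rate=50, window=5):
    """Median-filter the raw ADC stream and decimate it to the control rate.

    The 2000 S/s and 50 S/s rates come from Section 2.3; the window length
    of 5 samples is an assumption for illustration.
    """
    step = raw_rate // out_rate                       # 40 raw samples per output
    filtered = [median(samples[max(0, i - window + 1):i + 1])
                for i in range(len(samples))]
    return filtered[::step]

# Example with a synthetic, spiky distance signal (in ADC counts).
raw = [512, 515, 511, 900, 513] * 80                  # 400 samples = 0.2 s of data
print(len(median_downsample(raw)), "control samples") # -> 10 control samples
```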

3. Methods of System Verification

3.1. Testing Tasks

To assess the performance of the proposed accompanying method, six basic walking tasks and two types of combined movement were performed. The six basic walking tasks were forward walking, backward walking, left lateral walking, right lateral walking, a left pivot of 90°, and a right pivot of 90°. These tasks were performed using the 2.5 m × 1.8 m border of the force plate area as a reference. The two combined maneuvers were (1) walking along the force plate border while the user maintained a front-facing orientation and (2) walking around the force plate clockwise or counterclockwise; the border of the force plate was used as the walking reference. In addition, the user arbitrarily set or selected the target position in each test. Although the robot's performance should be identical for different accompanying positions, the target was purposely set in front of the user to demonstrate the capability of the proposed method.

3.2. Testing Environment

The tests were performed in our motion lab, which was equipped with an 8-camera VICON motion capture system (VICON Motion Systems, Oxford, UK). Four markers were placed on the user's pelvis, on the bilateral anterior and posterior superior iliac spines (ASISs and PSISs), to calculate the user's orientation with respect to the lab coordinates; four markers were placed on the shelf above the four Mecanum wheels of the robot to calculate the robot's orientation with respect to the lab coordinates. Two additional markers were placed on the housings of the user-end and robot-end rotary encoders as the origins of $LCS_U$ and $LCS_R$, respectively; note that these origins were not at the centers of the user and the robot.
The data from the VICON system were processed in LabVIEW to obtain the trajectories of the user and the robot, the orientation of each, the distance between the origins of $LCS_U$ and $LCS_R$, and the relative orientations of $LCS_R$ and $LCS_U$ with respect to each other. The data were also used to calculate the robot's trajectory in $LCS_U$.
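For reference, a minimal sketch of how a segment heading and a relative pose could be computed from planar marker positions is given below. It assumes a simple two-marker convention for the pelvis heading; the actual VICON/LabVIEW processing pipeline used in the study may differ.

```python
import math

def segment_heading(left_marker, right_marker):
    """Heading of a body segment in the lab frame from two marker positions.

    The forward direction is taken as the left-to-right marker line rotated
    by -90 degrees, which is one common convention; this is an assumption,
    not the study's documented procedure.
    """
    mlx = left_marker[0] - right_marker[0]   # medio-lateral axis (right -> left)
    mly = left_marker[1] - right_marker[1]
    return math.atan2(-mlx, mly)             # forward = ML axis rotated by -90 deg

def relative_pose(user_origin, robot_origin, user_heading):
    """Robot origin in LCS_U: X lateral (user's right), Y forward."""
    dx = robot_origin[0] - user_origin[0]
    dy = robot_origin[1] - user_origin[1]
    fx, fy = math.cos(user_heading), math.sin(user_heading)   # forward axis
    rx, ry = fy, -fx                                           # lateral (right) axis
    return (dx * rx + dy * ry, dx * fx + dy * fy)

# Example: pelvis markers 0.2 m apart, robot 0.5 m ahead in the lab frame.
heading = segment_heading((-0.1, 0.0), (0.1, 0.0))
print(relative_pose((0.0, 0.0), (0.0, 0.5), heading))   # ~ (0.0, 0.5): straight ahead
```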

4. Results of System Verification

Table 1 lists the positions and displacements of the user and the robot in the $WCS$. The origin of the $WCS$ was set at the user's start position, and its direction and orientation were determined with respect to the lab coordinates; at the start, the user faced the positive Y axis. The robot's start position was its set (target) position expressed in the $WCS$. The displacement was obtained by subtracting the start position from the stop position. In the forward-walking task, the movement displacements of the user and the robot in the Y direction were 1914 mm and 1933 mm, respectively. In the backward-walking task, the Y displacements of the user and the robot were −1980 mm and −1985 mm, respectively. In the left lateral walking task, the X displacements of the user and the robot were −1520 mm and −1732 mm, respectively. In the right lateral walking task, the X displacements of the user and the robot were 1558 mm and 1666 mm, respectively. In the left pivot turning task, the orientations of the user and the robot changed by 73.7° and 71.6°, respectively. In the right pivot turning task, the orientations of the user and the robot changed by −92.1° and −88.4°, respectively.
Table 2 lists the robot's target/start position, stop position, and average position during walking in $LCS_U$. The stop position is defined as the point where the user stopped walking and the robot settled. $X_C$ is the robot's position w.r.t. $LCS_U$ in the lateral direction, $Y_C$ is the robot's position w.r.t. $LCS_U$ in the forward direction, and $\gamma_C$ is the robot's orientation w.r.t. $LCS_U$.

4.1. Basic Walking Tasks

In the basic walking tasks, the user was asked to perform the tasks inside the rectangular force plate area in the center of the motion lab. Figure 6 shows the trajectories of the walking tasks w.r.t. the WCS, where the corresponding origins of $LCS_U$ (symbolized ‘○’) and $LCS_R$ (symbolized ‘●’) are linked by line segments. The protruding hairline symbols represent the medial-lateral directions. The interval between consecutive corresponding point pairs was 200 milliseconds. Lags and overshoots were observed at the beginning and end, respectively. The target/starting position was arbitrarily selected by the user at the beginning of each test. Figure 6a shows the result of the user walking forward 6 steps. Figure 6b shows the result of the user walking backward 8 steps. Figure 6c shows the result of the user performing left lateral walking for 8 steps. Figure 6d shows the result of the user performing right lateral walking for 8 steps.
Figure 7 shows the time series of the sensed values for the walking tasks: the distance between the robot and the user and the encoder values $\theta_{Robot}$ and $\theta_{User}$. The gaps between the set distance and the actual distance, and between $\theta_{User}$ and the target angle, represent the lag and overshoot of the tracking. Tracking lag is also visible as an increased gap between $\theta_{User}$ and $\theta_{Robot}$. Owing to the phase lag, the distance between the user and the robot decreased while the user walked forward and increased while the user walked backward.

4.2. Combined Tasks

4.2.1. Walking along a Rectangle

In the condition of walking along a rectangle, the user was instructed to walk along the rectangular force plate in the center of the motion lab. The movements along the rectangle in this condition included walking forward, laterally right, backward, and laterally left with eight steps per direction. The results are presented in Figure 8. The accompanying robot could follow the user walking along the rectangle. Shown are the trajectories of the walking tasks where the corresponding origins of $LCS_U$ (symbolized ‘○’) and $LCS_R$ (symbolized ‘●’) are linked by line segments. The protruding hairline symbols represent the medial-lateral directions.

4.2.2. Clockwise-Curve Walking Test

In the clockwise-curve walking test, the user was instructed to walk in a circle around the force plate clockwise, which combined forward walking with small right lateral and right turning movements. As shown in Figure 9, the starting position of the robot was on the front right side of the user, and the robot's trajectory lay almost on the user's walking curve. Shown are the trajectories of the walking tasks where the corresponding origins of $LCS_U$ (symbolized ‘○’) and $LCS_R$ (symbolized ‘●’) are linked by line segments. The protruding hairline symbols represent the medial-lateral directions. The user took a total of 23 steps in this clockwise-curve walking test.

4.2.3. Counterclockwise-Curve Walking Test

In the counterclockwise-curve walking test, the user was instructed to walk in a circle around the force plate counterclockwise, which combined forward, left lateral, and left turning movements. As shown in Figure 10, the starting position of the robot was on the front right side of the user. When the user walked counterclockwise, the trajectory of the robot enclosed the trajectory of the user. Shown are the trajectories of the walking tasks where the corresponding origins of $LCS_U$ (symbolized ‘○’) and $LCS_R$ (symbolized ‘●’) are linked by line segments. The protruding hairline symbols represent the medial-lateral directions. There were 28 walking steps in total in this counterclockwise-curve walking test.

5. Discussion

The results show that the robot can follow the user in the designated position while the user performs forward, backward, and lateral movements, turning, and curve walking. In other words, this study has demonstrated that with the proposed tracking strategy, the accompanying robot can closely follow the user in sideways or sharp turn movements. In addition, it can accompany the user in front even when the user moves backwards.
Each test had a different start/target position, which was arbitrarily set. Regardless, the difference between the end position and the set/target position in each test was within an acceptable range for daily accompanying. There were phase lags and overshoots at the end of the tests; further tuning of the PID controller, or replacing it with another control algorithm, could improve the tracking performance. Interestingly, there appeared to be only a small phase lag and no overshoot in the left lateral walking test (Figure 6c and Table 1). After checking the data, we found that the robot was further to the right of the user and that the distance sensor measured the user's forearm instead of the trunk when the target position was set. In other words, unlike in the other tests, the target distance in the left lateral walking test was shorter than the actual distance between the robot and the user from the beginning. This accident suggests a control method in which changing the target distance may reduce the phase lag and overshoot: when the user is approaching the robot (e.g., the robot is in front of the user and the user is walking forward), the target distance can be set to a greater value; when the user is moving away from the robot, the target distance can be set shorter. Further control techniques, including system identification, modeling, and simulation, should be examined in the future to improve system performance and stability.
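A minimal sketch of the adaptive target-distance idea suggested by this observation is given below; the gain and saturation limit are illustrative values only, not parameters from the study.

```python
def adaptive_target_distance(nominal, closing_speed, gain=0.3, limit=0.2):
    """Bias the target distance with the user-robot closing speed.

    When the user approaches the robot (closing_speed > 0), the set distance
    is increased; when the user moves away, it is shortened. The gain and
    limit values are illustrative assumptions.
    """
    offset = max(-limit, min(limit, gain * closing_speed))
    return nominal + offset

# Example: user walking toward the front-accompanying robot at 0.5 m/s.
print(adaptive_target_distance(0.45, 0.5))   # target stretched to 0.60 m
```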
In this study, the selected distance sensor (SHARP GP2Y0A02YK0F) is relatively insensitive to environmental light. However, its sensing range is limited to 20–150 cm, and measuring errors may occur when the user walks too close to the robot. The absolute rotary encoder (P3015 series Hall rotary encoder) is little affected by friction. However, the retractable conducting wire may swing, and the user can rotate his/her body while moving; consequently, the sensor may not always measure the same point on the user. The trajectory figures (Figure 6, Figure 8, Figure 9 and Figure 10) show natural rotations and sideways movements of the user's pelvis; thus, the user's coordinate system was not steady, which caused fluctuating target positions in the WCS. There are three possible ways to remove unnecessary fluctuations of the robot: (1) use a laser range finder; (2) reduce the weight of the retractable wire or make the link wireless; and (3) reduce the loop rate of the controller. However, a low loop rate can compromise the trajectory tracking performance; adding a movement filter to smooth the motion may be a better option.
In addition to the controller and transducers, the mechanical design of the robot affects the tracking performance. First, the inertia and driving torque of the robot determine its achievable acceleration; actuator saturation occurs when the user moves too fast. This can be addressed by using higher-torque motors or by an anti-saturation control algorithm [25]. Driving motors with higher torque can respond more quickly to the user's sudden movements but consume more power and increase the system weight. Second, the 3 DOFs of the moving platform simplify the control strategy and save time and space in tracking the target position. In this study, this aspect was implemented with Mecanum wheels [23], but other biomimetic techniques [21,26] or unmanned aerial vehicles [22,27] could also be used.
The key point of the proposal was determining the positions of the user and the robot with respect to each other. We achieved this task using the novel “string-pot” sensor system, which is compact, reliable, and immune to environmental noise such as light. However, with this “string-pot” sensor system, the user-side sensor is not suitable to be worn on the lateral side of the user's body because the sensor and wire would interfere with the user's hand. To remove the wire, i.e., to create a wireless system, further challenges must be overcome, such as target user identification and user orientation determination. Facial recognition [28] may solve part of this problem, but issues such as the user looking around without changing the body's position while the robot follows behind must be considered. Obtaining the relative distances and orientations of the user and robot without a physical wire is possible: two inertial measurement units (IMUs) [18,19], one mounted on the user side and one on the robot side, together with a laser range finder [20], can provide information on the distance and orientation between the robot and the user.
The excursion trajectory of the robot is greatly affected by the target position and by the user movement being tracked. As shown in Figure 6e,f, the robot took a longer path while the user pivoted. Furthermore, as shown in Figure 9 and Figure 10, the user performed circular walks of similar length, but the robot trajectories differed: one was similar to the user's trajectory, and the other was larger than the user's trajectory. The reason is that one of the robot's target positions was on the walking curve/circle, and the other was outside of it. If the target point lies outside the circle about the instant center of the user's moving curve, the robot is expected to take a longer path than the user.

6. Conclusions

We have demonstrated that a robot with a single algorithm and a simple controller can accompany the user at a selected position while the user moves forward, backward, or sideways or turns sharply; i.e., the user can move naturally, and the robot maintains the designated position w.r.t. the user. This goal was achieved through three main factors: (1) knowledge of the positions of the user and the robot w.r.t. their local coordinate systems; (2) agile movement of the robot; and (3) a quickly updating feedback loop. In the future, by using wireless sensors, combining the approach with obstacle avoidance, and adding a high-level method to determine the robot's set position in $LCS_U$, the accompanying robot could further help the user in daily life activities.

Author Contributions

Conceptualization, H.-K.W. and C.-H.Y.; methodology, H.-K.W. and C.-H.Y.; software, H.-K.W.; validation, H.-K.W. and P.-Y.C.; formal analysis, H.-K.W., P.-Y.C. and H.-Y.W.; investigation, H.-K.W., P.-Y.C. and H.-Y.W.; resources, C.-H.Y.; data curation, H.-K.W., P.-Y.C. and H.-Y.W.; writing—original draft preparation, H.-K.W.; writing—review and editing, C.-H.Y.; visualization, H.-K.W.; supervision, C.-H.Y.; project administration, H.-K.W.; funding acquisition, C.-H.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Technology, Taiwan, grant numbers MOST 106-2218-E-010-001-, MOST 107-2218-E-010-001-, and MOST 109-2224-E-182-001-.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Publicly available datasets were analyzed in this study. This data can be found here: https://www.dropbox.com/s/jh1omfkuiuwf48j/VICON_data.zip?dl=0, accessed on 4 June 2021.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Islam, M.J.; Hong, J.; Sattar, J. Person-following by autonomous robots: A categorical overview. Int. J. Robot. Res. 2019, 38, 1581–1618.
  2. Nishimura, S.; Takemura, H.; Mizoguchi, H. Development of attachable modules for robotizing daily items—Person following shopping cart robot. In Proceedings of the 2007 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 15–18 December 2007.
  3. Travelmate Robotics. Available online: https://travelmaterobotics.com/ (accessed on 20 May 2021).
  4. Piaggio Fast Forward. Available online: https://www.piaggiofastforward.com/ (accessed on 19 May 2021).
  5. CaddyTrek. Available online: https://caddytrek.com/ (accessed on 20 May 2021).
  6. Jung, E.-J.; Yi, B.-J.; Yuta, S. Control algorithms for a mobile robot tracking a human in front. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012.
  7. Hu, J.-S.; Wang, J.-J.; Ho, D. Design of Sensing System and Anticipative Behavior for Human Following of Mobile Robots. IEEE Trans. Ind. Electron. 2014, 61, 1916–1927.
  8. Ferrer, G.; Garrell, A.; Sanfeliu, A. Robot companion: A social-force based approach with human awareness-navigation in crowded environments. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013.
  9. Mi, W.; Wang, X.; Ren, P.; Hou, C. A system for an anticipative front human following robot. In Proceedings of the International Conference on Artificial Intelligence and Robotics and the International Conference on Automation, Control and Robotics Engineering, Lisbon, Portugal, 29–31 July 2016.
  10. Nikdel, P.; Shrestha, R.; Vaughan, R. The hands-free push-cart: Autonomous following in front by predicting user trajectory around obstacles. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018.
  11. Fritsch, J.; Kleinehagenbrock, M.; Lang, S.; Fink, G.; Sagerer, G. Audiovisual person tracking with a mobile robot. 2004. Available online: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.9.7741&rep=rep1&type=pdf (accessed on 4 June 2020).
  12. Huskić, G.; Buck, S.; González, L.A.I.; Zell, A. Outdoor person following at higher speeds using a skid-steered mobile robot. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017.
  13. Triebel, R.; Arras, K.; Alami, R.; Beyer, L.; Breuers, S.; Chatila, R.; Chetouani, M.; Cremers, D.; Evers, V.; Fiore, M.; et al. SPENCER: A Socially Aware Service Robot for Passenger Guidance and Help in Busy Airports. In Field and Service Robotics; Springer: Cham, Switzerland, 2016; pp. 607–622.
  14. Jevtic, A.; Doisy, G.; Parmet, Y.; Edan, Y. Comparison of interaction modalities for mobile indoor robot guidance: Direct physical interaction, person following, and pointing control. IEEE Trans. Hum.-Mach. Syst. 2015, 45, 653–663.
  15. Chen, B.X.; Sahdev, R.; Tsotsos, J. Integrating Stereo Vision with a CNN Tracker for a Person-Following Robot. In International Conference on Computer Vision Systems; Springer: Cham, Switzerland, 2017.
  16. Jiang, S.; Yao, W.; Hong, Z.; Li, L.; Su, C.; Kuc, T.-Y. A Classification-Lock Tracking Strategy Allowing a Person-Following Robot to Operate in a Complicated Indoor Environment. Sensors 2018, 18, 3903.
  17. Wang, X.; Zhang, L.; Wang, D.; Hu, X. Person detection, tracking and following using stereo camera. In Proceedings of the Ninth International Conference on Graphic and Image Processing (ICGIP 2017), Qingdao, China, 14–16 October 2017.
  18. Cifuentes, C.A.; Frizera, A.; Carelli, R.; Bastos, T. Human–robot interaction based on wearable IMU sensor and laser range finder. Robot. Auton. Syst. 2014, 62, 1425–1439.
  19. Brookshire, J. Person Following Using Histograms of Oriented Gradients. Int. J. Soc. Robot. 2010, 2, 137–146.
  20. Chung, W.; Kim, H.; Yoo, Y.; Moon, C.; Park, J. The Detection and Following of Human Legs Through Inductive Approaches for a Mobile Robot With a Single Laser Range Finder. IEEE Trans. Ind. Electron. 2012, 59, 3156–3166.
  21. Dürr, V.; Arena, P.P.; Cruse, H.; Dallmann, C.J.; Drimus, A.; Hoinville, T.; Krause, T.; Mátéfi-Tempfli, S.; Paskarbeit, J.; Patanè, L.; et al. Integrative Biomimetics of Autonomous Hexapedal Locomotion. Front. Neurorobotics 2019, 13, 88.
  22. Xiao, X.; Fan, Y.; Dufek, J.; Murphy, R. Indoor UAV Localization Using a Tether. In Proceedings of the 2018 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Philadelphia, PA, USA, 6–8 August 2018.
  23. Dickerson, S.L.; Lapin, B.D. Control of an omni-directional robotic vehicle with Mecanum wheels. In Proceedings of the NTC ’91—National Telesystems Conference, Atlanta, GA, USA, 26–27 March 1991.
  24. Taheri, H.; Qiao, B.; Ghaeminezhad, N. Kinematic model of a four mecanum wheeled mobile robot. Int. J. Comput. Appl. 2015, 113, 6–9.
  25. Su, Y.; Zheng, C.; Mercorelli, P. Global Finite-Time Stabilization of Planar Linear Systems With Actuator Saturation. IEEE Trans. Circuits Syst. II Express Briefs 2017, 64, 947–951.
  26. Delcomyn, F.; Nelson, M.E. Architectures for a biomimetic hexapod robot. Robot. Auton. Syst. 2000, 30, 5–15.
  27. Chakrabarty, A.; Morris, R.; Bouyssounouse, X.; Hunt, R. Autonomous indoor object tracking with the Parrot AR.Drone. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016.
  28. Fukui, K.; Yamaguchi, O. Face recognition using multi-viewpoint patterns for robot vision. In Robotics Research: The Eleventh International Symposium; Springer: Berlin/Heidelberg, Germany, 2005.
Figure 1. The movements of the accompanying robot and the user. (a) The robot translates and rotates to the target position; (b) the user translates without rotation; (c) the user rotates in place without translation; (d) the user translates and rotates.
Figure 2. “String-pots” to sense $LCS_U$ and $LCS_R$.
Figure 3. Control system structure.
Figure 4. System structure block diagram.
Figure 5. Assembly of the tested accompanying robot.
Figure 6. Top-view motion graph of the movements of the robot and user w.r.t. the WCS. (a) Forward walking test; (b) backward walking test; (c) left lateral walking test; (d) right lateral walking test; (e) walking test with pivot turning to the left; (f) walking test with pivot turning to the right.
Figure 7. Relative distance between the robot and the user w.r.t. the LCS and encoder values $\theta_{Robot}$ and $\theta_{User}$ in the LCS. (a) Forward walking test; (b) backward walking test; (c) left lateral walking test; (d) right lateral walking test; (e) walking test with pivot turning to the left; (f) walking test with pivot turning to the right.
Figure 8. Top-view motion graph of walking along the rectangle.
Figure 9. Top-view motion graph of clockwise walking.
Figure 10. Top-view motion graph of counterclockwise walking.
Table 1. Result of Walking Tasks in the WCS (X, Y in mm; γ in °).

| Test | Start (X, Y, γ) | Stop (X, Y, γ) | Displacement (X, Y, γ) |
| --- | --- | --- | --- |
| Forward walking – user | 0, 0, 99.6 | −3, 1914, 91.8 | −3, 1914, −7.8 |
| Forward walking – robot | 24, 438, 78.1 | 90, 2361, 77.9 | 66, 1933, −0.2 |
| Backward walking – user | 0, 0, 92.4 | 28, −1980, 92.8 | 28, −1980, 0.4 |
| Backward walking – robot | 88, 451, 78.8 | 82, −1534, 81.9 | −6, −1985, 3.1 |
| Left lateral walking – user | 0, 0, 97.4 | −1520, 64, 92.0 | −1520, 64, −5.4 |
| Left lateral walking – robot | 220, 583, 95.7 | −1512, 582, 108.4 | −1732, −2, 12.7 |
| Right lateral walking – user | 0, 0, 92 | 1558, −34, 90.4 | 1558, −34, −1.8 |
| Right lateral walking – robot | −8, 447, 83.8 | 1658, 363, 77.9 | 1666, −84, −5.9 |
| Left pivot turning – user | 0, 0, 97.2 | −265, −222, 170.9 | −266, −222, 73.7 |
| Left pivot turning – robot | 10, 417, 95.9 | −796, −279, 167.5 | −807, −696, 71.6 |
| Right pivot turning – user | 0, 0, 95.5 | 309, −165, 3.4 | 309, −165, −92.1 |
| Right pivot turning – robot | 34, 588, 98.6 | 871, −209, 10.1 | 837, −797, −88.4 |
Table 2. Result of Walking Tasks in $LCS_U$ ($X_C$, $Y_C$ in mm; $\gamma_C$ in °).

| Test | Target/Start ($X_C$, $Y_C$, $\gamma_C$) | Stop ($X_C$, $Y_C$, $\gamma_C$) | Average During Walking ($X_C$, $Y_C$, $\gamma_C$) |
| --- | --- | --- | --- |
| Forward walking | 95, 418, −12.5 | 107, 443, −13.6 | 111 ± 33, 320 ± 40, −19.3 ± 6.5 |
| Backward walking | 106, 447, −13.2 | 75, 443, −9.7 | 128 ± 32, 574 ± 44, −12.5 ± 3.1 |
| Left lateral walking | 292, 550, −28.0 | 26, 517, −2.9 | 259 ± 31, 482 ± 36, −28.3 ± 3.3 |
| Right lateral walking | 9, 447, −0.7 | 101, 397, −14.3 | −159 ± 50, 422 ± 132, 0.7 ± 6.5 |
| Left pivot turning | 63, 412, −8.8 | −9, 533, 1.0 | 310 ± 167, 440 ± 26, −32.6 ± 14.9 |
| Right pivot turning | 91, 582, −8.8 | −36, 563, 3.7 | −106 ± 189, 536 ± 54, 11.5 ± 19.7 |
| Walking along a rectangle | −121, 384, 17.5 | −98, 376, 14.6 | −115 ± 125, 323 ± 85, 19.8 ± 20.3 |
| Clockwise-curve walking | 42, 541, −4.4 | 73, 514, −8.0 | 11 ± 72, 347 ± 53, −1.8 ± 11.6 |
| Counterclockwise-curve walking | −51, 530, 5.5 | −147, 492, 16.6 | 14 ± 241, 315 ± 74, −1.6 ± 37.8 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
