Real-time imitation enables a humanoid robot to mirror human behavior, which is important for human–robot interaction applications. Imitation requires estimating the corresponding joint angles of the humanoid robot. A humanoid robot typically comprises dozens of joints, which form a high-dimensional exploration space for joint angle estimation. Although a particle filter can estimate the robot state and thus provides a solution for joint angle estimation, its computational cost becomes prohibitive in such a high-dimensional space. Moreover, a particle filter requires an accurate motion model to estimate joint angles reliably. To achieve accurate joint angle estimation at low computational cost, Gaussian process dynamical models (GPDMs) can be adopted: learning a GPDM from high-dimensional time-series motion data yields a compact latent state space together with a suitable motion model. We propose a GPDM-based particle filter that operates in this compact state space, learned from motion data, to efficiently estimate joint angles for robot imitation. Simulations and real experiments demonstrate that the proposed method estimates humanoid robot joint angles at low computational cost, enabling real-time imitation.
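The core idea of filtering in a learned low-dimensional space can be sketched as follows. This is a minimal illustration, not the paper's implementation: the functions `latent_dynamics` and `latent_to_joints` are hypothetical stand-ins for the GPDM posterior-mean predictions (latent dynamics and latent-to-pose mapping) that would be learned from motion-capture data, and all dimensions and noise levels are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def latent_dynamics(x):
    """Toy stand-in for the learned latent dynamics: a slow rotation in a
    2-D latent space. In a real GPDM this is a GP posterior mean."""
    c, s = np.cos(0.1), np.sin(0.1)
    return x @ np.array([[c, -s], [s, c]])

def latent_to_joints(x):
    """Toy stand-in for the learned observation map from the 2-D latent
    space to (here) 3 joint angles."""
    W = np.array([[0.5, -0.2, 0.8],
                  [0.3,  0.7, -0.4]])
    return x @ W  # shape: (n_particles, n_joints)

def gpdm_particle_filter(observations, n_particles=200,
                         proc_std=0.05, obs_std=0.1):
    """Run a particle filter in the compact latent space instead of the
    high-dimensional joint space; returns joint-angle estimates."""
    particles = rng.normal(0.0, 1.0, size=(n_particles, 2))
    estimates = []
    for z in observations:
        # Propagate particles with the latent motion model plus noise.
        particles = (latent_dynamics(particles)
                     + rng.normal(0.0, proc_std, particles.shape))
        # Weight each particle by the likelihood of the observed angles.
        pred = latent_to_joints(particles)
        sq_err = np.sum((pred - z) ** 2, axis=1)
        w = np.exp(-0.5 * sq_err / obs_std**2)
        w /= w.sum()
        # Weighted joint-angle estimate, then multinomial resampling.
        estimates.append(w @ pred)
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(estimates)
```

Because the particles live in the compact latent space (2-D here) rather than the dozens-of-joints space, far fewer particles are needed for the same coverage, which is what makes real-time operation feasible.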