Abstract
Recent advances in bioinspired robotics highlight the growing demand for dexterous, adaptive control strategies that allow robots to interact naturally, safely, and efficiently with dynamic, contact-rich environments. Yet achieving robust adaptability and reflex-like responsiveness to unpredictable disturbances remains a fundamental challenge. This paper presents a bioinspired imitation learning framework that models human adaptive dynamics to jointly acquire and generalize motion and force skills, enabling compliant and resilient robot behavior. The proposed framework integrates hybrid force-motion learning with dynamic response mechanisms, achieving broad skill generalization without reliance on external sensing modalities. A momentum-based force observer is combined with dynamic movement primitives (DMPs) to enable accurate force estimation and smooth motion coordination, while a broad learning system (BLS) refines the DMP forcing function through style modulation, feature augmentation, and adaptive weight tuning. In addition, an adaptive radial basis function neural network (RBFNN) controller dynamically adjusts control parameters to ensure precise, low-latency skill reproduction and safe physical interaction. Simulations and real-world experiments confirm that the proposed framework achieves human-like adaptability, robustness, and scalability, attaining a competitive learning time of 5.56 s and a rapid generation time of 0.036 s. These results demonstrate its efficiency and practicality for real-time applications and offer a lightweight yet powerful solution for bioinspired intelligent control in complex, unstructured environments.
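For orientation, the DMP forcing function that the BLS refines can be read against the standard discrete DMP formulation sketched below; this is a generic reference using the conventional notation (gains $\alpha_z$, $\beta_z$, canonical decay $\alpha_x$, Gaussian basis functions $\psi_i$ with weights $w_i$), not the paper's exact equations, and the specific variant learned here may differ.

\begin{align}
  \tau \dot{z} &= \alpha_z\bigl(\beta_z (g - y) - z\bigr) + f(x), \qquad
  \tau \dot{y} = z, \qquad
  \tau \dot{x} = -\alpha_x x, \\
  f(x) &= \frac{\sum_{i=1}^{N} \psi_i(x)\, w_i}{\sum_{i=1}^{N} \psi_i(x)}\; x\,(g - y_0), \qquad
  \psi_i(x) = \exp\!\bigl(-h_i (x - c_i)^2\bigr),
\end{align}

where $y$ is the motion state, $g$ the goal, and $x$ the canonical phase variable. In this generic form, the weights $w_i$ of the forcing function $f(x)$ are the quantities that style modulation, feature augmentation, and adaptive weight tuning in the BLS would act upon.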