Design of Demonstration-Driven Assembling Manipulator
Abstract
Currently, a mechanical arm or manipulator must be programmed by humans in advance to define its motion trajectory before practical use. However, this programming is tedious and costly, which prevents such manipulators from switching between different tasks easily and quickly. This article focuses on the design of a vision-guided manipulator that requires no explicit human programming. The proposed demonstration-driven system consists mainly of a manipulator, a control box, and a camera. Instead of programming the detailed motion trajectory, the operator only needs to show the system how to perform a given task manually. Using internal object-recognition and motion-detection algorithms, the camera captures the information of the task to be performed and generates motion trajectories that make the manipulator copy the human demonstration. The movement of the manipulator's joints is computed by a trajectory planner in the control box. Experimental results show that the system can imitate humans easily, quickly, and accurately on common tasks such as sorting and assembling objects. Teaching the manipulator the desired motion eliminates the complexity of programming for motion control.
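The abstract describes a pipeline in which a demonstration captured by the camera is reduced to waypoints, and a trajectory planner in the control box turns those waypoints into joint motion. The sketch below is a highly simplified, hypothetical illustration of that flow, not the paper's actual algorithm: `extract_waypoints` stands in for the motion-detection stage (keeping only poses that moved noticeably), and `plan_trajectory` stands in for the planner (plain linear interpolation between waypoints). All names and thresholds are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Waypoint:
    # Cartesian end-effector position observed during the demonstration
    x: float
    y: float
    z: float

def extract_waypoints(demo_poses: List[Tuple[float, float, float]],
                      min_step: float = 0.01) -> List[Waypoint]:
    """Stand-in for the motion-detection stage: keep only poses that
    moved more than min_step (in any axis) since the last kept pose."""
    waypoints: List[Waypoint] = []
    for x, y, z in demo_poses:
        if not waypoints or max(abs(x - waypoints[-1].x),
                                abs(y - waypoints[-1].y),
                                abs(z - waypoints[-1].z)) > min_step:
            waypoints.append(Waypoint(x, y, z))
    return waypoints

def plan_trajectory(waypoints: List[Waypoint],
                    steps_per_segment: int = 10) -> List[Tuple[float, float, float]]:
    """Stand-in for the trajectory planner: linearly interpolate
    between consecutive waypoints to produce a dense path."""
    traj: List[Tuple[float, float, float]] = []
    for a, b in zip(waypoints, waypoints[1:]):
        for i in range(steps_per_segment):
            t = i / steps_per_segment
            traj.append((a.x + t * (b.x - a.x),
                         a.y + t * (b.y - a.y),
                         a.z + t * (b.z - a.z)))
    if waypoints:  # close the path at the final demonstrated pose
        traj.append((waypoints[-1].x, waypoints[-1].y, waypoints[-1].z))
    return traj
```

In a real system the waypoint filter would be replaced by the paper's object-recognition and motion-detection algorithms, and the planner would interpolate in joint space with velocity and acceleration limits rather than straight lines in Cartesian space.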
Cite This Article
Wei, Q.; Yang, C.; Fan, W.; Zhao, Y. Design of Demonstration-Driven Assembling Manipulator. Appl. Sci. 2018, 8, 797.