Open Access Article
Sensors 2016, 16(5), 640; doi:10.3390/s16050640

3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands

1 Computer Science Research Institute, University of Alicante, San Vicente del Raspeig, Alicante 03690, Spain
2 Physics, Systems Engineering and Signal Theory Department, University of Alicante, San Vicente del Raspeig, Alicante 03690, Spain
* Author to whom correspondence should be addressed.
Academic Editor: Gonzalo Pajares Martinsanz
Received: 22 December 2015 / Revised: 27 April 2016 / Accepted: 29 April 2016 / Published: 5 May 2016
(This article belongs to the Special Issue State-of-the-Art Sensors Technology in Spain 2015)

Abstract

Sensing techniques are important for solving problems of uncertainty inherent to intelligent grasping tasks. The main goal here is to present a visual sensing system based on range imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system for complex grasping tasks, supporting the robot controller when other sensor systems, such as tactile and force sensors, cannot obtain data that are useful for the manipulation task. In particular, a new visual approach based on RGBD data was implemented to help a robot controller carry out intelligent manipulation tasks with flexible objects. The proposed method supervises the interaction between the grasped object and the robot hand in order to avoid poor contact between the fingertips and the object when neither force nor pressure data are available. The approach is also used to measure changes in the shape of an object’s surfaces, allowing us to detect deformations caused by inappropriate pressure applied by the hand’s fingers. Tests were carried out on grasping tasks involving several flexible household objects with a multi-fingered robot hand working in real time. Our approach generates pulses from the deformation detection method and sends an event message to the robot controller when surface deformation is detected. In comparison with other methods, the results show that our visual pipeline does not require deformation models of the objects and materials, and that it works in real time with both planar and 3D household objects. In addition, our method does not depend on the pose of the robot hand, because the location of the reference system is computed by recognizing a pattern placed on the robot forearm. The experiments demonstrate that the proposed method achieves good monitoring of grasping tasks with several objects and different grasping configurations in indoor environments.
Keywords: visual perception; vision algorithms for grasping; 3D-object recognition; sensing for robot manipulation
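To make the deformation-detection idea concrete, the following is a minimal Python sketch, not the authors' implementation: it compares successive RGBD surface patches of the grasped object against an initial reference patch and emits an event when a point-to-plane deformation score exceeds a threshold, mimicking the event messages sent to the robot controller described in the abstract. All function names, the threshold value, and the synthetic data are illustrative assumptions.

import numpy as np

def fit_plane(points):
    # Least-squares plane fit: returns (centroid, unit normal).
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]  # normal = direction of least variance

def deformation_score(reference_points, current_points):
    # Mean absolute point-to-plane distance of the current patch with
    # respect to a plane fitted on the reference (undeformed) patch.
    centroid, normal = fit_plane(reference_points)
    distances = np.abs((current_points - centroid) @ normal)
    return distances.mean()

def monitor_grasp(frames, threshold=0.002):
    # Yield (frame index, score) whenever the deformation score exceeds
    # `threshold` (metres); a robot controller would react to each event.
    reference = frames[0]
    for i, patch in enumerate(frames[1:], start=1):
        score = deformation_score(reference, patch)
        if score > threshold:
            yield i, score

# Usage with synthetic data: a flat 10 cm patch that bulges over time.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xy = rng.uniform(-0.05, 0.05, size=(500, 2))
    frames = []
    for t in range(10):
        bulge = 0.002 * t * np.exp(-np.sum(xy ** 2, axis=1) / 0.001)
        frames.append(np.column_stack([xy, bulge]))
    for idx, score in monitor_grasp(frames):
        print(f"deformation event at frame {idx}: score = {score:.4f} m")

In the real system the patches would come from segmenting the grasped object in consecutive RGBD frames, expressed in a reference frame recovered from the pattern on the robot forearm rather than from the camera pose.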
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. (CC BY 4.0).

Share & Cite This Article

MDPI and ACS Style

Mateo, C.M.; Gil, P.; Torres, F. 3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands. Sensors 2016, 16, 640.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
