Simultaneous Robot–World and Hand–Eye Calibration without a Calibration Object

1 Institute of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications, Beijing 100876, China
2 Key Laboratory of the Ministry of Education for Optoelectronic Measurement Technology and Instrument, Beijing Information Science and Technology University, Beijing 100192, China
* Author to whom correspondence should be addressed.
Sensors 2018, 18(11), 3949; https://doi.org/10.3390/s18113949
Received: 9 August 2018 / Revised: 29 October 2018 / Accepted: 5 November 2018 / Published: 15 November 2018
(This article belongs to the Section Physical Sensors)

Abstract

An extended robot–world and hand–eye calibration method is proposed in this paper to evaluate the transformation relationship between the camera and the robot device. The approach is intended for mobile or medical robotics applications in which a precise calibration object is too expensive or cannot be kept sterile, or in which sufficient movement space is unavailable at the work site. Firstly, a mathematical model is established to formulate the robot-gripper-to-camera rigid transformation and the robot-base-to-world rigid transformation using the Kronecker product. Subsequently, a sparse bundle adjustment is introduced to optimize the robot–world and hand–eye calibration as well as the reconstruction results. Finally, a validation experiment based on two kinds of real data sets is designed to demonstrate the effectiveness and accuracy of the proposed approach. The relative translation error of the rigid transformation is less than 8/10,000 for a Denso robot moving within a 1.3 m × 1.3 m × 1.2 m range, and the mean distance-measurement error after three-dimensional reconstruction is 0.13 mm.
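As an illustration only, and not the authors' implementation, the sketch below shows the standard Kronecker-product closed form for the A_i X = Z B_i problem the abstract refers to. It assumes each A_i is the gripper pose in the robot-base frame and each B_i is the camera pose in the world frame (conventions vary between papers), with all poses given as 4 × 4 homogeneous matrices.

import numpy as np

def solve_ax_zb(A_list, B_list):
    """Closed-form estimate of X (gripper->camera) and Z (world->base) from A_i X = Z B_i."""
    n = len(A_list)
    # Rotation part: vec(R_Z) = (R_B kron R_A) vec(R_X) for every pose pair,
    # stacked as [R_B kron R_A | -I9] so an SVD null vector holds both unknowns.
    M = np.zeros((9 * n, 18))
    for i, (A, B) in enumerate(zip(A_list, B_list)):
        M[9*i:9*i+9, :9] = np.kron(B[:3, :3], A[:3, :3])
        M[9*i:9*i+9, 9:] = -np.eye(9)
    v = np.linalg.svd(M)[2][-1]
    def to_rotation(m):
        # Project the (scaled) 3x3 block onto SO(3); 'F' order matches column-stacked vec().
        U, _, Vt = np.linalg.svd(m.reshape(3, 3, order='F'))
        R = U @ Vt
        return R * np.sign(np.linalg.det(R))
    RX, RZ = to_rotation(v[:9]), to_rotation(v[9:])
    # Translation part: R_A t_X - t_Z = R_Z t_B - t_A, solved by linear least squares.
    C = np.zeros((3 * n, 6))
    d = np.zeros(3 * n)
    for i, (A, B) in enumerate(zip(A_list, B_list)):
        C[3*i:3*i+3, :3] = A[:3, :3]
        C[3*i:3*i+3, 3:] = -np.eye(3)
        d[3*i:3*i+3] = RZ @ B[:3, 3] - A[:3, 3]
    t = np.linalg.lstsq(C, d, rcond=None)[0]
    X, Z = np.eye(4), np.eye(4)
    X[:3, :3], X[:3, 3] = RX, t[:3]
    Z[:3, :3], Z[:3, 3] = RZ, t[3:]
    return X, Z

In practice such a closed-form estimate is used only as an initial guess, since it treats rotation and translation separately and does not account for image noise.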
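The sparse bundle adjustment step can likewise be sketched. The snippet below is again an assumption-laden illustration rather than the paper's code: it assumes a pinhole intrinsic matrix K, axis-angle rotation parameters, and observations given as (frame, point, u, v) tuples, and it jointly refines X, Z, and the reconstructed 3D points with scipy.optimize.least_squares, exploiting the sparsity pattern in which each reprojection residual depends only on the two transforms and a single point.

import numpy as np
from scipy.optimize import least_squares
from scipy.sparse import lil_matrix

def rodrigues(r):
    """Rotation matrix from an axis-angle vector (Rodrigues formula)."""
    theta = np.linalg.norm(r)
    if theta < 1e-12:
        return np.eye(3)
    k = r / theta
    W = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * (W @ W)

def residuals(p, A_list, obs, K):
    """Reprojection residuals; p = [rX, tX, rZ, tZ, 3D points...]."""
    RX, tX = rodrigues(p[0:3]), p[3:6]
    RZ, tZ = rodrigues(p[6:9]), p[9:12]
    pts = p[12:].reshape(-1, 3)
    res = []
    for frame, j, u, v in obs:
        RA, tA = A_list[frame][:3, :3], A_list[frame][:3, 3]
        # With A X = Z B, the world-to-camera transform is X^-1 A^-1 Z.
        p_cam = RX.T @ (RA.T @ (RZ @ pts[j] + tZ - tA) - tX)
        proj = K @ (p_cam / p_cam[2])
        res += [proj[0] - u, proj[1] - v]
    return np.asarray(res)

def jac_sparsity(n_points, obs):
    """Each residual row touches the 12 pose parameters and one 3D point."""
    S = lil_matrix((2 * len(obs), 12 + 3 * n_points), dtype=int)
    for r, (frame, j, _, _) in enumerate(obs):
        S[2*r:2*r+2, :12] = 1
        S[2*r:2*r+2, 12 + 3*j:15 + 3*j] = 1
    return S

# p0 would come from the closed-form solution above plus triangulated points:
# sol = least_squares(residuals, p0, jac_sparsity=jac_sparsity(n_pts, obs),
#                     method='trf', args=(A_list, obs, K))

Passing the sparsity pattern lets the solver build a finite-difference Jacobian cheaply, which is what makes the joint refinement of poses and points tractable for many frames.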
Keywords: robot–world calibration; hand–eye calibration; calibration object; Kronecker product; sparse bundle adjustment
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style

Li, W.; Dong, M.; Lu, N.; Lou, X.; Sun, P. Simultaneous Robot–World and Hand–Eye Calibration without a Calibration Object. Sensors 2018, 18, 3949.
