Open Access Article
Sensors 2017, 17(10), 2346; doi:10.3390/s17102346

Extrinsic Calibration of Camera and 2D Laser Sensors without Overlap

Department of Computer Engineering, the Hashemite University, Zarqa 13115, Jordan
Department of Mechatronics Engineering, the Hashemite University, Zarqa 13115, Jordan
Department of Computer and Information Sciences, Fordham University, New York, NY 10023, USA
Current address: Department of Computer Engineering, the Hashemite University, Zarqa 13115, Jordan.
Author to whom correspondence should be addressed.
Received: 12 September 2017 / Revised: 8 October 2017 / Accepted: 12 October 2017 / Published: 14 October 2017
(This article belongs to the Special Issue Imaging Depth Sensors—Sensors, Algorithms and Applications)


Extrinsic calibration of a camera and a 2D laser range finder (lidar) is crucial in sensor data fusion applications, for example, in SLAM algorithms used on mobile robot platforms. The fundamental challenge of extrinsic calibration arises when the camera and lidar do not overlap, i.e., do not share the same field of view. In this paper, we propose a novel and flexible approach for the extrinsic calibration of a camera–lidar system without overlap, which can be used for robotic platform self-calibration. The approach is based on the robot–world hand–eye calibration (RWHE) problem, which is known to have efficient and accurate solutions. First, the system was mapped to the RWHE calibration problem, modeled as the linear relationship AX = ZB, where X and Z are the unknown calibration matrices. Then, we computed the transformation matrix B, which was the main challenge in the above mapping; the computation is based on reasonable assumptions about the geometric structure of the calibration environment. The reliability and accuracy of the proposed approach were compared to a state-of-the-art method for extrinsic 2D lidar-to-camera calibration. Experimental results from real datasets indicate that the proposed approach provides better results, with L2-norm translational and rotational deviations of 314 mm and 0.12°, respectively.
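The AX = ZB relationship from the abstract can be illustrated numerically. The sketch below is not the authors' method; it is a minimal consistency check, assuming all poses are 4×4 homogeneous transforms and using random synthetic poses in place of real sensor measurements: given ground-truth calibration matrices X and Z, each pose pair (A, B) generated as A = Z B X⁻¹ satisfies AX = ZB by construction.

```python
import numpy as np

def random_pose(rng):
    # Random 4x4 homogeneous pose: rotation from QR of a Gaussian
    # matrix (flipped to det +1), plus a random translation.
    q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1.0
    pose = np.eye(4)
    pose[:3, :3] = q
    pose[:3, 3] = rng.standard_normal(3)
    return pose

rng = np.random.default_rng(0)
X = random_pose(rng)  # unknown calibration matrix (e.g. hand-eye transform)
Z = random_pose(rng)  # unknown calibration matrix (e.g. robot-world transform)

# Each measurement pair (A_i, B_i) must satisfy A_i X = Z B_i; a real
# solver recovers X and Z from many such pairs.
for _ in range(5):
    B = random_pose(rng)           # pose measured on one side of the chain
    A = Z @ B @ np.linalg.inv(X)   # consistent pose on the other side
    assert np.allclose(A @ X, Z @ B)
```

In practice the A_i come from one sensor's pose estimates and the B_i from the other (here, derived from the geometric structure of the calibration environment), and X, Z are recovered jointly, e.g. by stacking the linearized constraints into a least-squares system.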
Keywords: calibration; range sensing; mobile robot; mapping; 2D lidar sensor

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Ahmad Yousef, K.M.; Mohd, B.J.; Al-Widyan, K.; Hayajneh, T. Extrinsic Calibration of Camera and 2D Laser Sensors without Overlap. Sensors 2017, 17, 2346.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland