Open Access Article

Automatic Multi-Camera Extrinsic Parameter Calibration Based on Pedestrian Torsors

1 TELIN-IPI, Ghent University—imec, St-Pietersnieuwstraat 41, B-9000 Gent, Belgium
2 ETRO Department, Vrije Universiteit Brussel—imec, Pleinlaan 2, B-1050 Brussels, Belgium
3 CETC Key Laboratory of Aerospace Information Applications, Shijiazhuang 050000, China
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in Anh Minh Truong, Wilfried Philips, Junzhi Guan, Nikos Deligiannis, and Lusine Abrahamyan. Automatic Extrinsic Calibration of Camera Networks Based on Pedestrians. In Proceedings of the International Conference on Distributed Smart Cameras (ICDSC 2019), Trento, Italy, 9–11 September 2019.
These authors contributed equally to this work.
Sensors 2019, 19(22), 4989; https://doi.org/10.3390/s19224989
Received: 25 September 2019 / Revised: 4 November 2019 / Accepted: 12 November 2019 / Published: 15 November 2019
(This article belongs to the Special Issue Cooperative Camera Networks)
Extrinsic camera calibration is essential for any computer vision task in a camera network. Typically, researchers place a calibration object in the scene to calibrate all the cameras in a camera network. However, when installing cameras in the field, this approach can be costly and impractical, especially when recalibration is needed. This paper proposes a novel, accurate and fully automatic extrinsic calibration framework for camera networks with partially overlapping views. The proposed method considers the pedestrians in the observed scene as calibration objects and analyzes the pedestrian tracks to obtain the extrinsic parameters. Compared to the state of the art, the new method is fully automatic and robust in various environments. Our method detects human poses in the camera images and then models walking persons as vertical sticks. We apply a brute-force method to determine the correspondence between persons in multiple camera images. This information, along with the estimated 3D locations of the top and the bottom of the pedestrians, is then used to compute the extrinsic calibration matrices. We also propose a novel method to calibrate the camera network using only the top and the centerline of the person when the bottom of the person is not visible in heavily occluded scenes. We verified the robustness of the method in different camera setups and for both single and multiple walking people. The results show that a triangulation error of a few centimeters can be obtained. Typically, less than one minute of observing walking people is required to reach this accuracy in controlled environments, and only a few minutes are needed to collect enough calibration data in uncontrolled environments. The proposed method performs well in various situations, such as scenes with multiple people, occlusions, and even real street intersections.
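The calibration principle summarized in the abstract can be illustrated with a minimal, self-contained sketch (an assumption-laden toy example, not the authors' implementation): given matched head and foot image points of a walking person seen by two cameras with known intrinsics, the relative rotation and a unit-scale translation can be recovered from the essential matrix. The OpenCV-based example below uses purely synthetic pedestrian data; in practice the correspondences would come from pose detection, and the unknown metric scale could be fixed, for example, from a known average person height.

```python
# Minimal conceptual sketch (assumptions, not the paper's implementation):
# recover the relative pose between two cameras from matched pedestrian
# head/foot image points, using OpenCV's essential-matrix routines.
# All pedestrian data below are synthetic; both cameras are assumed to share
# the same known intrinsic matrix K, and translation is recovered up to scale.
import numpy as np
import cv2

rng = np.random.default_rng(0)

# Assumed (known) intrinsics, identical for both cameras in this toy example.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])

# Ground-truth relative pose of camera 2 w.r.t. camera 1 (X2 = R @ X1 + t).
a = np.deg2rad(20.0)
R_gt = np.array([[np.cos(a), 0.0, np.sin(a)],
                 [0.0, 1.0, 0.0],
                 [-np.sin(a), 0.0, np.cos(a)]])
t_gt = np.array([[-1.5], [0.0], [0.3]])

# Synthetic "torsor" endpoints: foot and head points along a curved walk,
# expressed in the camera-1 coordinate frame (y axis points down).
s = np.linspace(0.0, 1.0, 40)
feet = np.stack([4.0 * s - 2.0,                         # lateral position (m)
                 np.full_like(s, 1.0),                  # ground 1 m below cam 1
                 6.0 + 4.0 * s + 0.8 * np.sin(6 * s)],  # curved walking path
                axis=1)
heads = feet + np.array([0.0, -1.7, 0.0])               # a 1.7 m tall person
X1 = np.vstack([feet, heads])

def project(X_cam, K):
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixel coordinates."""
    x = (K @ X_cam.T).T
    return x[:, :2] / x[:, 2:3]

# Observed (noisy) image points of the same head/foot points in both cameras.
pts1 = project(X1, K) + rng.normal(0.0, 0.5, (len(X1), 2))
pts2 = project((R_gt @ X1.T + t_gt).T, K) + rng.normal(0.0, 0.5, (len(X1), 2))

# Estimate the essential matrix from the correspondences and decompose it
# into a rotation and a unit-norm translation using the cheirality check.
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                            prob=0.999, threshold=1.0)
_, R_est, t_est, _ = cv2.recoverPose(E, pts1, pts2, K)

rot_err = np.degrees(np.arccos(np.clip((np.trace(R_est.T @ R_gt) - 1) / 2, -1, 1)))
print("rotation error (deg):", rot_err)
print("estimated t direction:", t_est.ravel())
print("true t direction:     ", (t_gt / np.linalg.norm(t_gt)).ravel())
```

In a full camera network, such pairwise two-view estimates would still need a consistent metric scale and a common reference frame; the paper derives its calibration from the pedestrian observations themselves, so this sketch only conveys the underlying two-view geometry.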
Keywords: extrinsic calibration; camera network; pedestrians
MDPI and ACS Style

Truong, A.M.; Philips, W.; Deligiannis, N.; Abrahamyan, L.; Guan, J. Automatic Multi-Camera Extrinsic Parameter Calibration Based on Pedestrian Torsors. Sensors 2019, 19, 4989.

