Open Access Technical Note

Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2

by Michal Tölgyessy, Martin Dekan, Ľuboš Chovanec and Peter Hubinský
Institute of Robotics and Cybernetics, Faculty of Electrical Engineering and Information Technology, STU in Bratislava, Ilkovičova 3, 812 19 Bratislava, Slovakia
* Author to whom correspondence should be addressed.
Sensors 2021, 21(2), 413; https://doi.org/10.3390/s21020413
Received: 27 October 2020 / Revised: 14 December 2020 / Accepted: 4 January 2021 / Published: 8 January 2021
(This article belongs to the Section Physical Sensors)
The Azure Kinect is the successor to the Kinect v1 and Kinect v2. In this paper, we perform a brief data analysis and comparison of all three Kinect versions, focusing on precision (repeatability) and various aspects of noise. We then thoroughly evaluate the new Azure Kinect: namely, its warm-up time, precision (and the sources of its variability), accuracy (measured with a robotic arm), reflectivity (tested on 18 different materials), and the multipath and flying-pixel phenomena. Furthermore, we validate its performance in both indoor and outdoor environments, including direct and indirect sunlight conditions. We conclude with a discussion of its improvements in the context of the evolution of the Kinect sensor. We show that it is crucial to design accuracy experiments carefully, since the RGB and depth cameras are not aligned. Our measurements confirm the officially stated values, namely a standard deviation ≤ 17 mm and a distance error < 11 mm at distances of up to 3.5 m from the sensor in all four supported modes. The device, however, has to warm up for at least 40–50 min to give stable results. Because of its time-of-flight technology, the Azure Kinect cannot be used reliably in direct sunlight; it is therefore suited mostly to indoor applications.
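For readers who want to reproduce the repeatability figures reported above, the following sketch shows how per-pixel temporal standard deviation (precision) and a simple distance error against a known target distance can be computed from a stack of depth frames of a static scene. This is not the authors' evaluation code: the frame-capture step (e.g., via the Azure Kinect SDK or a wrapper such as pyk4a), the 3.5 m ground-truth distance, the region of interest, and the function name are illustrative assumptions.

    # Minimal sketch (not the authors' evaluation code): estimate precision
    # (temporal standard deviation per pixel) and a simple distance error
    # from N depth frames of a static, flat target at a known distance.
    import numpy as np

    def precision_and_error(frames_mm, ground_truth_mm, roi):
        """frames_mm: (N, H, W) depth frames in millimetres, 0 = invalid pixel.
        roi: (row_slice, col_slice) selecting a patch on the flat target."""
        stack = frames_mm[:, roi[0], roi[1]].astype(np.float64)
        valid = np.all(stack > 0, axis=0)            # pixels valid in every frame
        per_pixel_std = stack.std(axis=0)[valid]     # temporal noise per pixel
        mean_depth = stack.mean(axis=0)[valid]       # time-averaged depth per pixel
        precision_mm = float(per_pixel_std.mean())   # repeatability estimate
        distance_error_mm = float(np.abs(mean_depth - ground_truth_mm).mean())
        return precision_mm, distance_error_mm

    # Example with synthetic data standing in for captured frames
    # (100 frames, 576 x 640 resolution, target assumed at 3.5 m):
    rng = np.random.default_rng(0)
    frames = (3500 + rng.normal(0, 5, size=(100, 576, 640))).astype(np.uint16)
    roi = (slice(250, 320), slice(290, 350))         # central patch of the image
    print(precision_and_error(frames, ground_truth_mm=3500.0, roi=roi))

In a real measurement, the synthetic frames would be replaced by frames captured from the sensor after the warm-up period noted above, and the ground-truth distance would come from an external reference such as the robotic arm setup described in the paper.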
Keywords: Kinect; Azure Kinect; robotics; mapping; SLAM (simultaneous localization and mapping); HRI (human–robot interaction); 3D scanning; depth imaging; object recognition; gesture recognition

MDPI and ACS Style

Tölgyessy, M.; Dekan, M.; Chovanec, Ľ.; Hubinský, P. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. Sensors 2021, 21, 413. https://doi.org/10.3390/s21020413

AMA Style

Tölgyessy M, Dekan M, Chovanec Ľ, Hubinský P. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. Sensors. 2021; 21(2):413. https://doi.org/10.3390/s21020413

Chicago/Turabian Style

Tölgyessy, Michal, Martin Dekan, Ľuboš Chovanec, and Peter Hubinský. 2021. "Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2" Sensors 21, no. 2: 413. https://doi.org/10.3390/s21020413

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
