Open Access: This article is freely available.
Improvement of Kinect™ Sensor Capabilities by Fusion with Laser Sensing Data Using Octree
Alfredo Chávez * and Henrik Karstoft
Århus School of Engineering, Århus University, Finlandsgade 22, 8200 Århus N, Denmark
* Author to whom correspondence should be addressed.
Received: 21 January 2012; Revised: 20 February 2012; Accepted: 21 February 2012; Published: 26 March 2012
Abstract: To enhance sensor capabilities, sensor data readings from different modalities must be fused. The main contribution of this paper is a sensor data fusion approach that reduces the limitations of the Kinect™ sensor by combining it with a laser sensor. Sensor data are modelled in a 3D environment based on octrees, using probabilistic occupancy estimation. A Bayesian method, which takes into account the uncertainty inherent in the sensor measurements, is used to fuse the sensor information and update the 3D octree map. The fusion yields a significant increase in the field of view of the Kinect™ sensor, which can be exploited for robot tasks.
Keywords: sensor fusion; laser; Kinect™; 3D octree map; collaboration
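The Bayesian occupancy update the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' implementation: the log-odds form of the update and the example sensor probabilities (0.9 for the laser, 0.7 for the Kinect™) are assumptions, chosen because multiplying odds makes fusing independent measurements from two sensors a simple addition per voxel.

```python
import math

def prob_to_logodds(p):
    """Convert a probability to log-odds form."""
    return math.log(p / (1.0 - p))

def logodds_to_prob(l):
    """Convert log-odds back to a probability."""
    return 1.0 / (1.0 + math.exp(-l))

class OctreeNode:
    """A single octree voxel storing its occupancy estimate in log-odds.

    In log-odds form, a Bayesian update with an independent measurement
    is just an addition, so fusing laser and Kinect readings is cheap.
    """
    def __init__(self):
        self.logodds = 0.0  # prior P(occupied) = 0.5

    def update(self, p_hit):
        # Bayesian update: add the measurement's log-odds to the state
        self.logodds += prob_to_logodds(p_hit)

    @property
    def occupancy(self):
        return logodds_to_prob(self.logodds)

# Fuse two independent "occupied" readings for the same voxel:
node = OctreeNode()
node.update(0.9)  # laser reports occupied with P = 0.9
node.update(0.7)  # Kinect agrees with P = 0.7
print(round(node.occupancy, 3))  # fused estimate exceeds either sensor alone
```

Because the two readings agree, the fused probability (odds 9 × 7/3 = 21, i.e. 21/22 ≈ 0.955) is higher than either measurement on its own; a disagreeing reading would pull the log-odds back toward zero.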
Cite This Article
MDPI and ACS Style
Chávez, A.; Karstoft, H. Improvement of Kinect™ Sensor Capabilities by Fusion with Laser Sensing Data Using Octree. Sensors 2012, 12, 3868-3878.
Chávez A, Karstoft H. Improvement of Kinect™ Sensor Capabilities by Fusion with Laser Sensing Data Using Octree. Sensors. 2012; 12(4):3868-3878.
Chávez, Alfredo; Karstoft, Henrik. 2012. "Improvement of Kinect™ Sensor Capabilities by Fusion with Laser Sensing Data Using Octree." Sensors 12, no. 4: 3868-3878.