Open Access Article

Three-Dimensional Visualization System with Spatial Information for Navigation of Tele-Operated Robots

1 Graduate School of Convergence Science and Technology, Seoul National University, Seoul KR 08826, Korea
2 Intelligent Robotics Research Center, Korea Electronics Technology Institute, Bucheon KR 14502, Korea
3 Advanced Institutes of Convergence Technology, Suwon KR 16229, Korea
* Author to whom correspondence should be addressed.
Sensors 2019, 19(3), 746; https://doi.org/10.3390/s19030746
Received: 28 December 2018 / Revised: 18 January 2019 / Accepted: 29 January 2019 / Published: 12 February 2019
(This article belongs to the Special Issue Sensors Signal Processing and Visual Computing)

Abstract

This study describes a three-dimensional visualization system with spatial information for the effective control of a tele-operated robot. An environmental visualization system is essential for operating such a robot, because the tele-operated robot performs tasks in disaster areas that are not accessible to humans. The visualization system must run in real-time to cope with rapidly changing situations, and it must provide accurate, high-level information so that the tele-operator can make the right decisions. The proposed system consists of four fisheye cameras and a 360° laser scanner. When the robot moves into an unknown space, a spatial model is created from the laser scanner's range data, and a single stitched image is created from the four camera images and mapped onto the model in real-time. Because the visualized image contains the surrounding spatial information, the tele-operator can not only grasp the surrounding space easily, but also know the robot's relative position in that space. In addition, the system provides various angles of view without moving the robot or the sensors, allowing it to cope with various situations. The experimental results show that the proposed method produces a more natural appearance than conventional methods.
Keywords: 3D visualization; wrap around view monitoring; robot vision systems; tele-operated robots
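The pipeline summarized in the abstract (back-projecting fisheye pixels to rays and scaling them by laser ranges to place image data on a spatial model) can be illustrated in miniature. This is only a hedged sketch under assumed models: an equidistant fisheye projection (r = f·θ) and a single hypothetical 360-beam horizontal scan ring with one range per degree. The paper's actual camera calibration, stitching, and mesh construction are not reproduced here, and the function names are illustrative.

```python
import numpy as np

def fisheye_pixel_to_ray(u, v, cx, cy, f):
    """Back-project a fisheye pixel (u, v) to a unit 3D viewing ray.

    Uses the equidistant fisheye model r = f * theta, a common
    approximation; the paper's exact camera model may differ.
    """
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    theta = r / f                 # angle from the optical axis
    phi = np.arctan2(dy, dx)      # azimuth in the image plane
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def project_onto_scan(ray, scan_ranges):
    """Scale a unit ray by the laser range measured at its azimuth.

    Assumes a hypothetical horizontal scan with one range reading per
    degree (len(scan_ranges) == 360); the result is a 3D point on the
    spatial model where this pixel's color would be mapped.
    """
    azimuth_deg = int(np.degrees(np.arctan2(ray[1], ray[0]))) % 360
    return ray * scan_ranges[azimuth_deg]

# Synthetic example: 640x480 image, principal point at the center,
# focal length 200 px, and a room measured at 5 m in every direction.
ray = fisheye_pixel_to_ray(400, 240, 320, 240, 200.0)
point = project_onto_scan(ray, np.full(360, 5.0))
```

In a full system this per-pixel mapping would be done for all four cameras at once on the GPU, with overlapping fields of view blended into the single stitched texture the abstract describes.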

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Kim, S.-H.; Jung, C.; Park, J. Three-Dimensional Visualization System with Spatial Information for Navigation of Tele-Operated Robots. Sensors 2019, 19, 746.



Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.