Article

A Simulated Environment for Robot Vision Experiments

1 Department of Computer Science and Computer Engineering, University of Texas at Arlington, Arlington, TX 76019, USA
2 Institute of Informatics and Telecommunications, National Centre for Scientific Research ‘Demokritos’, 15341 Agia Paraskevi, Greece
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in PETRA 2021, doi:10.1145/3453892.3462214.
Academic Editor: Tomohiro Fukuda
Technologies 2022, 10(1), 7; https://doi.org/10.3390/technologies10010007
Received: 30 November 2021 / Revised: 31 December 2021 / Accepted: 5 January 2022 / Published: 12 January 2022
(This article belongs to the Collection Selected Papers from the PETRA Conference Series)
Training on simulation data has proven invaluable in applying machine learning to robotics. When it comes to robot vision, however, simulated images cannot be used directly, no matter how realistic the rendering is: many physical parameters (temperature, humidity, wear and tear over time) vary and affect texture and lighting in ways that cannot be encoded in the simulation. In this article we propose a different approach to extracting value from simulated environments: although neither the trained models nor the evaluation scores can be expected to carry over from simulated to physical data, the conclusions drawn from simulated experiments might still be valid. If so, simulated environments can be used for early-stage experimentation with different network architectures and features, expediting the early development phase before moving to (harder to conduct) physical experiments that evaluate the most promising approaches. To test this idea we created two simulated environments for the Unity engine, acquired simulated visual datasets, and used them to reproduce experiments originally carried out in a physical environment. The comparison of the conclusions drawn in the physical and the simulated experiments is promising regarding the validity of our approach.
Keywords: robot perception; machine learning; traversability estimation
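
The core claim of the abstract is that absolute evaluation scores will differ between simulated and physical data, but the relative ordering of candidate approaches may still transfer. A minimal sketch of how such rank agreement could be checked is shown below; the architecture names and scores are hypothetical placeholders, not results from the paper.

```python
# Illustrative sketch (not from the paper): checking whether the *conclusions*
# of simulated experiments agree with physical ones, even when raw scores differ.
from scipy.stats import spearmanr

# Hypothetical evaluation scores for the same candidate architectures,
# measured once on simulated data and once on physical data.
simulated_scores = {"arch_a": 0.91, "arch_b": 0.78, "arch_c": 0.85}
physical_scores  = {"arch_a": 0.74, "arch_b": 0.58, "arch_c": 0.69}

architectures = sorted(simulated_scores)
sim = [simulated_scores[a] for a in architectures]
phy = [physical_scores[a] for a in architectures]

# Absolute scores differ, but if the ranking of architectures is preserved,
# early-stage conclusions drawn in simulation would still hold.
rho, p_value = spearmanr(sim, phy)
print(f"Rank agreement (Spearman rho): {rho:.2f}, p-value: {p_value:.3f}")
```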
MDPI and ACS Style

Sevastopoulos, C.; Konstantopoulos, S.; Balaji, K.; Zaki Zadeh, M.; Makedon, F. A Simulated Environment for Robot Vision Experiments. Technologies 2022, 10, 7. https://doi.org/10.3390/technologies10010007

