Open Access Article

A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System

1 Key Laboratory of Biomimetic Robots and Systems, Ministry of Education, State Key Laboratory of Intelligent Control and Decision of Complex System, Beijing Advanced Innovation Center for Intelligent Robots and Systems, and School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
2 Department of Materials Engineering Science, Osaka University, Osaka 560-8531, Japan
3 Global Alliance Laboratory, The University of Electro-Communications, Tokyo 182-8585, Japan
* Author to whom correspondence should be addressed.
Academic Editor: Samer Mohammed
Sensors 2021, 21(4), 1536; https://doi.org/10.3390/s21041536
Received: 30 November 2020 / Revised: 28 January 2021 / Accepted: 17 February 2021 / Published: 23 February 2021
(This article belongs to the Special Issue Wearable Sensor for Activity Analysis and Context Recognition)
Wearable assistive devices for visually impaired people are an active research topic. Although many proposed wearable navigation devices can assist visually impaired people with obstacle avoidance and navigation, these devices cannot feed back detailed information about the obstacles or help the visually impaired understand the environment. In this paper, we propose a wearable navigation device for the visually impaired that integrates semantic visual SLAM (Simultaneous Localization and Mapping) with a newly launched, powerful mobile computing platform. The system uses a structured-light RGB-D (color plus depth) camera as its sensor and the mobile computing platform as its control center. We focus on combining SLAM with the extraction of semantic information from the environment, so that the computing platform understands the surrounding environment in real time and can feed it back to the visually impaired user as voice prompts. Finally, we tested the performance of the proposed semantic visual SLAM system on this device. The results indicate that the system can run in real time on a wearable navigation device with sufficient accuracy.
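The abstract's central idea, attaching per-pixel semantic labels from a segmentation network to the 3D points that a visual SLAM system maintains, can be sketched as follows. This is a minimal illustration under assumed values, not the authors' implementation: the intrinsics (FX, FY, CX, CY), the depth scale, and the run_segmentation placeholder are all hypothetical stand-ins for the paper's actual camera calibration and segmentation network.

```python
# Minimal sketch (not the authors' implementation) of fusing per-pixel
# semantic labels with depth to produce labeled 3D map points, assuming
# a pinhole RGB-D camera model.
import numpy as np

# Assumed camera intrinsics: focal lengths (FX, FY), principal point (CX, CY).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def run_segmentation(rgb: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a real-time semantic segmentation network.

    Returns a per-pixel class-ID map with the same height/width as `rgb`.
    Here it returns class 0 ("background") everywhere as a placeholder.
    """
    return np.zeros(rgb.shape[:2], dtype=np.int32)

def backproject_labeled_points(rgb, depth, depth_scale=0.001):
    """Back-project a depth image into camera-frame 3D points, attaching
    each pixel's semantic class. Invalid (zero) depths are dropped."""
    labels = run_segmentation(rgb)
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth.astype(np.float64) * depth_scale        # raw units -> meters
    valid = z > 0
    x = (u - CX) * z / FX                             # pinhole back-projection
    y = (v - CY) * z / FY
    points = np.stack([x[valid], y[valid], z[valid]], axis=-1)
    return points, labels[valid]

if __name__ == "__main__":
    # Synthetic 640x480 frame pair standing in for one RGB-D capture.
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)
    depth = np.full((480, 640), 1500, dtype=np.uint16)  # 1.5 m everywhere
    pts, cls = backproject_labeled_points(rgb, depth)
    print(pts.shape, cls.shape)  # (307200, 3) (307200,)
```

In a full pipeline along the lines the abstract describes, these labeled points would be fused into the SLAM map across frames, and the resulting semantic map would drive the voice feedback (e.g., announcing the class and rough direction of a nearby obstacle through a text-to-speech engine).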
Keywords: wearable device; semantic segmentation; SLAM; assistance for visually impaired people; localization; semantic map

MDPI and ACS Style

Chen, Z.; Liu, X.; Kojima, M.; Huang, Q.; Arai, T. A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System. Sensors 2021, 21, 1536. https://doi.org/10.3390/s21041536

AMA Style

Chen Z, Liu X, Kojima M, Huang Q, Arai T. A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System. Sensors. 2021; 21(4):1536. https://doi.org/10.3390/s21041536

Chicago/Turabian Style

Chen, Zhuo, Xiaoming Liu, Masaru Kojima, Qiang Huang, and Tatsuo Arai. 2021. "A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System" Sensors 21, no. 4: 1536. https://doi.org/10.3390/s21041536
