Open Access Article

Wearable Travel Aid for Environment Perception and Navigation of Visually Impaired People

1 School of Electronic Information Engineering, Beihang University, No. 37, Xueyuan Rd., Haidian District, Beijing 10083, China
2 Department of AI, CloudMinds Technologies Inc., Beijing 100102, China
3 China Academy of Telecommunication Technology, Beijing 10083, China
* Author to whom correspondence should be addressed.
Electronics 2019, 8(6), 697; https://doi.org/10.3390/electronics8060697
Received: 31 May 2019 / Revised: 18 June 2019 / Accepted: 19 June 2019 / Published: 20 June 2019
(This article belongs to the Special Issue Wearable Electronic Devices)
PDF [5643 KB, uploaded 20 June 2019]
Abstract

Assistive devices for visually impaired people (VIP) that support daily traveling and improve social inclusion are developing rapidly. Most of them address navigation or obstacle avoidance, while other works focus on helping VIP recognize surrounding objects. However, very few couple both capabilities (i.e., navigation and recognition). To address these needs, this paper presents a wearable assistive device that allows VIP to (i) navigate safely and quickly in unfamiliar environments, and (ii) recognize objects in both indoor and outdoor environments. The device consists of a consumer Red, Green, Blue and Depth (RGB-D) camera and an Inertial Measurement Unit (IMU), which are mounted on a pair of eyeglasses, and a smartphone. The device leverages the ground height continuity among adjacent image frames to segment the ground accurately and rapidly, and then searches for a passable moving direction on the segmented ground. A lightweight Convolutional Neural Network (CNN)-based object recognition system is developed and deployed on the smartphone to increase the perception ability of VIP and enhance the navigation system. It provides semantic information about the surroundings, such as the categories, locations, and orientations of objects. Human–machine interaction is performed through an audio module (a beeping sound for obstacle alerts, speech recognition for understanding user commands, and speech synthesis for expressing semantic information about the surroundings). We evaluated the performance of the proposed system through experiments conducted in both indoor and outdoor scenarios, demonstrating the efficiency and safety of the proposed assistive system.
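The ground-segmentation idea summarized above (exploiting ground height continuity across adjacent frames) can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the function name, the tolerance value, and the per-pixel height map (which in practice would be derived from the RGB-D depth image and the IMU gravity direction) are all assumptions made here purely for illustration.

```python
import numpy as np

def segment_ground(height_map, prev_ground_height, tol=0.05):
    # A pixel is labeled ground if its estimated height (in meters) stays
    # within `tol` of the ground height tracked from the previous frame.
    mask = np.abs(height_map - prev_ground_height) < tol
    # Update the tracked ground height from the pixels labeled ground, so
    # the estimate follows slow terrain changes from frame to frame.
    new_height = height_map[mask].mean() if mask.any() else prev_ground_height
    return mask, new_height

# Toy frame: flat ground at height ~0 m with one raised obstacle pixel.
h = np.zeros((4, 4))
h[0, 0] = 0.5  # hypothetical obstacle 0.5 m above the ground
mask, g = segment_ground(h, prev_ground_height=0.0)
```

In this sketch, the obstacle pixel is excluded from the ground mask, and the remaining ground pixels could then be searched for a free moving direction.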
Keywords: wearable assistive device; blind navigation; object recognition; visually impaired people; ground segmentation
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Bai, J.; Liu, Z.; Lin, Y.; Li, Y.; Lian, S.; Liu, D. Wearable Travel Aid for Environment Perception and Navigation of Visually Impaired People. Electronics 2019, 8, 697.

Electronics EISSN 2079-9292, published by MDPI AG, Basel, Switzerland