Open Access Article

A Comparative Study in Real-Time Scene Sonification for Visually Impaired People

1 National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
2 Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology, 76131 Karlsruhe, Germany
3 School of Computing and Engineering, University of Huddersfield, Huddersfield HD1 3DH, UK
* Author to whom correspondence should be addressed.
Sensors 2020, 20(11), 3222; https://doi.org/10.3390/s20113222
Received: 25 April 2020 / Revised: 1 June 2020 / Accepted: 2 June 2020 / Published: 5 June 2020
(This article belongs to the Special Issue Human-Machine Interaction and Sensors)
In recent years, with the development of depth cameras and scene detection algorithms, a wide variety of electronic travel aids for visually impaired people have been proposed. However, it is still challenging to convey scene information to visually impaired people efficiently. In this paper, we propose three auditory-based interaction methods, namely depth image sonification, obstacle sonification, and path sonification, which convey raw depth images, obstacle information, and path information to visually impaired people, respectively. The three sonification methods are compared comprehensively through a field experiment attended by twelve visually impaired participants. The results show that the sonification of high-level scene information, such as the direction of a pathway, is easier to learn and adapt to, and is more suitable for point-to-point navigation. In contrast, through the sonification of low-level scene information, such as raw depth images, visually impaired people can understand the surrounding environment more comprehensively. Furthermore, no single interaction method was best suited for all participants in the experiment, and visually impaired individuals need a period of time to find the most suitable interaction method. Our findings highlight the features and differences of the three scene detection algorithms and the corresponding sonification methods. The results provide insights into the design of electronic travel aids, and the conclusions can also be applied in other fields, such as the sound feedback of virtual reality applications.
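The paper does not specify its exact audio mapping, but the general idea of depth image sonification can be illustrated with a minimal sketch: each column of a depth row is assigned a stereo pan position, and nearer obstacles are mapped to higher pitches so that close hazards stand out. The function name, parameter ranges, and the linear depth-to-frequency mapping below are illustrative assumptions, not the authors' method.

```python
def sonify_depth_row(depth_row, f_min=200.0, f_max=2000.0, d_max=5.0):
    """Toy depth-image sonification (illustrative, not the paper's mapping).

    Maps one row of depth values (in meters) to a list of (frequency_hz, pan)
    tone parameters: smaller depth (nearer obstacle) -> higher frequency;
    column index -> stereo pan from -1.0 (left) to +1.0 (right).
    """
    n = len(depth_row)
    tones = []
    for i, d in enumerate(depth_row):
        d = min(max(float(d), 0.0), d_max)          # clamp depth to [0, d_max]
        freq = f_min + (1.0 - d / d_max) * (f_max - f_min)
        pan = -1.0 if n == 1 else -1.0 + 2.0 * i / (n - 1)
        tones.append((freq, pan))
    return tones
```

For example, `sonify_depth_row([0.0, 2.5, 5.0])` yields a high tone panned left for the nearest pixel and a low tone panned right for the farthest, which a synthesizer or spatial-audio backend could then render.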
Keywords: electronic travel aid; visually impaired people; sonification; scene detection
Hu, W.; Wang, K.; Yang, K.; Cheng, R.; Ye, Y.; Sun, L.; Xu, Z. A Comparative Study in Real-Time Scene Sonification for Visually Impaired People. Sensors 2020, 20, 3222.

