Article

Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera

by Zihao Ji, Weijian Hu, Ze Wang, Kailun Yang and Kaiwei Wang
1 National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
2 Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology, 76131 Karlsruhe, Germany
* Author to whom correspondence should be addressed.
Academic Editor: Francisco José García-Peñalvo
Sensors 2021, 21(10), 3558; https://doi.org/10.3390/s21103558
Received: 7 April 2021 / Revised: 11 May 2021 / Accepted: 12 May 2021 / Published: 20 May 2021
(This article belongs to the Special Issue Sensors and Technological Ecosystems for eHealth)
Scene sonification is a powerful technique to help Visually Impaired People (VIP) understand their surroundings. Existing methods usually sonify either the entire image of the surrounding scene acquired by a standard camera or static obstacles detected a priori by image processing algorithms on the RGB image of the scene. However, delivering all the information in the scene to VIP simultaneously causes information redundancy. In fact, biological vision is more sensitive to moving objects than to static ones, which is also the original motivation behind the event-based camera. In this paper, we propose a real-time sonification framework to help VIP understand the moving objects in the scene. First, we capture the events in the scene using an event-based camera and cluster them into multiple moving objects without relying on any prior knowledge. Then, MIDI-based sonification is performed on these objects synchronously. Finally, we conduct comprehensive experiments on scene videos with sonification audio, attended by 20 VIP and 20 Sighted People (SP). The results show that our method allows both groups of participants to clearly distinguish the number, size, motion speed, and motion trajectories of multiple objects, and that it is aesthetically more comfortable to hear than existing methods.
Keywords: event-based camera; computer vision for visually impaired people; sonification; unsupervised object tracking
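The abstract outlines a two-stage pipeline: unsupervised clustering of camera events into moving objects, followed by synchronous MIDI sonification of each object. The Python sketch below is a minimal illustration of that idea, not the authors' implementation: the abstract does not name the clustering algorithm, so DBSCAN over spatio-temporal event coordinates stands in for it, and the pitch/velocity mappings, the 346-pixel sensor width, and the batch-driven usage loop are all illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN
import mido

# Hypothetical event record: (x, y, t_us, polarity), as produced by a
# DVS-style event camera driver. DBSCAN over spatio-temporal coordinates
# is a common stand-in for unsupervised, prior-free grouping of events
# into moving objects; the paper's exact method may differ.

def cluster_events(events, eps=8.0, min_samples=20, time_scale=1e-3):
    """Group a batch of events into candidate moving objects.

    events: (N, 4) array of (x, y, t_us, polarity).
    time_scale: rescales microsecond timestamps into units comparable
    to pixels, so DBSCAN's single eps applies across all three axes.
    """
    xyt = np.column_stack([events[:, 0],
                           events[:, 1],
                           events[:, 2] * time_scale])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(xyt).labels_
    # Label -1 marks noise events that belong to no object.
    return [events[labels == k] for k in set(labels) if k != -1]

def sonify_objects(objects, port, frame_width=346):
    """Emit one MIDI note per object; all mapping choices are illustrative."""
    for obj in objects:
        cx = obj[:, 0].mean()            # horizontal centroid of the object
        size = len(obj)                  # event count as a proxy for size
        # Horizontal position -> pitch (left = low, right = high).
        note = int(40 + 48 * cx / frame_width)
        # Larger objects -> louder notes, clipped to the MIDI range.
        velocity = int(min(127, 30 + size // 10))
        port.send(mido.Message('note_on', note=note, velocity=velocity))

# Usage sketch: stream short event batches (e.g., 30 ms windows)
# through both stages, assuming a `event_batches` iterable from the
# camera driver.
# port = mido.open_output()
# for batch in event_batches:
#     sonify_objects(cluster_events(batch), port)
```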
MDPI and ACS Style

Ji, Z.; Hu, W.; Wang, Z.; Yang, K.; Wang, K. Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera. Sensors 2021, 21, 3558. https://doi.org/10.3390/s21103558

AMA Style

Ji Z, Hu W, Wang Z, Yang K, Wang K. Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera. Sensors. 2021; 21(10):3558. https://doi.org/10.3390/s21103558

Chicago/Turabian Style

Ji, Zihao, Weijian Hu, Ze Wang, Kailun Yang, and Kaiwei Wang. 2021. "Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera" Sensors 21, no. 10: 3558. https://doi.org/10.3390/s21103558

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
