Visual Echolocation Concept for the Colorophone Sensory Substitution Device Using Virtual Reality
1 Consciousness Lab, Institute of Psychology, Jagiellonian University, 30-060 Kraków, Poland
2 Department of Electronic Systems, Norwegian University of Science and Technology, NO-7491 Trondheim, Norway
3 Jagiellonian Human-Centered Artificial Intelligence Laboratory, Jagiellonian University, 30-348 Kraków, Poland
4 Department of Process Control, AGH University of Science and Technology, 30-059 Kraków, Poland
* Author to whom correspondence should be addressed.
Sensors 2021, 21(1), 237; https://doi.org/10.3390/s21010237
Received: 15 November 2020 / Revised: 16 December 2020 / Accepted: 21 December 2020 / Published: 1 January 2021
(This article belongs to the Special Issue Human-Computer Interaction in Pervasive Computing Environments)
Detecting the characteristics of 3D scenes is considered one of the biggest challenges for visually impaired people, yet this ability is crucial for orientation and navigation in the natural environment. Although several Electronic Travel Aids aim at enhancing orientation and mobility for the blind, only a few of them convey both 2D and 3D information, including colour. Moreover, existing devices either focus on a small part of an image or allow interpretation of only a few points in the field of view. Here, we propose a concept of visual echolocation with integrated colour sonification as an extension of Colorophone, an assistive device for visually impaired people. The concept mimics the process of echolocation and thus provides 2D, 3D and, additionally, colour information about the whole scene. Even though the final implementation will be realised with a 3D camera, it is first simulated, as a proof of concept, using VIRCO, a Virtual Reality training and evaluation system for Colorophone. The first experiments showed that it is possible to sonify the colour and distance of the whole scene, which opens up the possibility of implementing the developed algorithm on a hardware-based stereo camera platform. An introductory user evaluation of the system has been conducted in order to assess the effectiveness of the proposed solution for perceiving the distance, position and colour of objects placed in Virtual Reality.
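The core idea of the abstract is to sonify each part of a scene by turning its colour and distance into sound, in analogy to an echo. As a minimal illustrative sketch (the specific mapping below is an assumption for demonstration, not the parameterisation used in the paper), one could map hue to pitch, attenuate amplitude with distance, and derive an echo-like delay from the round-trip travel time of sound:

```python
import colorsys

SPEED_OF_SOUND = 343.0  # m/s, used to derive an echo-like delay

def sonify_pixel(rgb, distance_m):
    """Map one RGB-D sample to illustrative sound parameters.

    Assumed mapping (hypothetical, for demonstration only):
    - hue is scaled linearly onto a 200-2000 Hz frequency range
    - amplitude falls off inversely with distance (farther = quieter)
    - delay equals the round-trip time of a sound pulse, as in echolocation
    """
    r, g, b = (c / 255.0 for c in rgb)
    hue, _, _ = colorsys.rgb_to_hsv(r, g, b)
    frequency_hz = 200.0 + hue * 1800.0           # colour -> pitch
    amplitude = 1.0 / max(distance_m, 1.0)        # distance -> loudness
    delay_s = 2.0 * distance_m / SPEED_OF_SOUND   # distance -> echo delay
    return frequency_hz, amplitude, delay_s

# Example: a pure red surface 2 m away
freq, amp, delay = sonify_pixel((255, 0, 0), 2.0)
```

In the system described here, such a mapping would be applied across the whole field of view of a (simulated or real) 3D camera rather than to a single pixel, so that the resulting soundscape carries 2D position, depth and colour simultaneously.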
Keywords:
3D camera; stereo vision; auditory SSD; distance sonification; colour sonification; 3D scene sonification; virtual reality
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Bizoń-Angov, P.; Osiński, D.; Wierzchoń, M.; Konieczny, J. Visual Echolocation Concept for the Colorophone Sensory Substitution Device Using Virtual Reality. Sensors 2021, 21, 237. https://doi.org/10.3390/s21010237