Article

A Multi-Platform Electronic Travel Aid Integrating Proxemic Sensing for the Visually Impaired

by Nathan Naidoo *,† and Mehrdad Ghaziasgar *,†
Department of Computer Science, University of the Western Cape, Bellville 7535, South Africa
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Technologies 2025, 13(12), 550; https://doi.org/10.3390/technologies13120550
Submission received: 27 October 2025 / Revised: 16 November 2025 / Accepted: 24 November 2025 / Published: 26 November 2025
(This article belongs to the Section Assistive Technologies)

Abstract

Visual impairment (VI) affects over two billion people globally, with prevalence increasing due to preventable conditions. To address mobility and navigation challenges, this study presents a multi-platform, multi-sensor Electronic Travel Aid (ETA) that integrates ultrasonic, LiDAR, and vision-based sensing across head-, torso-, and cane-mounted nodes. Grounded in orientation and mobility (OM) principles, the system delivers context-aware haptic and auditory feedback to enhance perception and independence for users with VI. The ETA employs a hardware–software co-design approach guided by proxemic theory, comprising three autonomous components (Glasses, Belt, and Cane nodes), each optimized for a distinct spatial zone while maintaining overlap for redundancy. Embedded ESP32 microcontrollers enable low-latency sensor fusion, providing real-time multi-modal user feedback. Static and dynamic experiments using a custom-built motion rig evaluated detection accuracy and feedback latency under repeatable laboratory conditions. Results demonstrate millimetre-level accuracy and sub-30 ms proximity-to-feedback latency across all nodes. The Cane node’s dual LiDAR achieved a coefficient of variation of at most 0.04%, while the Belt and Glasses nodes maintained mean detection errors below 1%. The validated tri-modal ETA architecture establishes a scalable, resilient framework for safe, real-time navigation, advancing sensory augmentation for individuals with VI.
Keywords: visual impairment; electronic travel aid; orientation and mobility; LiDAR; ultrasonic sensing; sensor fusion; assistive technology; proxemic framework
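
For context on the reported precision figures, the coefficient of variation (CV) is the ratio of the standard deviation to the mean of repeated measurements. The worked numbers below are an illustrative reading of the stated 0.04% bound under an assumed 1 m mean range, not values drawn from the paper itself.

\[
\mathrm{CV} = \frac{\sigma}{\mu} \le 0.04\% \;\Longrightarrow\; \sigma \le 4\times 10^{-4}\,\mu
\]

Under this assumption, a dual-LiDAR range reading with mean \(\mu = 1\,\mathrm{m}\) would have a standard deviation of at most about 0.4 mm, consistent with the millimetre-level accuracy reported for the Cane node.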


