A Multi-Platform Electronic Travel Aid Integrating Proxemic Sensing for the Visually Impaired
Abstract
1. Introduction
2. Related Work (Abridged)
3. Theoretical Framework
4. Materials and Methods (Abridged)
4.1. ETA Architecture
4.2. Glasses Node
Object Classification Using COCO and YOLO
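The sketch below illustrates the general approach named in this heading: a YOLOv8 model pretrained on the 80 COCO classes is run over a single camera frame and the detected class labels are returned. It uses the Ultralytics Python API as a minimal, hedged example; the model file, helper function, and frame source are assumptions for illustration, not the Glasses node's exact pipeline.

```python
from ultralytics import YOLO

# Pretrained YOLOv8 nano weights, trained on the 80 COCO object classes.
model = YOLO("yolov8n.pt")

def classify_frame(frame):
    """Return (COCO class name, confidence) pairs for objects detected in one frame."""
    result = model(frame, verbose=False)[0]
    detections = []
    for box in result.boxes:
        class_id = int(box.cls[0])
        confidence = float(box.conf[0])
        detections.append((model.names[class_id], confidence))
    return detections
```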
4.3. Belt Node
4.4. Cane Node
- c is the total distance from the Cane pivot axis point to the detected obstacle (m);
- θ is the IMU-measured attitude angle relative to the vertical axis (degrees);
- b is the horizontal ground distance between the user and the detected obstacle (m).
- d is the expected distance from the Cane pivot axis point to the ground (m);
- h is the measured handle-to-ground height (m);
- θ is the IMU-measured attitude angle relative to the vertical axis (degrees).
- h_obs is the calculated height of the detected obstacle above the ground (m);
- Δd is the difference between the expected distance d and the total measured distance c, i.e., the LiDAR reading plus the handle-to-LiDAR offset (m);
- θ is the IMU-measured attitude angle relative to the vertical axis (degrees).
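Read together, these definitions imply the following relationships; the equations are reconstructed from the variable descriptions above, and the symbol names θ, d, Δd, and h_obs are assumed labels rather than the article's original notation:

$$b = c\,\sin\theta, \qquad d = \frac{h}{\cos\theta}, \qquad h_{\mathrm{obs}} = \Delta d\,\cos\theta \quad \text{with} \quad \Delta d = d - c.$$

A minimal code sketch of the same geometry, assuming the Cane reports a LiDAR range, a fixed handle-to-LiDAR offset, an IMU tilt angle, and the handle height (all function and variable names are illustrative):

```python
import math

def cane_obstacle_geometry(lidar_range_m, handle_to_lidar_m, tilt_deg, handle_height_m):
    """Estimate obstacle position and height from Cane LiDAR and IMU readings.

    lidar_range_m     : distance reported by the Cane LiDAR (m)
    handle_to_lidar_m : offset from the pivot (handle) to the LiDAR sensor (m)
    tilt_deg          : IMU attitude angle relative to the vertical axis (degrees)
    handle_height_m   : measured handle-to-ground height h (m)
    """
    theta = math.radians(tilt_deg)
    c = lidar_range_m + handle_to_lidar_m   # total pivot-to-obstacle distance
    b = c * math.sin(theta)                 # horizontal ground distance to obstacle
    d = handle_height_m / math.cos(theta)   # expected pivot-to-ground distance
    h_obs = (d - c) * math.cos(theta)       # obstacle height above the ground
    return b, d, h_obs
```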
4.5. Input Sensor Plan Design
4.6. Output Sensor Deployment
4.7. Working Prototype Summary
5. Experimentation and Results
5.1. Testing Rig
5.2. Dynamic Sensor Testing Protocol
5.3. Dynamic Results
5.4. Object Classification Protocol
5.5. Object Classification Results
5.6. Feedback Latency Testing Protocol
5.7. Latency Results
5.7.1. Glasses Node Results
5.7.2. Belt Node Results
5.7.3. Cane Node Results
6. Discussion
- Extended user trials: Evaluate long-term usability, comfort, learning curves, and user adaptation with participants who have VI.
- Sensor expansion: Incorporate complementary modalities such as radar or thermal imaging to enhance detection robustness under adverse or low-visibility conditions.
- Hardware optimisation: Miniaturise and consolidate system components to improve wearability, energy efficiency, and commercial feasibility.
- Adaptive feedback: Develop context-aware feedback mechanisms that dynamically adjust modality, intensity, and timing according to environmental conditions and user preferences.
- Mobile integration: Offload AI processing and system monitoring tasks to smartphones or cloud-based services to facilitate real-world application.
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| ETA | Electronic Travel Aid |
| VI | Visual Impairment |
| OM | Orientation and Mobility |
| FoV | Field of View |
| MCU | Microcontroller Unit |
| SBC | Single Board Computer |
| IMU | Inertial Measurement Unit |
| UT | Ultrasonic Transducer |
| LiDAR | Light Detection and Ranging |
| RGB-D | Red–Green–Blue plus Depth camera |
| CV | Coefficient of Variation |
| SD | Standard Deviation |
| MS | Mean Square (statistical measure) |
| YOLOv8 | You Only Look Once version 8 (object detection algorithm) |
| ESP32 | Espressif 32-bit microcontroller family |
| MPU6050 | Microelectromechanical Inertial Measurement Unit |
| TFMINI Pro | Time-of-Flight Miniature LiDAR sensor |
References
- World Health Organization. Blindness and Vision Impairment; WHO: Geneva, Switzerland, 2024. [Google Scholar]
- Bruce, S.M.; Vargas, C. Teaching object permanence: An action research study. J. Vis. Impair. Blind. 2013, 107, 60–64. [Google Scholar] [CrossRef]
- Long, R.; Hill, E.W. Establishing and maintaining orientation for mobility. In Foundations of Orientation and Mobility; American Foundation for the Blind: New York, NY, USA, 1997; Volume 1. [Google Scholar]
- Hall, E.T. The Hidden Dimension; Doubleday: Garden City, NY, USA, 1966; Volume 609. [Google Scholar]
- Yu, Y.; Shi, Z.; Liu, X.; Tao, X. VisiGlasses: A Smart Assistive Glasses for Visually Impaired. In Proceedings of the ACM Turing Award Celebration Conference-China 2024, Changsha, China, 5–7 July 2024; pp. 244–245. [Google Scholar] [CrossRef]
- Almajdoub, R.A.; Shiba, O.S. An Assistant System For Blind To Avoid Obstacles Using Artificial Intelligence Techniques. Int. J. Eng. Inf. Technol. (IJEIT) 2024, 12, 226–238. [Google Scholar] [CrossRef]
- Abdulkareem, S.A.; Mohammed, H.I.; Mahdi, A.A. System for Visually Disabled Through Wearables Utilizing Arduino and Ultrasound. J. La Multiapp 2024, 5, 309–321. [Google Scholar] [CrossRef]
- Na, Q.; Zhou, H.; Yuan, H.; Gui, M.; Teng, H. Improving Walking Path Generation Through Biped Constraint in Indoor Navigation System for Visually Impaired Individuals. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 1221–1232. [Google Scholar] [CrossRef] [PubMed]
- Chen, Y.; Shen, J.; Sawada, H. A wearable assistive system for the visually impaired using object detection, distance measurement and tactile presentation. Intell. Robot. 2023, 3, 420–435. [Google Scholar] [CrossRef]
- Almomani, A.; Alauthman, M.; Malkawi, A.; Shwaihet, H.; Aldigide, B.; Aldabeek, D.; Hamoodeh, K. Smart Shoes Safety System for the Blind People Based on (IoT) Technology. Comput. Mater. Contin. 2023, 76, 415–436. [Google Scholar] [CrossRef]
- Kammoun, S.; Bouaziz, R.; Saeed, F.; Qasem, N.; Al-Hadhrami, T. Haptisole: Wearable haptic system in vibrotactile guidance shoes for visually impaired wayfinding. KSII Trans. Internet Inf. Syst. 2023, 17, 3064–3082. [Google Scholar] [CrossRef]
- Bouteraa, Y. Smart real time wearable navigation support system for BVIP. Alex. Eng. J. 2023, 62, 223–235. [Google Scholar] [CrossRef]
- Katayama, D.; Ishii, K.; Yasukawa, S.; Nishida, Y.; Nakadomari, S. Fall Risk Estimation for Visually Impaired using iPhone with LiDAR. J. Robot. Netw. Artif. Life 2023, 9, 349–357. [Google Scholar] [CrossRef]
- Chaudary, B.; Pohjolainen, S.; Aziz, S.; Arhippainen, L.; Pulli, P. Teleguidance-based remote navigation assistance for visually impaired and blind people—Usability and user experience. Virtual Real. 2023, 27, 141–158. [Google Scholar] [CrossRef]
- Li, G.; Xu, J.; Li, Z.; Chen, C.; Kan, Z. Sensing and navigation of wearable assistance cognitive systems for the visually impaired. IEEE Trans. Cogn. Dev. Syst. 2022, 15, 122–133. [Google Scholar] [CrossRef]
- See, A.R.; Sasing, B.G.; Advincula, W.D. A smartphone-based mobility assistant using depth imaging for visually impaired and blind. Appl. Sci. 2022, 12, 2802. [Google Scholar] [CrossRef]
- Suman, S.; Mishra, S.; Sahoo, K.S.; Nayyar, A. Vision navigator: A smart and intelligent obstacle recognition model for visually impaired users. Mob. Inf. Syst. 2022, 2022, 9715891. [Google Scholar] [CrossRef]
- Joseph, E.C.; Chigozie, E.C.; Uche, J.I. Prototype Development of Hand Wearable RF Locator with Smart Walking Aid for the Blind. J. Eng. 2022, 19, 173–183. [Google Scholar]
- Hao, Y.; Feng, J.; Rizzo, J.; Wang, Y.; Fang, Y. Detect and Approach: Close-Range Navigation Support for People with Blindness and Low Vision. In Proceedings of the European Conference on Computer Vision, Tel Aviv, Israel, 23–27 October 2022; Springer: Cham, Switzerland, 2022; pp. 607–622. [Google Scholar] [CrossRef]
- Kilian, J.; Neugebauer, A.; Scherffig, L.; Wahl, S. The unfolding space glove: A wearable spatio-visual to haptic sensory substitution device for blind people. Sensors 2022, 22, 1859. [Google Scholar] [CrossRef] [PubMed]
- Chandankhede, P.; Kumar, A. Visually Impaired Aid using Computer Vision to read the obstacles. J. Algebr. Stat. 2022, 13, 4467–4481. [Google Scholar]
- Ali A., H.; Rao, S.U.; Ranganath, S.; Ashwin, T.S.; Reddy, G.R.M. A Google Glass Based Real-Time Scene Analysis for the Visually Impaired. IEEE Access 2021, 9, 166351–166369. [Google Scholar] [CrossRef]
- Tachiquin, R.; Velázquez, R.; Del-Valle-Soto, C.; Gutiérrez, C.A.; Carrasco, M.; De Fazio, R.; Trujillo-León, A.; Visconti, P.; Vidal-Verdú, F. Wearable urban mobility assistive device for visually impaired pedestrians using a smartphone and a tactile-foot interface. Sensors 2021, 21, 5274. [Google Scholar] [CrossRef]
- Jubril, A.M.; Samuel, S.J. A multisensor electronic traveling aid for the visually impaired. Technol. Disabil. 2021, 33, 99–107. [Google Scholar] [CrossRef]
- Barontini, F.; Catalano, M.G.; Pallottino, L.; Leporini, B.; Bianchi, M. Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach. IEEE Trans. Haptics 2021, 14, 109–122. [Google Scholar] [CrossRef]
- Yang, J.; Yang, Y.; Guo, C.; Zhang, H.; Yin, H.; Yan, W. Design and Implementation of Smart Glasses for the Blind Based on Raspberry Pi. Comput. Knowl. Technol. 2021, 17, 85–87. [Google Scholar]
- Hsieh, I.; Cheng, H.; Ke, H.; Chen, H.; Wang, W. A CNN-based wearable assistive system for visually impaired people walking outdoors. Appl. Sci. 2021, 11, 10026. [Google Scholar] [CrossRef]
- Romadhon, A.S.; Husein, A.K. Smart Stick For the Blind Using Arduino. J. Phys. Conf. Ser. 2020, 1569, 032088. [Google Scholar] [CrossRef]
- Yang, C.; Jung, J.; Kim, J. Development of Obstacle Detection Shoes for Visually Impaired People. Sens. Mater. 2020, 32, 2227–2236. [Google Scholar] [CrossRef]
- Messaoudi, M.D.; Menelas, B.J.; Mcheick, H. Autonomous smart white cane navigation system for indoor usage. Technologies 2020, 8, 37. [Google Scholar] [CrossRef]
- Hakim, H.; Fadhil, A. Navigation system for visually impaired people based on RGB-D camera and ultrasonic sensor. In Proceedings of the International Conference on Information and Communication Technology, Baghdad, Iraq, 15–16 April 2019; pp. 172–177. [Google Scholar] [CrossRef]
- Petsiuk, A.L.; Pearce, J.M. Low-cost open source ultrasound-sensing based navigational support for the visually impaired. Sensors 2019, 19, 3783. [Google Scholar] [CrossRef]
- Bai, J.; Liu, Z.; Lin, Y.; Li, Y.; Lian, S.; Liu, D. Wearable travel aid for environment perception and navigation of visually impaired people. Electronics 2019, 8, 697. [Google Scholar] [CrossRef]
- Ahmed, A.H. Design of Navigation System for Visually Impaired People. Ph.D. Thesis, Near East University, Nicosia, Cyprus, 2019. [Google Scholar]
- Kaur, B.; Bhattacharya, J. Scene perception system for visually impaired based on object detection and classification using multimodal deep convolutional neural network. J. Electron. Imaging 2019, 28, 013031. [Google Scholar] [CrossRef]
- Younis, O.; Al-Nuaimy, W.; Rowe, F.; Alomari, M.H. A smart context-aware hazard attention system to help people with peripheral vision loss. Sensors 2019, 19, 1630. [Google Scholar] [CrossRef]
- Khan, N.S.; Kundu, S.; Al Ahsan, S.; Sarker, M.; Islam, M.N. An Assistive System of Walking for Visually Impaired. In Proceedings of the 2018 International Conference on Computer, Communication, Chemical, Material and Electronic Engineering (IC4ME2), Rajshahi, Bangladesh, 8–9 February 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4. [Google Scholar] [CrossRef]
- Jafri, R.; Khan, M.M. User-centered design of a depth data based obstacle detection and avoidance system for the visually impaired. Hum.-Centric Comput. Inf. Sci. 2018, 8, 1–30. [Google Scholar] [CrossRef]
- Katzschmann, R.K.; Araki, B.; Rus, D. Safe local navigation for visually impaired users with a time-of-flight and haptic feedback device. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 583–593. [Google Scholar] [CrossRef]
- Elmannai, W.M.; Elleithy, K.M. A highly accurate and reliable data fusion framework for guiding the visually impaired. IEEE Access 2018, 6, 33029–33054. [Google Scholar] [CrossRef]
- Cardillo, E.; Di Mattia, V.; Manfredi, G.; Russo, P.; De Leo, A.; Caddemi, A.; Cerri, G. An electromagnetic sensor prototype to assist visually impaired and blind people in autonomous walking. IEEE Sens. J. 2018, 18, 2568–2576. [Google Scholar] [CrossRef]
- Kumar, M.; Kabir, F.; Roy, S. Low Cost Smart Stick for Blind and Partially Sighted People. Int. J. Adv. Eng. Manag. 2017, 2, 65–68. [Google Scholar] [CrossRef]
- Buchs, G.; Simon, N.; Maidenbaum, S.; Amedi, A. Waist-up protection for blind individuals using the EyeCane as a primary and secondary mobility aid. Restor. Neurol. Neurosci. 2017, 35, 225–235. [Google Scholar] [CrossRef] [PubMed]
- Rao, A.S.; Gubbi, J.; Palaniswami, M.; Wong, E. A Vision-based System to Detect Potholes and Uneven Surfaces for Assisting Blind People. In Proceedings of the 2016 IEEE International Conference on Communications (ICC), Kuala Lumpur, Malaysia, 22–27 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–6. [Google Scholar] [CrossRef]
- Saffoury, R.; Blank, P.; Sessner, J.; Groh, B.H.; Martindale, C.F.; Dorschky, E.; Franke, J.; Eskofier, B.M. Blind Path Obstacle Detector Using Smartphone Camera and Line Laser Emitter. In Proceedings of the 2016 1st International Conference on Technology and Innovation in Sports, Health and Wellbeing (TISHW), Vila Real, Portugal, 1–3 December 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–7. [Google Scholar] [CrossRef]
- Stoll, C.; Palluel-Germain, R.; Fristot, V.; Pellerin, D.; Alleysson, D.; Graff, C. Navigating from a depth image converted into sound. Appl. Bionics Biomech. 2015, 2015, 543492. [Google Scholar] [CrossRef]
- Gao, Y.; Chandrawanshi, R.; Nau, A.C.; Tse, Z.T.H. Wearable virtual white cane network for navigating people with visual impairment. Proc. Inst. Mech. Eng. Part H J. Eng. Med. 2015, 229, 681–688. [Google Scholar] [CrossRef]
- Kellman, P.J.; Arterberry, M.E. The Cradle of Knowledge: Development of Perception in Infancy; MIT Press: Cambridge, MA, USA, 2000. [Google Scholar]
- Fieandt, K.V.J.v.; Korkala, P.Y.; West, L.J.; Järvinen, E.J. Space Perception. Encyclopædia Britannica. 2017. Available online: https://www.britannica.com/topic/space-perception (accessed on 17 June 2024).
- Thinus-Blanc, C.; Gaunet, F. Representation of space in blind persons: Vision as a spatial sense? Psychol. Bull. 1997, 121, 20. [Google Scholar] [CrossRef]
- Hersh, M.A. Designing assistive technology to support independent travel for blind and visually impaired people. In Proceedings of the CVHI’09: Conference and Workshop on Assistive Technologies for People with Vision and Hearing Impairments, Wroclaw, Poland, 20–23 April 2009. [Google Scholar]
- Hersh, M.; Johnson, M.A. Assistive Technology for Visually Impaired and Blind People; Springer Science & Business Media: Cham, Switzerland, 2010. [Google Scholar] [CrossRef]
- Campbell, S.; O’Mahony, N.; Krpalcova, L.; Riordan, D.; Walsh, J.; Murphy, A.; Ryan, C. Foundations of Orientation and Mobility, Volume 1, History and Theory, 3rd ed.; AFB Press, American Foundation for the Blind: New York, NY, USA, 2010. [Google Scholar]
- Hollins, M. Understanding Blindness: An Integrative Approach; Lawrence Erlbaum Associates, Inc.: Mahwah, NJ, USA, 1989. [Google Scholar]
- Watson, C. The Somatosensory System. In The Mouse Nervous System; Elsevier: Amsterdam, The Netherlands, 2012; pp. 563–570. [Google Scholar]
- Buck, L.B.; Bargmann, C. Smell and Taste: The Chemical Senses. Princ. Neural Sci. 2000, 4, 625–647. [Google Scholar]
- Pasic, R.; Kuzmanov, I.; Atanasovski, K. ESP-NOW communication protocol with ESP32. J. Univers. Excell. 2021, 6, 53–60. [Google Scholar] [CrossRef]
- Postel, J. RFC 768; User Datagram Protocol; Internet Engineering Task Force: Wilmington, DE, USA, 1980. [Google Scholar] [CrossRef]
- DFRobot. ESP32-CAM Development Board, SKU: DFR0602. 2025. Available online: https://www.dfrobot.com/product-2047.html (accessed on 22 September 2025).
- Makersguides. JSN-SR04T-2.0 Ultrasonic Sensor Module. 2025. Available online: https://makersguides.com/jsn-sr04t-2-0-ultrasonic-sensor/ (accessed on 22 September 2025).
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar] [CrossRef]
- Jocher, G.; Chaurasia, A.; Stoken, A.; Borovec, J.; Kwon, Y.; Michael, K.; Fang, J.; Zeng, Y.; Wong, C.; Montes, D.; et al. ultralytics/yolov5: V7.0—YOLOv5 SOTA Realtime Instance Segmentation. Zenodo. 2022. Available online: https://zenodo.org/records/7347926 (accessed on 22 September 2025).
- Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934. [Google Scholar] [CrossRef]
- Sohan, M.; Sai Ram, T.; Reddy, R.; Venkata, C. A review on YOLOv8 and its advancements. In Proceedings of the International Conference on Data Intelligence and Cognitive Informatics, Tirunelveli, India, 27–28 June 2024; pp. 529–545. [Google Scholar] [CrossRef]
- Lin, T.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft COCO: Common Objects in Context. In Proceedings of the European Conference on Computer Vision, Zurich, Switzerland, 6–12 September 2014; pp. 740–755. [Google Scholar] [CrossRef]
- Benewake. TFmini Plus LiDAR Module. SJ-PM-TFmini. 2025. Available online: https://en.benewake.com/TFminiPlus/index.html (accessed on 23 November 2025).
- Knoblauch, R.L.; Pietrucha, M.T.; Nitzburg, M. Field Studies of Pedestrian Walking Speed and Start-Up Time. Transp. Res. Rec. J. Transp. Res. Board 1996, 1538, 27–38. [Google Scholar] [CrossRef]
- Bohannon, R.W. Comfortable and Maximum Walking Speed of Adults Aged 20–79 Years: Reference Values and Determinants. Age Ageing 1997, 26, 15–19. [Google Scholar] [CrossRef]
- Handson Technology. L298N Dual H-Bridge Motor Driver; Handson Technology: Gurgaon, India, 2025. [Google Scholar]
- Kim, Y.; Moncada-Torres, A.; Furrer, J.; Riesch, M.; Gassert, R. Quantification of long cane usage characteristics with the constant contact technique. Appl. Ergon. 2016, 55, 216–225. [Google Scholar] [CrossRef]

| Input Modality Type | Number of Sensors | Number of Distinct Input Modalities | Modality Technology Deployed | Number of Distinct Placement Regions | Placement Region(s) | ETA Count | References |
|---|---|---|---|---|---|---|---|
| Proximity | 1 | 1 | Ultrasonic | 1 | Arm | 1 | [32] |
| Proximity | 1 | 1 | LiDAR | 1 | Chest | 1 | [35] |
| Proximity | 1 | 1 | GPS | 1 | Foot | 2 | [11,23] |
| Proximity | 1 | 1 | LiDAR | 1 | Hand | 1 | [8] |
| Proximity | 1 | 1 | Ultrasonic | 1 | Hand | 3 | [18,28,30] |
| Proximity | 1 | 1 | Radar | 1 | Hand | 1 | [41] |
| Proximity | 1 | 1 | Ultrasonic | 1 | Waist | 1 | [37] |
| Image | 1 | 1 | RGBD-Camera | 1 | Head | 3 | [27,33,36] |
| Image | 1 | 1 | RGB-Camera | 1 | Head | 3 | [5,22,46] |
| Image | 1 | 1 | RGBD-Camera | 1 | Hand | 2 | [20,38] |
| Image | 1 | 1 | RGB-Camera | 1 | Hand | 3 | [21,44,45] |
| Image | 1 | 1 | RGB-Stereo Camera | 1 | Chest | 2 | [9,40] |
| Image | 1 | 1 | RGBD-Camera | 1 | Chest | 2 | [25,26] |
| Image | 1 | 1 | RGB-Camera | 1 | Chest | 4 | [13,14,16,19] |
| Proximity | 2 | 1 | Infrared | 1 | Hand | 1 | [43] |
| Proximity | 2 | 1 | Infrared | 1 | Foot | 1 | [29] |
| Proximity | 2 | 1 | Ultrasonic | 2 | Hand, Foot | 1 | [17] |
| Proximity | 2 | 1 | Ultrasonic | 2 | Head, Hand | 1 | [34] |
| Proximity | 2 | 2 | Ultrasonic, Infrared | 1 | Hand | 1 | [42] |
| Image, Proximity | 2 | 2 | RGBD-Camera, Ultrasonic | 2 | Head, Leg | 1 | [15] |
| Image, Proximity | 2 | 2 | RGBD-Camera, Ultrasonic | 2 | Hand, Foot | 1 | [17] |
| Image, Proximity | 2 | 2 | RGB-Camera, Ultrasonic | 1 | Head | 1 | [31] |
| Image, Proximity | 2 | 2 | RGB-Camera, Ultrasonic | 1 | Waist | 1 | [31] |
| Proximity | 3 | 1 | Ultrasonic | 1 | Foot | 1 | [10] |
| Proximity | 3 | 2 | Ultrasonic, Infrared | 1 | Head, Hand | 1 | [7] |
| Proximity | 4 | 1 | Ultrasonic | 2 | Head, Hand | 1 | [12] |
| Proximity | 4 | 1 | Ultrasonic | 3 | Waist, Hand, Ankle | 1 | [47] |
| Image, Proximity | 4 | 2 | RGB-Camera, Ultrasonic | 2 | Head, Hand | 1 | [6] |
| Proximity | 7 | 1 | LiDAR | 1 | Waist | 1 | [39] |
| Processing Hardware | Processing Hardware Type | ETA Count | References |
|---|---|---|---|
| Arduino Uno | MCU | 9 | [6,7,17,18,28,34,37,41,47] |
| Arduino Nano | MCU | 2 | [7,32] |
| ESP32 | MCU | 1 | [30] |
| Mbed LPC1768 | MCU | 1 | [39] |
| Infineon XMC4500 | MCU | 1 | [41] |
| Bespoke MCU Device | MCU | 4 | [23,40,43,45] |
| Raspberry Pi 4 | SBC | 4 | [12,20,21,26] |
| Raspberry Pi 3 | SBC | 3 | [24,30,31] |
| Raspberry Pi Zero | SBC | 1 | [9] |
| NVIDIA Jetson Xavier/Nano | SBC | 3 | [8,19,27] |
| Odroid XU4 | SBC | 1 | [35] |
| NVIDIA Xavier NX | SBC | 1 | [8] |
| Raspberry Pi (unspecified) | SBC | 2 | [9,17] |
| Linux Computer | Computer/Laptop | 1 | [44] |
| ASUS Laptop/UX310U | Computer/Laptop | 2 | [25,46] |
| MacBook Laptop | Computer/Laptop | 1 | [36] |
| Google Tango Tablet | Computer/Laptop | 1 | [38] |
| Samsung Galaxy (A80, S5, S9) | Phone | 3 | [16,23,45] |
| OnePlus/Android Phones | Phone | 3 | [14,22,33] |
| iPhone 12 Pro | Phone | 1 | [13] |
| Not mentioned (Android Mobile Device) | Phone | 5 | [11,14,29,33,42] |
| INMO Air2 | Commercial Glasses | 1 | [5] |
| Moverio BT-200 | Commercial Glasses | 1 | [36] |
| Google Glasses | Commercial Glasses | 1 | [22] |
| Modality Type | Feedback Type | ETA Count | References |
|---|---|---|---|
| Auditory | Non-speech | 8 | [19,21,27,30,33,34,35,40] |
| Auditory | Speech | 15 | [5,6,10,11,13,18,20,22,23,24,26,29,31,32,38] |
| Tactus | Pressure & Vibration | 2 | [9,43] |
| Tactus | Vibration | 5 | [8,11,20,23,44] |
| Tactus + Auditory | Force Feedback & Speech | 1 | [8] |
| Tactus + Auditory | Speech & Non-speech | 2 | [14,25] |
| Tactus + Auditory | Vibration & Non-speech | 5 | [12,14,16,41,47] |
| Tactus + Auditory | Vibration & Speech | 5 | [15,28,39,42,46] |
| Node | Sensor | Detection Range (m) | Rig Starting Distance (m) | Accuracy Band [Min, Max] (m) |
|---|---|---|---|---|
| Belt | LiDAR 1 & 2 (Personal Space) | 1.0 | 2.0 | [0.9, 1.1] |
| Glasses | Image (Near Space) | 2.0 | 3.0 | [1.8, 2.2] |
| Glasses | Ultrasonic (Near Space) | 3.0 | 4.0 | [2.7, 3.3] |
| Cane | LiDAR 1 (Near Space) | 3.0 | 4.0 | [2.7, 3.3] |
| Glasses | Image (Far Space) | 4.0 | 5.0 | [3.6, 4.4] |
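The accuracy bands in the table above are ±10% of each sensor's nominal detection range. A minimal sketch of that check, using illustrative values taken from the Belt row (the function and variable names are assumptions):

```python
def within_accuracy_band(measured_m, detection_range_m, tolerance=0.10):
    """True if a measured trigger distance lies within ±tolerance of the nominal range."""
    lower = detection_range_m * (1.0 - tolerance)
    upper = detection_range_m * (1.0 + tolerance)
    return lower <= measured_m <= upper

# Belt node LiDAR (Personal Space): nominal 1.0 m range, accuracy band [0.9, 1.1] m.
print(within_accuracy_band(0.95, 1.0))  # True
print(within_accuracy_band(1.15, 1.0))  # False
```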
| Reference | Key Features | Deployed Body Region(s) | FoV: Forward | FoV: Rear | Coverage: Head | Coverage: Torso | Coverage: Legs |
|---|---|---|---|---|---|---|---|
| This Study | Multi-sensor, multi-modal tech., multi-region (LiDAR, Ultrasonic, Vision) | Head, Waist, Hand | ✓ | ✓ | ✓ | ✓ | ✓ |
| [15] | Multi-sensor, multi-modal tech., multi-region (RGBD, Ultrasonic) | Head, Leg | ✓ | ✓ | ✓ | ||
| [12] | Multi-sensor, multi-region (Ultrasonic) | Head, Hand | ✓ | ✓ | ✓ | ||
| [39] | Multi-sensor (LiDAR) | Waist | ✓ | ✓ | ✓ | ✓ | |
| [47] | Multi-region (Ultrasonic) | Waist, Hand, Ankle | ✓ | ✓ | ✓ | ||
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).