Assistive Devices: Technology Development for the Visually Impaired
Abstract
1. Introduction
- Ultraviolet (UV) radiation monitoring and alerts module;
- Geolocation module—GPS (used for device tracking);
- Obstacle detection module;
- Fall detection and warning module (designed for DP and elderly people);
- Object detection module in controlled environments;
- Mobile and web interfaces.
Literature Survey
2. Materials and Methods
2.1. Smart Cane
- Obstacle detection: consists of ultrasonic sensors that detect obstacles at distances of up to 3 m. When an obstacle is detected, an alert is transmitted through a vibrotactile system (a vibrating bracelet) connected to the stick's sleeve. Notably, the closer the DP is to the obstacle, the stronger the vibration of the bracelet's motor.
- Geolocation: the module embedded in the stick runs continuously once its operations are initialized. After initialization, the module takes an average of 20 s to obtain the DP's location coordinates. Once obtained, the coordinates are sent to the MQTT (Message Queuing Telemetry Transport) broker as a message containing the stick and user IDs. In the initial experiments, the module was programmed to transmit every 10 s; however, this parameter can be changed at any time. In addition, the information is stored in an SQLite database so that the DP's displacement can later be represented at the interface level.
- Navigation: performed by detecting and recognizing colored lines (red, green, blue, and black) that signal the different paths followed by the DP. When a color is detected, the DP receives an audio alert and decides which course (color) to follow. Colored lines are widely used in public and private offices, health centers, and other facilities. In countries such as Peru and Brazil, this signaling is regulated by law.
- Alert generation: the detection and navigation information is translated into real-time audio.
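As a rough illustration of the obstacle-detection behavior above, the mapping from measured distance to vibration strength can be sketched as follows; the linear ramp and the function names are assumptions for illustration, not the authors' firmware:

```python
# Hypothetical sketch: vibration grows as the obstacle gets closer (linear ramp assumed).

MAX_RANGE_CM = 300  # obstacles are detected up to 3 m away

def vibration_intensity(distance_cm: float) -> float:
    """Return a vibration duty cycle in [0, 1]: stronger as the obstacle gets closer."""
    if distance_cm >= MAX_RANGE_CM:
        return 0.0  # out of range: no alert
    if distance_cm <= 0:
        return 1.0  # at contact: full vibration
    # Linear ramp between full vibration at contact and none at the 3 m limit.
    return 1.0 - (distance_cm / MAX_RANGE_CM)

print(vibration_intensity(300))  # 0.0
print(vibration_intensity(150))  # 0.5
```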
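The geolocation message flow can be sketched as below. The JSON field names, the SQLite schema, and the example coordinates are illustrative assumptions; the text does not specify the broker payload format beyond it containing the stick and user IDs:

```python
# Sketch of packaging GPS coordinates for the MQTT broker and logging them locally
# in SQLite so the DP's displacement can be replayed at the interface level.
import json
import sqlite3
import time

def build_location_message(stick_id, user_id, lat, lon):
    """Package coordinates with the stick and user IDs, as sent to the MQTT broker."""
    return json.dumps({
        "stick_id": stick_id,
        "user_id": user_id,
        "lat": lat,
        "lon": lon,
        "ts": int(time.time()),
    })

def log_location(db, message):
    """Store the transmitted message locally for later displacement playback."""
    db.execute("CREATE TABLE IF NOT EXISTS locations (payload TEXT)")
    db.execute("INSERT INTO locations (payload) VALUES (?)", (message,))
    db.commit()

db = sqlite3.connect(":memory:")
msg = build_location_message("stick-01", "dp-01", -16.4090, -71.5375)
log_location(db, msg)
print(db.execute("SELECT COUNT(*) FROM locations").fetchone()[0])  # 1
```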
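The color-line recognition step amounts to classifying a sensed RGB reading as one of the four signal colors. A minimal nearest-color sketch, where the reference values and the distance rule are assumptions rather than the device's actual algorithm:

```python
# Assign a sensed RGB reading to the nearest of the four signal colors
# (squared Euclidean distance in RGB space; reference values are idealized).

REFERENCE_COLORS = {
    "red":   (255, 0, 0),
    "green": (0, 255, 0),
    "blue":  (0, 0, 255),
    "black": (0, 0, 0),
}

def classify_line(r, g, b):
    """Return the name of the reference color closest to the reading."""
    def dist(ref):
        rr, gg, bb = ref
        return (r - rr) ** 2 + (g - gg) ** 2 + (b - bb) ** 2
    return min(REFERENCE_COLORS, key=lambda name: dist(REFERENCE_COLORS[name]))

print(classify_line(200, 30, 40))  # red
print(classify_line(10, 20, 15))   # black
```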
2.2. Smart Glove
- Interface module: consists of an RFID tag that acts as an interface with a device known as the smart module, whose function is to present information about the location (room, laboratory, building, etc.) to the DP. The device was installed at the access points (doors and areas near the doors) of the electronic engineering building and at the laboratories' entrances. In its first version, the module performs a straightforward function: it stores, in an SQLite database, audio information about objects and people registered within the environment. This information can then be accessed by the DP using an RFID tag integrated with the SG (Figure 3). When DPs bring the tag close to the smart module, it presents a menu with audio information about the objects and people registered in the environment. The expectation is that the smart module will be automatically powered and updated with information about when people enter and leave the environment and when objects are placed or removed; therefore, the database (DB) must be updated in real time.
- Identification module: consists of an infrared (IR) circuit designed to operate as a receiver, IRrx (glove), and a transmitter, IRtx (object). The IR circuit is connected to a microcontroller that runs the algorithm for detecting and translating the object's information. Operationally, the IRrx on the glove communicates with the IRtx on the object when users place their hands on these objects. The IRtx sends its identification (object code); the IRrx searches for it in its internal database and translates the object's information into audio to be heard by the DP (Figure 4).
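The IRrx lookup step above can be sketched as a simple code-to-phrase table that feeds the audio output; the object codes and descriptions below are purely illustrative:

```python
# Hypothetical object database for the glove's IRrx: the code received from the
# object's IRtx is looked up and translated into a phrase for audio playback.

OBJECT_DB = {
    "0x1A": "coffee mug, handle on the right",
    "0x2B": "laboratory door, pull to open",
}

def translate_object(code):
    """Return the audio phrase for the received object code, or a fallback."""
    return OBJECT_DB.get(code, "unknown object")

print(translate_object("0x1A"))  # coffee mug, handle on the right
print(translate_object("0xFF"))  # unknown object
```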
2.3. Smart Cap
- Falls control module: designed to identify whether DPs have suffered a fall, especially a sudden fall. The module uses a LilyPad ADXL 335 accelerometer, which is intended for wearable applications. The sensor has one output terminal per measurement axis, and the tilt along each axis determines the voltage delivered at the corresponding terminal. When the module identifies a fall, it sends a message through the GPRS module reporting the fall's location and intensity. This message is forwarded, in real time, to previously registered friends or family members.
- Detection module: consists of HC-SR04 ultrasonic sensors and a vibrotactile system built around a LilyPad vibrating micromotor, which issues alerts as objects are identified. The alert intensity increases as the DP approaches the obstacle. The module detects objects and obstacles up to 3 m away and issues a warning from 1.5 m (Table 2). The device acts as a complement to the Smart Cane.
- UV module: designed around the ML8511 UV sensor integrated with an Arduino Nano. The sensor allows DPs to receive information about radiation levels, and a buzzer emits an alert instructing the user to apply sun protection products.
- Geolocation module: consists of an A9G (GSM/GPRS/GPS) module embedded in the device, paired with a web monitoring environment that, in its first version, uses the Google Maps API. The module does not work as a navigation assistant but as a device for locating DPs inside and outside the university premises.
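A common way to detect a sudden fall from three-axis accelerometer data, as in the falls control module above, is to look for a free-fall dip followed by an impact spike in the acceleration magnitude. The sketch below illustrates this idea with readings already converted to g; the thresholds are assumptions, not the authors' calibration:

```python
# Simplified sudden-fall detection: near-zero magnitude (free fall) followed by a
# large magnitude (impact). Samples are (x, y, z) accelerations in g.
import math

FREE_FALL_G = 0.4  # magnitude well below 1 g suggests free fall (assumed threshold)
IMPACT_G = 2.5     # magnitude well above 1 g suggests impact (assumed threshold)

def detect_fall(samples):
    """Flag a fall when a free-fall sample is later followed by an impact sample."""
    free_fall_seen = False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag < FREE_FALL_G:
            free_fall_seen = True
        elif free_fall_seen and mag > IMPACT_G:
            return True
    return False

# Standing still (~1 g), then a drop (~0.25 g), then an impact spike (~2.8 g).
print(detect_fall([(0, 0, 1.0), (0.1, 0.1, 0.2), (1.8, 1.5, 1.6)]))  # True
print(detect_fall([(0, 0, 1.0), (0, 0.1, 0.98)]))                    # False
```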
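The detection module's ranging and warning logic can be sketched from the HC-SR04's standard echo-time conversion (half the round-trip time multiplied by the speed of sound); the function names are illustrative, while the 3 m detection range and 1.5 m warning distance follow the text:

```python
# HC-SR04 ranging sketch: convert the echo pulse width to distance, then apply
# the warning threshold described for the Smart Cap's detection module.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s in air at ~20 degrees C
DETECT_CM = 300  # objects detected up to 3 m away
WARN_CM = 150    # vibrotactile warning issued from 1.5 m

def echo_to_cm(echo_us):
    """The echo pulse covers the round trip, so halve it before converting."""
    return (echo_us * SPEED_OF_SOUND_CM_PER_US) / 2

def should_warn(echo_us):
    """True when the measured distance is within the warning threshold."""
    return echo_to_cm(echo_us) <= WARN_CM

print(round(echo_to_cm(5831)))  # 100
print(should_warn(5831))        # True
```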
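The UV module's alert rule reduces to a threshold on the measured UV index. The WHO recommends sun protection from UV index 3 upward; the exact threshold used by the firmware is an assumption in the sketch below:

```python
# Illustrative UV alert rule: the buzzer advises sun protection above a threshold.

UV_ALERT_THRESHOLD = 3  # WHO advises protection from UV index 3 (assumed firmware value)

def uv_alert(uv_index):
    """Return True when the buzzer should advise applying sun protection."""
    return uv_index >= UV_ALERT_THRESHOLD

print(uv_alert(8))  # True
print(uv_alert(2))  # False
```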
3. Results
3.1. Hardware Design
3.1.1. Smart Cane—Architecture
3.1.2. Smart Glove—Architecture
3.1.3. Smart Cap—Architecture
3.2. Experiments and Functionality Testing
3.2.1. Smart Cane—Functions
3.2.2. Smart Glove—Functions
3.2.3. Smart Cap—Functions
3.2.4. Usability Experiments
- (a) Number of tasks that can be performed by the device;
- (b) Function learning time;
- (c) Percentage of tasks completed on the first attempt;
- (d) Number of assistance requests;
- (e) Time spent on first use attempt;
- (f) Level of satisfaction;
- (g) Level of expectations.
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. The Connection Diagram and the Graphical Mapping of the Components
References
- Hussein, A.I. Wearable computing: Challenges of implementation and its future. In Proceedings of the 2015 12th Learning and Technology Conference, Jeddah, Saudi Arabia, 12–13 April 2015; pp. 14–19.
- Nižetić, S.; Šolić, P.; González-De-Artaza, D.L.-D.; Patrono, L. Internet of Things (IoT): Opportunities, issues and challenges towards a smart and sustainable future. J. Clean. Prod. 2020, 274, 122877.
- Borges, L.M.; Rente, A.; Velez, F.J.; Salvado, L.R.; Lebres, A.S.; Oliveira, J.M.; Araujo, P.; Ferro, J. Overview of progress in Smart-Clothing project for health monitoring and sport applications. In Proceedings of the 2008 First International Symposium on Applied Sciences on Biomedical and Communication Technologies, Aalborg, Denmark, 25–28 October 2008; pp. 1–6.
- Singha, K.; Kumar, J.; Pandit, P. Recent Advancements in Wearable & Smart Textiles: An Overview. Mater. Today Proc. 2019, 16, 1518–1523.
- Shahraki, A.A. Urban planning for physically disabled people’s needs with case studies. Spat. Inf. Res. 2021, 29, 173–184.
- Yilmaz, M. Public Space and Accessibility. ICONARP Int. J. Archit. Plan. 2018, 6, 1–14.
- Poldma, T.; Labbe, D.; Bertin, S.; De Grosbois, È.; Barile, M.; Mazurik, K.; Desjardins, M.; Herbane, H.; Artis, G. Understanding people’s needs in a commercial public space: About accessibility and lived experience in social settings. Alter 2014, 8, 206–216.
- Lau, B.P.L.; Marakkalage, S.H.; Zhou, Y.; Hassan, N.U.; Yuen, C.; Zhang, M.; Tan, U.-X. A survey of data fusion in smart city applications. Inf. Fusion 2019, 52, 357–374.
- Mehta, U.; Alim, M.; Kumar, S. Smart Path Guidance Mobile Aid for Visually Disabled Persons. Procedia Comput. Sci. 2017, 105, 52–56.
- Manjari, K.; Verma, M.; Singal, G. A survey on Assistive Technology for visually impaired. Internet Things 2020, 11, 100188.
- Mekhalfi, M.L.; Melgani, F.; Zeggada, A.; De Natale, F.G.; Salem, M.A.-M.; Khamis, A. Recovering the sight to blind people in indoor environments with smart technologies. Expert Syst. Appl. 2016, 46, 129–138.
- Lee, C.-W.; Chondro, P.; Ruan, S.-J.; Christen, O.; Naroska, E. Improving mobility for the visually impaired: A wearable indoor positioning system based on visual markers. IEEE Consum. Electron. Mag. 2018, 7, 12–20.
- Martinez-Sala, A.S.; Losilla, F.; Sánchez-Aarnoutse, J.C.; García-Haro, J. Design, implementation and evaluation of an indoor navigation system for visually impaired people. Sensors 2015, 15, 32168–32187.
- Dim, N.K.; Kim, K.; Ren, X. Designing motion marking menus for people with visual impairments. Int. J. Hum. Comput. Stud. 2018, 109, 79–88.
- Bai, J.; Liu, D.; Su, G.; Fu, Z. A cloud and vision-based navigation system used for blind people. In Proceedings of the 2017 International Conference on Artificial Intelligence, Automation and Control Technologies (AIACT ‘17), Wuhan, China, 7–9 April 2017; p. 22.
- Katzschmann, R.; Araki, B.; Rus, D. Safe local navigation for visually impaired users with a time-of-flight and haptic feedback device. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 583–593.
- Götzelmann, T. Lucentmaps: 3D printed audiovisual tactile maps for blind and visually impaired people. In Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ‘16), Reno, NV, USA, 24–26 October 2016; pp. 81–90.
- Chung, I.Y.; Kim, S.; Rhee, K.H. The smart cane utilizing a smart phone for the visually impaired person. In Proceedings of the 2014 IEEE 3rd Global Conference on Consumer Electronics (GCCE), Tokyo, Japan, 7–10 October 2014; pp. 106–107.
- Subbiah, S.; Ramya, S.; Krishna, G.P.; Nayagam, S. Smart Cane for Visually Impaired Based On IOT. In Proceedings of the 2019 3rd International Conference on Computing and Communications Technologies (ICCCT), Chennai, India, 21–22 February 2019; pp. 50–53.
- Salat, S.; Habib, M.A. Smart Electronic Cane for the Assistance of Visually Impaired People. In Proceedings of the 2019 IEEE International WIE Conference on Electrical and Computer Engineering (WIECON-ECE), Bangalore, India, 15–16 November 2019; pp. 1–4.
- Zhangaskanov, D.; Zhumatay, N.; Ali, M.H. Audio-based Smart White Cane for Visually Impaired People. In Proceedings of the 2019 5th International Conference on Control, Automation and Robotics (ICCAR), Beijing, China, 19–22 April 2019; pp. 889–893.
- Hapsari, G.I.; Mutiara, G.A.; Kusumah, D.T. Smart cane location guide for blind using GPS. In Proceedings of the 2017 5th International Conference on Information and Communication Technology (ICoIC7), Melaka, Malaysia, 17–19 May 2017; pp. 1–6.
- Nandini, A.V.; Dwivedi, A.; Kumar, N.A.; Ashwin, T.S.; Vishnuvardhan, V.; Guddeti, R.M.R. Smart Cane for Assisting Visually Impaired People. In Proceedings of the TENCON 2019-2019 IEEE Region 10 Conference (TENCON), Kochi, India, 17–20 October 2019; pp. 546–551.
- Rahman, A.; Malia, K.F.N.; Mia, M.M.; Shuvo, A.S.M.M.H.; Nahid, M.H.; Zayeem, A.T.M.M. An Efficient Smart Cane Based Navigation System for Visually Impaired People. In Proceedings of the 2019 International Symposium on Advanced Electrical and Communication Technologies (ISAECT), Rome, Italy, 27–29 November 2019; pp. 1–6.
- Saaid, M.F.; Mohammad, A.M.; Ali, M.S.A.M. Smart cane with range notification for blind people. In Proceedings of the 2016 IEEE International Conference on Automatic Control and Intelligent Systems (I2CACIS), Selangor, Malaysia, 22 October 2016; pp. 225–229.
- Sharma, T.; Nalwa, T.; Choudhury, T.; Satapathy, S.C.; Kumar, P. Smart Cane: Better Walking Experience for Blind People. In Proceedings of the 2017 3rd International Conference on Computational Intelligence and Networks (CINE), Odisha, India, 28 October 2017; pp. 22–26.
- SathyaNarayanan, E.; Gokul, D.D.; Nithin, B.P.; Vidhyasagar, P. IoT based smart walking cane for typhlotic with voice assistance. In Proceedings of the 2016 Online International Conference on Green Engineering and Technologies (IC-GET), Coimbatore, India, 19 November 2016; pp. 1–6.
- Murali, S.; Shrivatsan, R.; Sreenivas, V.; Vijjappu, S.; Gladwin, S.J.; Rajavel, R. Smart walking cane for the visually challenged. In Proceedings of the 2016 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), Agra, India, 21–23 December 2016; pp. 1–4.
- Ashraf, M.M.; Hasan, N.; Lewis, L.; Hasan, M.R.; Ray, P. A systematic literature review of the application of information communication technology for visually impaired people. Int. J. Disabil. Manag. 2016, 11, 1–18.
- Jafri, R.; Campos, R.L.; Ali, S.A.; Arabnia, H.R. Visual and infrared sensor data-based obstacle detection for the visually impaired using the google project tango tablet development kit and the unity engine. IEEE Access 2018, 6, 443–454.
- Spoladore, D.; Arlati, S.; Carciotti, S.; Nolich, M.; Sacco, M. RoomFort: An Ontology-Based Comfort Management Application for Hotels. Electronics 2018, 7, 345.
- Assistive Technology: Definition and Safe Use. Available online: https://www.gov.uk/government/publications/assistive-technology-definition-and-safe-use/assistive-technology-definition-and-safe-use (accessed on 6 June 2021).
- Rebernik, N.; Szajczyk, M.; Bahillo, A.; Goličnik Marušić, B. Measuring Disability Inclusion Performance in Cities Using Disability Inclusion Evaluation Tool (DIETool). Sustainability 2020, 12, 1378.
- Rendulich, J.; Beingolea, J.R.; Zegarra, M.; Vizcarra, I.G.G.; Kofuji, S.T. An IoT Environment for the Development of Assistive Applications in Smart Cities. In Proceedings of the 2019 IEEE 1st Sustainable Cities Latin America Conference (SCLA), Arequipa, Peru, 26–29 August 2019; pp. 1–4.
- Puli, L.; Layton, N.; Mont, D.; Shae, K.; Calvo, I.; Hill, K.D.; Callaway, L.; Tebbutt, E.; Manlapaz, A.; Groenewegen, I.; et al. Assistive Technology Provider Experiences during the COVID-19 Pandemic. Int. J. Environ. Res. Public Health 2021, 18, 10477.
Disability | Application | Sensors | Function | Platform | Technology | Reference |
---|---|---|---|---|---|---|
Visual | Object Identification | Accelerometer, Gyroscope | Object recognition and navigation (Indoor) | Not Specified | Wearable | [11]
Visual | Object Identification | Camera module, Ultrasonic | Identification of objects using markers (Indoor) | Embedded | Wearable | [12]
Visual | Navigation (Indoor) | GPS (Smartphone) | Indoor navigation | Mobile | Machine Learning | [13] |
Visual | Menu Selection | Accelerometer (Smartphone) | Menu selection in mobile device applications | Mobile | Motion Marking Menus | [14] |
Visual | Navigation | Camera module/Microphone, Speaker (Smartphone) | Object navigation, recognition and detection | Cloud and Mobile | Cloud (micro-services) and Image Recognition | [15] |
Visual | Navigation | Ultrasonic, vibration motors, IR Sensor | Navigation in indoor environments | Not Specified | Wearable | [16] |
Visual | Object Identification | 3D Printing | Tactile maps for object identification | Not Specified | 3D Printing | [17]
Visual | Navigation (Smart Cane) | Ultrasonic, Gyroscope and Smartphone | Smart Cane guided navigation integrated with a mobile API | Embedded and Mobile | IoT | [18] |
Visual | Navigation (Smart Cane) | IR Sensor, GPS, Ultrasonic, Camera module, buzzer, Raspberry Pi | Navigation, identification of obstacles, objects and events | Embedded | IoT and Image Recognition | [19]
Visual | Navigation (Smart Cane) | Ultrasonic, Speaker, Camera module, Raspberry Pi | Navigation, obstacle identification and voice alerts | Embedded | IoT | [20]
Visual | Navigation (Smart Cane) | Ultrasonic, Speaker, vibration motors, GSM | Navigation, obstacle identification, voice alerts and GSM | Embedded | IoT | [21] |
Visual | Navigation (Smart Cane) | GPS and Speaker | Navigation, and buildings location | Embedded | IoT | [22] |
Visual | Navigation (Smart Cane) | Ultrasonic, Humidity sensor, LDR sensor | Navigation, obstacle identification | Embedded | IoT and Image Recognition | [23]
Visual | Navigation (Smart Cane) | Laser, Camera module, GPS, speaker | Navigation, obstacle identification | Embedded | IoT | [24] |
Qty. | Description | Qty. | Description |
---|---|---|---|
01 | 18650 lithium battery | 01 | Mini vibration motor
01 | TP4056 battery charger module | 01 | A9G Module (GSM/GPRS/GPS) |
01 | MT3608 Step-Up Adjustable DC-DC Switching Boost Converter | 01 | ESP32 Module |
01 | HC-SR04 ultrasonic sensor | 01 | Power Button for A9G |
01 | MPU-6050 sensor module | 01 | Speaker (8W) |
01 | PAM8403 Audio Amplifier Module | 01 | Push Button Switch |
01 | APDS-9960 Module Sensor (RGBC) | 01 | Led Power supply |
Qty. | Receiver Module Components | Qty. | Transmitter Module Components |
---|---|---|---|
01 | PAM8403 audio amplifier | 01 | Arduino Nano
01 | LM339 quad analog comparator | 01 | Power LED
01 | Arduino Nano | 01 | TCRT5000 infrared emitting LED
01 | Power LED | 01 | Li-ion 18650 battery
01 | TCRT5000 phototransistor | 01 | TP4056 battery charger
01 | Cell phone Li-ion battery | 01 | Charge/discharge switch
01 | Charge/discharge switch | 01 | MT3608 DC-DC converter module
01 | TP4056 battery charger | |
01 | MT3608 DC-DC converter module | |
Qty. | Description | Qty. | Description |
---|---|---|---|
01 | 18650 lithium battery | 01 | Buzzer
01 | GUVA S12—UV sensor | 01 | A9G Module (GSM/GPRS/GPS) |
01 | ADXL 335—Accelerometer | 01 | Power button for the A9G |
01 | Start/load switch | 01 | LED power supply |
01 | Arduino Nano | 01 | MT3608 |
01 | TP4056 (Li-Ion Battery Charger) | 01 | HC-SR04 ultrasonic sensor |
01 | Buzzer | 01 | Mini vibration motor |
Summary Table: Effectiveness (%) by Distance (cm)

Color Line | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
---|---|---|---|---|---|---|---|---|---
Red | 16.67 | 53.33 | 86.67 | 96.67 | 100.00 | 96.67 | 93.33 | 83.33 | 80.00 |
Green | 6.67 | 50.00 | 83.33 | 100.00 | 100.00 | 93.33 | 93.33 | 86.67 | 83.33 |
Blue | 10.00 | 36.67 | 83.33 | 96.67 | 100.00 | 100.00 | 93.33 | 86.67 | 76.67 |
Black | 6.67 | 30.00 | 80.00 | 93.33 | 100.00 | 100.00 | 90.00 | 86.67 | 73.33 |
Object Distance (cm) | Measured Distance (cm) | Error (%)
---|---|---|
50 | 49.8 | 0.40 |
100 | 101.50 | 1.50 |
150 | 152.30 | 1.53 |
200 | 203.20 | 1.60 |
250 | 249.70 | 0.12 |
300 | 307.00 | 2.33 |
350 | 369.00 | 5.43 |
400 | 320.00 | 20.00 |
450 | 325.00 | 27.78 |
D (cm) | T 1 (s) | T 2 (s) | T 3 (s) | T 4 (s) | T 5 (s) | T 6 (s) | T 7 (s) | T 8 (s) | T 9 (s) | T 10 (s) | Average (s)
---|---|---|---|---|---|---|---|---|---|---|---|
1 | 1.4 | 2.8 | 2.32 | 2.22 | 2.72 | 1.68 | 2.6 | 2.2 | 1.68 | 2.85 | 2.247 |
2 | 1.88 | 2.13 | 2.28 | 2.55 | 2.83 | 2.28 | 3.01 | 1.95 | 2.12 | 1.98 | 2.301 |
3 | 2.35 | 2.18 | 1.82 | 2.08 | 2.48 | 2.08 | 2.78 | 2.58 | 2.47 | 1.64 | 2.246 |
4 | 1.95 | 2.26 | 2.1 | 1.52 | 2.55 | 2.15 | 2.18 | 2.52 | 2.78 | 2.68 | 2.269 |
5 | 3.02 | 2.58 | 1.95 | 2.65 | 2.68 | 2.24 | 2.9 | 2.93 | 2.6 | 2.8 | 2.635 |
6 | 2.48 | 3.5 | 3.12 | 3.08 | 3.1 | 2.57 | 2.8 | 2.95 | 2.2 | 2.5 | 2.83 |
7 | 2.37 | 1.99 | 2.31 | 3.21 | 2.55 | 2.95 | 2.25 | 2.68 | 2.81 | 2.88 | 2.6 |
8 | 2.27 | 3.12 | 2.67 | 3.13 | 1.85 | 2.88 | 2.58 | 3.14 | 2.05 | 2.82 | 2.651 |
9 | 1.97 | 2.18 | 2.37 | 2.86 | 3.63 | 2.4 | 3.2 | 3.22 | 2.92 | 2.58 | 2.733 |
10 | 2.52 | 2.49 | 2.23 | 2.42 | 4.23 | 3.72 | 2.82 | 2.33 | 2.42 | 2.6 | 2.778 |
UV MAX (Expected Value: 12)

N° | Index | N° | Index
---|---|---|---
1 | 8 | 11 | 8 |
2 | 8 | 12 | 7 |
3 | 7 | 13 | 9 |
4 | 8 | 14 | 8 |
5 | 9 | 15 | 9 |
6 | 9 | 16 | 9 |
7 | 8 | 17 | 8 |
8 | 8 | 18 | 9 |
9 | 8 | 19 | 8 |
10 | 8 | 20 | 8 |
Average | 8.2 |
Disabled Person | Devices | A | B | C | D | E | F | G |
---|---|---|---|---|---|---|---|---|
DP-1 | Smart Cane | 3 | 5 min | 100% | 2 | 30 min | 100% | High |
Smart Cap | 4 | 10 min | 90% | 1 | 30 min | 100% | High | |
Smart Glove | 1 | 5 min | 100% | 0 | 20 min | 80% | Low | |
DP-2 | Smart Cane | 3 | 10 min | 80% | 4 | 30 min | 100% | High |
Smart Cap | 4 | 10 min | 80% | 2 | 30 min | 90% | Low | |
Smart Glove | 1 | 10 min | 100% | 0 | 20 min | 100% | High |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Beingolea, J.R.; Zea-Vargas, M.A.; Huallpa, R.; Vilca, X.; Bolivar, R.; Rendulich, J. Assistive Devices: Technology Development for the Visually Impaired. Designs 2021, 5, 75. https://doi.org/10.3390/designs5040075