Review

Advancements in Smart Wearable Mobility Aids for Visual Impairments: A Bibliometric Narrative Review

Department of Industrial Design, Guangdong University of Technology, Guangzhou 510006, China
* Author to whom correspondence should be addressed.
Sensors 2024, 24(24), 7986; https://doi.org/10.3390/s24247986
Submission received: 19 September 2024 / Revised: 10 December 2024 / Accepted: 12 December 2024 / Published: 14 December 2024
(This article belongs to the Section Wearables)

Abstract

Research into new solutions for wearable assistive devices for the visually impaired is an important area of assistive technology (AT). It plays a crucial role in improving the functionality and independence of the visually impaired, helping them to participate fully in their daily lives and in various community activities. This study presents a bibliometric analysis, conducted using CiteSpace, of the literature published over the last decade on wearable assistive devices for the visually impaired, retrieved from the Web of Science Core Collection (WoSCC), to provide an overview of the current state of research, trends, and hotspots in the field. The narrative focuses on prominent recent innovations in wearable assistive devices for the visually impaired based on sensory substitution technology, describing the latest achievements in haptic and auditory feedback devices, the application of smart materials, and the growing concern about the conflicting interests of individuals and societal needs. It also summarises the current opportunities and challenges facing the field and discusses the following insights and trends: (1) optimizing the transmission of haptic and auditory information while multitasking; (2) advancing research on smart materials and fostering cross-disciplinary collaboration among experts; and (3) balancing the interests of individuals and society. Given the two essential directions, namely the low-cost, stand-alone pursuit of efficiency and the high-cost pursuit of high-quality services closely integrated with accessible infrastructure, the latest advances will gradually allow more freedom for ambient assisted living through robotics and automated machines, while using sensors and human–machine interaction as bridges to promote the synchronization of machine intelligence and human cognition.

1. Introduction

Assistive technology (AT) is a broad field of study that enables people with visual impairments to overcome various physical, social, infrastructural, and accessibility barriers [1]. A central area in the field is the development of assistive technology for people who are blind or have low vision (BLV). According to the World Health Organization, blindness affects approximately 39 million people, and a further 246 million people have low vision [2]. Visual impairment prevents affected individuals from visually accessing a wide range of information about their surroundings, affecting their ability to participate in a range of social and community activities [3] and resulting in barriers that include reduced independence and safety, as well as difficulties in education, learning, and employment. A range of assistive devices for the visually impaired have been developed to overcome these barriers, including electronic travel aids (ETAs), electronic orientation aids (EOAs), hybrid assistive devices (HADs), and activities of daily living (ADL) devices. The development of smaller and increasingly powerful computing hardware permits the design of portable and wearable assistive technologies [4]. Currently, wearable assistive devices for the visually impaired are becoming an increasingly popular direction. Compared with other handheld or portable assistive devices, wearable devices have the advantage of integrating more invisibly into the lives of the visually impaired. They leave the user's hands free and support the simultaneous use of other assistive aids (e.g., canes and guide dogs). The advent of innovative electronic hardware, materials, algorithms, and artificial intelligence technology has enabled the development of sophisticated wearable assistive devices for the visually impaired.
These devices integrate a multitude of advanced functions while maintaining a lightweight, low-power, and highly efficient design, offering users optimal comfort and performance. When wearable assistive devices are developed as part of the personal attire of visually impaired people, they take on fashionable and aesthetic attributes. This allows assistive devices to serve visually impaired people more imperceptibly, thereby fulfilling their desire to integrate into the population. This is of great value in promoting social inclusion and social equity.
This study employed bibliometric techniques to examine the evolution, trends, and focal points of wearable assistive technology for visually impaired individuals over the past decade. On the basis of the study of hotspots over the past few years, we found a gradual shift in research focus from fundamental technological development to more integrated optimization and the exploration of user needs. The most recent research hotspots include multimodal perception and integration technologies, intelligent navigation systems based on deep learning, and interaction design, with the objective of enhancing the user experience. Furthermore, emerging research areas have attracted attention, including the application of interdisciplinary innovative materials, improvements in social acceptance, and addressing aesthetic and emotional needs that extend beyond functionality. However, existing reviews predominantly focus on technical classifications, operating principles, and implementation pathways, with relatively limited discussion on interdisciplinary technological innovation and sociocultural aspects.
This study addresses the following three key areas of research: (1) latest design results in haptic and auditory feedback; (2) application of smart materials in this direction; and (3) conflicting interests of individuals and society regarding the wearing of assistive devices for the visually impaired in public places. It also explores the opportunities and challenges associated with these three directions and makes suggestions for the design of wearable assistive devices based on sensory substitution technology. The design recommendations are provided as a reference for related researchers.

2. Materials and Methods

All data from the literature used in this study were obtained from the Web of Science (WoS) Core Collection, and the data were retrieved using an advanced search in WoS. The final search formula was as follows: TS = ((wearable* OR “portable device” OR “portable technology” OR “body-mounted device” OR “body-worn device”) AND (“blind people” OR “blind person” OR “visually impaired” OR “visually challenge” OR “visual impair” OR “visual challenge” OR “visual impairment” OR “visual impairments” OR “vision impaired” OR blind OR blindness)). The search was conducted on 18 July 2024, and the search parameters were narrowed to articles published between 2014 and 2024 and written in English. The following sources were excluded from the search: review articles, conference abstracts, editorial material, and news items. The remaining articles were filtered, resulting in a total of 1057 articles. Because the basic keywords for wearable assistive devices for the visually impaired typically appear in the author keywords and abstracts, searching by title (TI) alone would have missed a large number of articles; therefore, the bibliometric search was performed on the subject (TS) rather than the title (TI). Subsequently, the literature data were imported into EndNote v2.0 for de-duplication, and articles not related to the study topic were excluded by manually reading the titles, keywords, and abstracts; where relevance was uncertain, the full text was examined before exclusion, resulting in a total of 533 articles related to the study topic.
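The de-duplication step above was performed in EndNote; as a rough illustration of the logic involved, the following hypothetical Python sketch removes records that share a DOI or a normalized title (the field names and matching rules are assumptions for illustration, not the study's actual procedure).

```python
def deduplicate(records):
    """Drop bibliographic records that duplicate an earlier record.

    records: list of dicts with optional 'doi' and 'title' keys.
    A record is a duplicate if its DOI (case-insensitive) or its
    whitespace-normalized, lowercased title was already seen.
    """
    seen_dois, seen_titles, unique = set(), set(), []
    for rec in records:
        doi = (rec.get("doi") or "").strip().lower()
        title = " ".join((rec.get("title") or "").lower().split())
        if (doi and doi in seen_dois) or (title and title in seen_titles):
            continue  # duplicate of an earlier record
        if doi:
            seen_dois.add(doi)
        if title:
            seen_titles.add(title)
        unique.append(rec)
    return unique
```

In practice, reference managers also use fuzzy matching on author and year; exact normalized matching is the simplest variant.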
This study presents a narrative review based on bibliometrics. The initial stage of the study employed CiteSpace, a Java-based system with broad accessibility, to detect and visualize emerging trends and transient patterns in the scientific literature [5]. This study employed bibliometric methods to examine the literature on wearable assistive devices for the visually impaired. These methods include an analysis of author cooperation, literature co-citation, keyword co-occurrence, and keyword clustering. This analysis provides a clear understanding of author cooperation, the most influential authors, research hotspots, and the evolution of research frontiers in this field. The second half of the article, however, initiates the narrative by focusing on one of the more innovative research hotspots, i.e., wearable assistive devices for the visually impaired based on sensory substitution technology.

3. Results

3.1. Overview of the Results of the Bibliometric Analysis

The following parameter settings were applied in the CiteSpace analysis: the time span was from January 2014 to June 2024; the time slice length was one; the subject term sources were fully selected by default; the thresholds were maintained as system defaults; and the paths were simplified using the keyword path method.

3.1.1. Analysis of Annual Publications

The research can be divided into two distinct phases, as shown in Figure 1. The first phase, spanning 2014 to 2018, is characterized by gradual growth, reaching a peak of 70 articles in 2018. During this period, wearable assistive robotics was still in its nascent stages, and the advent of pioneering solutions, such as intelligent navigation systems and obstacle detection systems, contributed to the observed growth. The second phase spans 2018 to 2023. Since 2018, the number of publications has exhibited a downward trend, with studies still being published but at an overall decreasing rate. Around 2018, research on basic technologies began to stabilise, and the focus of related technologies shifted to the application and optimization phase; early results entered commercialization, research hotspots shifted to new directions, and the saturation of research in traditional fields may have led to the decline in the number of publications after 2018.

3.1.2. Most Influential Journals: A Co-Citation Analysis

A co-citation journal map is a valuable tool for researchers, enabling them to assess the impact and status of different journals. As shown in Table 1, in terms of the number of publications, the academic journals Lecture Notes in Computer Science (205) and Sensors (168) were the most prolific. On the other hand, IEEE Transactions on Systems, Man, and Cybernetics: Systems (0.11); IEEE Transactions on Biomedical Engineering (0.11); Disability and Rehabilitation: Assistive Technology (0.11); and Proceedings of the 29th Annual CHI Conference on Human Factors in Computing Systems (0.11) represent academic journals with a greater degree of centrality. As illustrated in Figure 2, these journals encompass a plethora of scientific disciplines, including computer science, engineering and technology, and neurology.

3.1.3. Analysis of the Author Collaboration Network

It should be noted that an author’s affiliation, “Association for Computing Machinery”, was incorrectly identified as author information by CiteSpace. Consequently, the “Association for Computing Machinery” node was removed to ensure the accuracy of the author collaboration network analysis graph.
Table 2 presents a list of the top 10 authors in terms of publications. Figure 3 illustrates that the field has developed a significant collective of authors centred on Kaiwei Wang and Leah Findlater, surrounded by numerous smaller collaborative teams with limited inter-team connections. At the same time, there are also a number of cases in which only two nodes share a single link, as well as isolated nodes. In accordance with Price's law, if the number of papers produced by the most productive author in a specific field is Nmax, then M = 0.749 × (Nmax)^(1/2). In this study, authors who have published at least M articles (rounded upward) are defined as core authors. Here, Nmax is equal to 23, giving an M value of approximately 3.59, which rounds up to 4; that is, those who have published four or more articles are considered core authors. The statistical analysis reveals that there are 15 core authors in the sampled literature, with a total of 114 articles, representing approximately 21.39% of the total number of papers. According to Price's law, a core author group is formed only when the core authors account for at least half of the total research output in a field; as their share is well below this threshold, it can be concluded that a core author group has not yet formed in this research field.
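The core-author threshold above follows directly from Price's law; the short Python sketch below reproduces the arithmetic (Nmax = 23 comes from the data above):

```python
import math

def price_threshold(n_max: int) -> float:
    """Price's law threshold: M = 0.749 * sqrt(N_max)."""
    return 0.749 * math.sqrt(n_max)

m = price_threshold(23)        # ~3.59 for the most productive author's 23 papers
core_min = math.ceil(m)        # authors with at least this many papers are core
```

With Nmax = 23 this gives M ≈ 3.59, so the core-author cutoff rounds up to 4 publications.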

3.1.4. Analysis of Co-Citation Reference Network

In a co-citation reference network, consistently highly cited literature is considered as the classic literature within a field, which may reflect the developmental lineage and research foundations of the field. Table 3 lists information on the top 10 studies in terms of co-citation frequency. As shown in Figure 4, among the top three authors, Bourne focused on global estimates, trends, and projections of global blindness and visual impairment [6]. Aladren proposed a new NAVI system based on visual and range information, a system that assists or guides people with impaired vision by means of sound commands [7]. Elmannai conducted a comparative survey of wearable and portable assistive devices for the visually impaired, identifying the most important devices and highlighting their improvements, advantages, disadvantages, and level of accuracy [8].

3.1.5. Keyword Co-Occurrence and Clustering

Keywords reflect the interrelationships among various topics in the literature and summarize the central themes of an article. Keyword analysis facilitates the study of hotspots in a field. Figure 5 illustrates a keyword co-occurrence map. Keywords with high centrality indicate significant research developments and key turning points. Table 4 and Table 5 show the top 10 keywords in terms of frequency and centrality, respectively. As evidenced in Table 4 and Table 5, a high keyword frequency does not necessarily correspond to high centrality. For example, the node “wearable device” has a frequency of 28, ranking 6th, yet its centrality is 0.06, ranking 17th. Consequently, both the frequency and centrality of keywords must be considered to accurately assess the research hotspots. A synthesis of Figure 5, Table 4, and Table 5 shows that, in addition to the keywords “visually impaired” and “assistive technology”, the research hotspots in this field over the last decade were “computer science”, “instruments and instrumentation”, “computer vision”, “obstacle detection”, “deep learning”, “object recognition”, “materials science”, “ultrasonic sensor”, and “wearable system”.
As shown in Figure 6, the Q-value of the keyword clustering map is 0.6413, which is greater than 0.3, indicating good clustering, and the S value is 0.8731, which is greater than 0.5, indicating that the clustering passes the confidence test. The mapping produced 16 cluster labels, as shown in Table 6. The clusters are numbered #0–#15, and the smaller the cluster number, the more keywords the cluster contains. Clusters #13–#15 each contain fewer than 10 keywords, so their clustering is ineffective, and these three clusters are excluded from discussion. Table 6 provides details of the thirteen valid clusters #0–#12. Most of the links in the plot are contained within the clusters, but several cross-cluster links are also present. Among them, the six clusters #0, #2, #3, #4, #6, and #10 have more cross-cluster links, indicating a high degree of co-citation among these research directions, which are more similar in subject matter. A synthesis of Figure 6 and Table 6 demonstrates that #0 represents the core, indicating that computer science, including fields such as algorithms, data processing, and system development, is the fundamental field of research for these devices. Cluster #1 focuses on the development and application of core technologies for the devices, including sensory substitution technology and assistive functions. Cluster #2 explores device design and user experience. Cluster #8 focuses on innovations in smart materials and research on new sensor materials. Finally, cluster #6 focuses on the application of perception technology in real-world scenarios.
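The validity thresholds applied above (Q > 0.3 for modularity, S > 0.5 for the mean silhouette) can be expressed as a simple check; the values below are those reported for Figure 6, and the rule of thumb follows common CiteSpace practice:

```python
def clustering_is_credible(q: float, s: float) -> bool:
    """Rule of thumb used with CiteSpace maps: modularity Q > 0.3
    suggests a significant cluster structure, and a mean silhouette
    S > 0.5 suggests the clusters are reasonably homogeneous."""
    return q > 0.3 and s > 0.5

# Values reported for the keyword clustering map (Figure 6):
credible = clustering_is_credible(0.6413, 0.8731)  # True
```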
As shown in Figure 7, a timeline map is generated based on the keyword clustering map. As we can see from Figure 7, more hot keywords appear around 2014. This period marked the initial stage of research, with a primary focus on fundamental technology research and development, including visual perception technology [9] and obstacle detection algorithms [10]. From 2015 to 2020, there was a proliferation of both hot keywords and connections, accompanied by a rapid expansion in the technological innovation and application scenarios. The fields involved have expanded to include computer vision [11], augmented reality [12], deep learning [13], multi-sensor fusion [14], etc., with greater emphasis on technology integration and user experience optimization [15]. This proliferation can be attributed to the accelerated advancement of nascent technologies, such as the Internet of Things and deep learning, coupled with an escalating demand in the social market for assistive technologies for the visually impaired and the incremental adoption of products like smart watches and health monitoring devices. After 2020, research topics tend to mature and converge, with a decrease in the density of hot keywords and connections, but there is still some exploratory research (such as smart materials and social assistive technologies) gradually emerging. The clusters from 2014 to 2020 focused mainly on the research and development of hardware and basic technologies, while the clusters in the last five years have paid more attention to how to combine technologies such as multisensory feedback [16], artificial intelligence [17], and new materials [18] to improve device performance and user experience.

3.2. Wearable Assistive Devices for the Visually Impaired Based on Sensory Substitution Technology

Sensory substitution [19], which refers to the transformation of an impaired sensory form into another form perceivable by the user, has been a hotspot in research within the field of wearable visual impairment aids over the last decade. Visually impaired people suffer from visual deficits that prevent them from obtaining sufficient visual information from their surroundings to support them in performing various tasks in their daily lives. Sensory substitution technology has been widely used in wearable assistive devices for the visually impaired, i.e., replacing vision with another form of user-perceivable stimuli (e.g., auditory or tactile stimuli) to provide various forms of feedback from the surrounding environment. There are numerous studies on sensory substitution technologies, making it difficult to comprehensively discuss them all, so this study selected one of the more innovative directions as an introduction.

3.2.1. Haptic Feedback Devices

Auditory and haptic systems are leading directions in sensory substitution technology, and both show promise as practical sensory substitution interfaces. Tactile–vision sensory substitution studies have been carried out by numerous research groups for over a century [20]. Haptic feedback devices use the human skin as a receptor, transmitting information through various sensors, electronic devices, and mechanical elements. They provide tactile information that can be felt through the skin, such as pressure, vibration, and stretching, which is transmitted to the brain through the afferent nerves [21], helping visually impaired people understand and respond to their environment. The skin is the largest organ of the body [22]; thus, there are numerous possibilities for the placement of haptic feedback devices on the body. An ideal placement is one that creates the least obstruction to other important sensory functions and avoids the sensory overload that can arise from interfering with other communication channels (e.g., hearing). Haptic feedback devices can convey simple information, such as path direction, spatial orientation, and obstacle size. Over time, they may evolve to describe more complex information [23], such as three-dimensional space [24], text, diagrams, and emotions. This enables individuals who are visually impaired to perform more complex tasks and fulfil their needs in a more comprehensive manner.
Vibrotactile feedback: Vibrotactile feedback has become a widely used approach because of its fast response, easily perceived signals, and the miniature, inexpensive nature of the devices. Figueroa-Hernandez et al. presented a classic example that uses a smartphone camera and transmits haptic instructions via a wearable vibrating device to assist individuals who are visually impaired in navigating unfamiliar environments [25]. The device has the advantages of a low-profile appearance and low cost, which can serve as foundations for further expansion. As shown in Figure 8, the device was integrated into a fabric vest. The smartphone's camera and the miniature NodeMCU signal generator sent information to a cloud server, allowing a remote instructor to view a real-time image of the environment of a person who is visually impaired and to direct their next move through an application developed in React Native, with feedback provided in the form of vibrations. In [26], a depth camera was integrated into a hand-worn wearable device to guide individuals who are visually impaired in grasping a target object in a desktop environment, delivering orientation information for the target to the user via vibration array feedback. Skulimowski et al. designed a wearable system consisting of a processing unit attached to the user's chest and a haptic belt placed on the user's abdomen [27]. The 20 vibrating actuators on the haptic band indicate the shape and distance of an obstacle through the arrangement and intensity of their vibrations. Sultania et al. focused on the difficulties that visually impaired students face in learning STEM content (diagrams, signals, graphs, etc.) and designed an economically viable, wearable haptic Braille device [28].
The device is capable of changing haptics in real time to map contextual information from a blackboard for students who are visually impaired, allowing them to overcome challenges related to perceiving spatially located alphanumeric characters, which is difficult to achieve with conventional devices. Musical haptic wearables (MHWs) [29] support the potential of individuals who are blind or low vision (BLV) to learn music, and MHWs with vibrotactile alerts and vibrational changes were found to be suitable for assisting in reading music and supporting technical instruction and practice. In addition to the standalone wearable assistive devices for the visually impaired mentioned above, there is also a focus on the deep coupling of devices with accessibility infrastructure. Existing solutions are mainly based on large-scale infrastructure [30], such as prefabricated routes, the placement of tags (VLC, RFID, NFC, etc.), the setup of WiFi access points, and dedicated sensors. Figure 9 shows a typical case based on accessibility infrastructure. Kiyoung et al. improved APS (accessible pedestrian signaling) using outdoor Bluetooth RSSI to locate pedestrians [31]. Eight APS devices with Bluetooth modules were placed at road crossings, where users who were visually impaired could use their smartphones to identify their current location and plan the route and timing for safely crossing the zebra crossing, with the device providing feedback via haptics and voice.
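The APS system in [31] locates pedestrians from Bluetooth RSSI readings. The paper's exact method is not reproduced here; a common way to sketch RSSI-based ranging is the log-distance path-loss model, shown below with assumed calibration values (the 1 m reference power and path-loss exponent are illustrative, not the system's measured parameters):

```python
def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Estimate distance (m) from a Bluetooth RSSI reading using the
    log-distance path-loss model: RSSI = TxPower - 10 * n * log10(d).

    tx_power_dbm: calibrated RSSI at 1 m (a typical beacon value is
    assumed here); path_loss_exponent n is ~2 in free space and larger
    in cluttered street environments.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

In a real deployment, distances estimated from several fixed APS beacons would be combined (e.g., by trilateration) to place the pedestrian at the crossing.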
Electromechanical haptic feedback: These devices transmit skin sensations, such as pressure and stretching, through various mechanical and electronic components at several parts of the body. In [32], a typical example of a wearable electromechanical haptic feedback device was presented to help individuals who are visually impaired navigate independently on a running track by providing the sensation of skin stretching. As shown in Figure 10, the system used an RGB-D camera and a microcontroller to detect the runway and calculate the steering angle for navigation, using servos to control the rotation of two latex straps around the waist that provide the haptic feedback to guide the user. Skin stretching provides a more intuitive and continuous indication of the steering angle than traditional vibration and acoustic feedback, guiding the user to perform the appropriate maneuver more naturally without the need for excessive training. Another illustrative example of exploiting innate human proprioception to reduce potential cognitive load is the Aerial Guide Dog, a helium balloon aerostat drone designed for indoor navigation [33], as proposed by Zhang et al. As illustrated on the left side of Figure 11, the perception module, which is held in the user's hand, provides a real-time traction sensation, as well as directional guidance for navigation. On the right side of Figure 11, Haobin Tan et al. focus on the assistive functions and present a prototype of a “flying guide dog” that is capable of street view semantic segmentation, recognizing traffic lights, and automatically adjusting its movement [34]. In [35], a wearable device based on light haptic cues was presented, consisting of a three-node pneumatically controlled wristband, a servo-controlled forearm attachment mechanism, and a control armband. Friendly, nonintrusive, gentle pressure and drag sensations are generated on the user's wrist and forearm to convey direction and distance cues. Chase et al.
developed a system providing audio and haptic guidance for a user via skin stretch feedback, enabling them to explore haptic graphics on a touchscreen [36]. The system is low cost and allows for editing of the haptic guidance patterns and cues, which can be experienced remotely or reviewed independently by the user. Gandhi et al. explored the possibility of using the tongue as a practical human–machine interface, designing the TVSS as a nonsurgical, non-invasive visual prosthetic system [37].
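To illustrate how a computed steering angle might drive the two skin-stretch straps in the running-track system of [32], the following hypothetical sketch maps the angle to left/right servo rotations; the angle limits, servo range, and linear mapping are assumptions for illustration, not the system's actual control law:

```python
def steering_to_servo(angle_deg: float, max_angle: float = 45.0,
                      max_servo_deg: float = 90.0) -> tuple:
    """Map a navigation steering angle to rotations of the two waist
    straps (illustrative sketch of the skin-stretch idea above).

    Positive angles stretch the right strap, negative the left;
    magnitude scales linearly with the clamped steering angle.
    Returns (left_servo_deg, right_servo_deg).
    """
    a = max(-max_angle, min(max_angle, angle_deg))  # clamp to range
    magnitude = abs(a) / max_angle * max_servo_deg
    if a > 0:
        return (0.0, magnitude)
    if a < 0:
        return (magnitude, 0.0)
    return (0.0, 0.0)
```

A continuous mapping like this is what lets skin stretch convey the steering angle gradually, rather than as discrete vibration alerts.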
Vibrotactile feedback has been extended from simple navigation tasks to complex tasks, such as learning music. As the demands and complexity of information transfer increase, the limited information capacity of a single vibrotactile channel [38] becomes apparent. Achieving good spatial and semantic perception of a user's environment requires extensive training [39], so the feedback must be easy to learn. Current research shows that multiple haptic actuators, or the integration of other sensory feedback, are often necessary to enhance the effectiveness and legibility of complex information transfer. The main challenges in this research include the limited sensitivity of the skin to tactile stimuli and the requirement for a minimum distance between actuators [40]. Further research is needed to determine optimal positions for haptic actuators and appropriate feedback types to improve information recognizability. Multimodal feedback has emerged as a potential solution to these challenges. Research combining VR and AR [41] also suggests that integrating vibrotactile feedback with other sensory modalities could significantly enhance the user experience in future wearable devices.
Electromechanical haptic feedback devices provide more refined haptic feedback through the use of mechanical traction and motor drives. These devices can provide directional guidance through proprioception and reduce the cognitive load on the user [42]. However, they generally suffer from a number of technical bottlenecks, including loud noise [43], heat, high power consumption, bulky size [44], and discomfort [45] when worn. In particular, high power consumption and mechanical wear can affect the lifetime and comfort of a device worn for long periods. Although smart materials (discussed in Section 3.2.3) have the potential to improve device comfort and reduce weight and power consumption, achieving efficient power transmission and feedback [46] remains an unresolved problem. Haptic interfaces remain on the threshold of more comprehensive development [47]. In the future, the further development of smart materials, flexible electronics, and multimodal technology is expected to solve these problems and provide visually impaired people with a more efficient, comfortable, and varied tactile feedback experience.

3.2.2. Auditory Feedback Devices

The use of auditory feedback to provide spatial information for the visually impaired requires less training time than haptic feedback and is more advantageous in terms of size and cost. Research on auditory feedback in wearable assistive devices for the visually impaired explores various techniques and methods to improve navigation and interaction with the environment. One approach is to use computer vision and artificial intelligence to provide real-time auditory feedback, integrating image captioning [48], face recognition [49], and depth estimation to give visually impaired people a comprehensive understanding of their environment. The visual substitution system proposed in [50] converted visual data into speech, conveying precise path information and obstacle cues to the visually impaired in order to enhance the safety and situational awareness of people travelling independently. To enable people who are visually impaired to interact confidently with people they know, Sabarika et al. proposed a more integrated system that recognizes familiar faces and sends personalized voice messages, while also recording conversations with strangers and providing remote monitoring and timely assistance services [51]. This enables people who are visually impaired to interact confidently with acquaintances and empowers them to make informed decisions regarding strangers. Jayakumar gave greater consideration to the social interactions of people who are visually impaired, proposing a novel speech-assisted facial emotion recognition system to enhance their understanding of others' emotions in social interactions and to help people who are blind navigate social and emotional environments more effectively [52].
Another approach focuses on spatial audio in virtual reality, e.g., [53], assessing the effectiveness of spatial audio and speech cues in enhancing object perception and navigation. Finally, in terms of applications, intelligent assistance systems use handy wearable devices with cameras to convert images into auditory feedback, meeting the needs of user-friendly and cognitively efficient design. Google Glass has become an increasingly popular research topic in the field of wearable assistive devices for the visually impaired because of its attractive design, ease of wear, powerful hardware features, and ability to implement a variety of applications. In [54], a typical use case was presented, as shown in Figure 12, in which the system uses a camera embedded in smart glasses to capture images of the surrounding environment, analyzing them using a Custom Vision application programming interface (Vision API) from Microsoft Azure Cognitive Services. Users hear speech converted from the Vision API output through the smart glasses' speakers, obtaining more detailed information about a scene. The development of AI technology has further broadened the prospects for smart glasses in this area, making it possible to provide reliable and powerful real-time solutions through voice feedback. Gupta et al. proposed a smart glasses design that uses built-in voice assistants and sensors, allowing users to easily access internet content through AI voice broadcasting of search results and expanding the possibilities for people who are visually impaired to handle complex tasks [55].
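The smart-glasses workflow above (capture an image, obtain a scene description from a vision API, speak the result) can be sketched as a small pipeline. The function names and confidence gating below are hypothetical stand-ins, not the actual Azure or Google Glass APIs:

```python
def describe_and_speak(frame, caption_fn, tts_fn, min_confidence=0.5):
    """Minimal image-to-speech pipeline sketch.

    caption_fn(frame) -> (description_text, confidence): any vision
    back-end that captions an image (a stand-in for a Vision API call).
    tts_fn(text) -> audio: any text-to-speech back-end.
    Low-confidence captions are replaced with a retry prompt so the
    user is not given unreliable scene information.
    """
    text, confidence = caption_fn(frame)
    if confidence < min_confidence:
        text = "Scene unclear, please try again."
    return tts_fn(text)
```

Separating the caption and speech back-ends behind plain callables keeps the pipeline testable offline, with cloud services swapped in only on-device.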
The main challenges in auditory feedback research are environmental adaptability, cognitive load, and user acceptance [56]. External noise in the travel environment can mask auditory feedback, so techniques such as volume amplification or adaptive equalization are required to ensure the audibility of commands [57]. The effectiveness of auditory feedback is also often compromised by high cognitive demands and the need for intuitive processing. Ref. [56] points out that the high cost of auditory feedback devices and the high cognitive load caused by difficult-to-use systems can reduce user acceptance. Dynamic outdoor environments require devices that can reliably detect and communicate various obstacles and environmental changes, while auditory feedback may obscure important environmental sounds, creating safety risks during navigation. Ref. [58] proposes to address this problem through a multimodal communication framework, using other feedback channels to compensate where auditory feedback falls short.
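A minimal sketch of the volume-adaptation idea, with illustrative thresholds rather than values from [57]: keep spoken feedback a fixed margin above the ambient noise level, clamped to a hearing-safe output range.

```python
def feedback_volume(ambient_db: float, margin_db: float = 10.0,
                    min_db: float = 55.0, max_db: float = 85.0) -> float:
    """Output level (dB SPL) for spoken feedback given ambient noise.

    Keeps speech `margin_db` above the measured ambient level so it
    stays audible, clamped between a floor (always perceivable) and a
    ceiling (hearing safety). All four values are assumptions for
    illustration, not figures from the cited study.
    """
    return min(max(ambient_db + margin_db, min_db), max_db)
```

The clamping also addresses the safety concern raised above: capping output prevents the device from drowning out important environmental sounds in very noisy streets.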

3.2.3. Application of Smart Materials

As electromechanical haptic feedback devices provide natural and complex haptic feedback, they usually require many mechatronic components, leading to challenges such as large size, heavy weight, high energy consumption, mechanical susceptibility to wear and tear, and devices that are uncomfortable and difficult to wear. Researchers foresee low-cost, lightweight, compact, energy-efficient, and highly mobile devices capable of delivering diverse haptic feedback modalities in the near future [59]. In recent years, the application of smart materials to wearable assistive devices for the visually impaired has become a popular trend.
Shape memory alloys have become a relatively mature smart material in the field of wearable assistive devices for the visually impaired due to their high controllability, reliability, and wide use in miniaturized actuators [60]. They can provide precise shape recovery or force feedback through temperature changes and are often used for dynamic haptic feedback and device fit adjustment [61,62] to improve user experience and device comfort. In [63], a classic case is presented: a haptic display combining a shape memory alloy (SMA) actuator and a vibration motor that is both practical and inexpensive. As shown in Figure 13, the wearable device consists of two sub-devices: real-time object detection and haptic information presentation. Environmental information is acquired by the camera; a compressed YOLOv3 model, deployed on a Raspberry Pi 3B+ and accelerated by an NCS2, then detects pedestrians in front of the user in real time, and the detection results are transmitted wirelessly to the haptic presentation device over a socket connection. Although the vibration frequency of the vibration motor cannot be changed, the SMA actuator can generate vibrations with different frequencies and intensities, which the user perceives as distinct tactile sensations. The haptic display interface combining the SMA actuator and the vibration motor is therefore able to present different tactile sensations to convey obstacle information. Ghodrat et al. noted the potential of SMA-based haptic feedback for visually impaired people, prototyping and evaluating the form factors and body locations at which haptic feedback is most readily perceived [59]. They demonstrated that a spring-form SMA wire acting as a free-moving effector on the skin was the most feasible, and that two types of motion, squeezing and sliding on the arm, could be effectively perceived.
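The distance-to-vibration mapping and socket framing of such a system might be sketched as below; the frequency range, intensity scaling, and frame layout are our own assumptions for illustration, not the parameters used in [63]:

```python
import struct

def haptic_command(distance_m: float) -> bytes:
    """Map a detected pedestrian's distance to an SMA vibration command
    and pack it for socket transmission.

    Closer obstacles map to higher frequency and intensity, echoing the
    frequency/intensity coding described for the SMA display; the
    0.2-5.0 m sensing range, 1-10 Hz band, and '!ff' frame layout are
    hypothetical choices, not the cited system's values.
    """
    d = min(max(distance_m, 0.2), 5.0)       # clamp to assumed sensor range
    freq_hz = 1.0 + (5.0 - d) / 4.8 * 9.0    # 1 Hz (far) .. 10 Hz (near)
    intensity = (5.0 - d) / 4.8              # 0.0 (far) .. 1.0 (near)
    # '!ff' = network byte order, two 32-bit floats: a compact fixed frame
    # the receiving Raspberry Pi-side presenter can unpack cheaply.
    return struct.pack("!ff", freq_hz, intensity)
```

A fixed binary frame like this keeps the wireless link latency low and the parsing on the haptic side trivial, which matters when feedback must track a moving pedestrian in real time.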
A wearable system proposed in [64] recognizes obstacles ahead of the user and measures their distance in real time, suggesting safe movements via the vibration patterns of haptic gloves with SMA actuators woven into them.
Functional materials with actuation capabilities enable mechanical movement or deformation in response to external stimuli [65], providing power and motion functionality to wearable devices. In addition to relatively mature materials, such as shape memory alloys, electroactive polymers (EAPs), which undergo large deformations under an electric field [66], are being used to create lightweight and flexible haptic feedback devices to assist the visually impaired with navigation. Liquid crystal elastomers, known for their sensitivity to light or temperature stimuli, have been used in dynamic display devices to provide multimodal feedback that effectively aids Braille recognition and path guidance [67]. Sensing functional materials form the core of sensing and information acquisition in wearable devices, with flexible pressure sensors and electronic skin technologies developing rapidly in this field [68]. Liu et al. used a strategy combining flexible sensing with a memristor-based artificial neural network to design a wearable and low-cost Braille recognition system [69]. They fabricated a flexible polydimethylsiloxane (PDMS)-based pressure sensor to construct an electronic skin (E-skin) for Braille recognition. Similarly, smart fabrics based on MXene-coated yarns [70] demonstrated excellent pressure responsiveness, enabling independent sensing and high-resolution signal acquisition for tactile Braille recognition, providing innovative solutions for text learning and communication among visually impaired individuals. In the field of signal transmission and processing, materials that combine conductivity and flexibility are a major focus of research. Fully textile-based electrostatic sensors [71], integrating conductive fibers with flexible materials [72], have enabled an efficient Braille-to-speech conversion system. This system can recognize Braille in real time and vocalize it, significantly improving the efficiency of information transfer. 
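The final decoding step of a pressure-based Braille reader can be illustrated with the standard Unicode Braille Patterns encoding, in which dots 1-6 map to bits 0-5 of an offset from U+2800; the sensor ordering and pressure threshold below are assumptions, not details of the cited E-skin systems:

```python
def decode_cell(pressures, threshold=0.5):
    """Convert six pressure-sensor readings into a Unicode Braille char.

    `pressures` is assumed ordered as Braille dots 1-6 (left column
    top-to-bottom, then right column); `threshold` on the normalized
    reading is illustrative. Unicode Braille Patterns place dot n at
    bit n-1 of the offset from U+2800, so the mask maps directly.
    """
    mask = 0
    for bit, p in enumerate(pressures):
        if p >= threshold:
            mask |= 1 << bit
    return chr(0x2800 + mask)
```

Because the mapping is a pure bitmask, it runs comfortably on the low-power microcontrollers these flexible-sensor systems use; classification effort (as in the memristor-network approach of [69]) is only needed when raw sensor readings are noisy or overlapping.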
With the growing emphasis on environmental protection, the application of eco-friendly smart materials in assistive devices for the visually impaired is gaining attention. Arbaud et al. developed the first eco-friendly wearable vibrotactile device [73], significantly reducing the proportion of non-degradable plastics in the device through the extensive use of bio-based conductive inks and biodegradable composite materials. This sustainable material innovation holds potential for large-scale production and deployment while minimizing ecological impact.

3.2.4. Conflicting Interests of the Individual and Society

Social acceptance is a key factor in the widespread adoption of wearable visual impairment aids. For wearable devices to fulfil their potential and for users to adopt a particular wearable device, people must first deem the device acceptable for themselves and others to wear [74].
For individuals, acceptance of a device is influenced by a combination of factors, such as functionality, the environment in which it is used, comfort, aesthetics, emotional needs, and evaluations by others. Although they benefit from assistive technology (AT), visually impaired people may attract unwanted attention from bystanders when using assistive devices that disclose their disability in public. Feeling stigmatized or placed under intense psychological pressure can lead users to abandon or refuse an assistive device. Moreover, many assistive devices focus on producing technology-based tools and overlook the practical needs and problems users encounter in daily life, so users frequently experience aesthetic and emotional discomfort beyond issues of usability and accessibility [70]. Profita found that many users hide their devices or repurpose them to make them appear less assistive [4]. Alternatively, they may choose mainstream technology (e.g., iPhones) rather than highly specialized devices in order to avoid unwanted attention.
For the community, public concerns about wearable assistive devices for the visually impaired center on personal privacy. With the development of computer vision technology, many wearable assistive devices for the visually impaired are fitted with always-on cameras that may trigger negative social reactions. Denning et al. report that bystanders do not want to be recorded without their permission or knowledge [75]. In extreme cases, negative reactions to wearable computers can lead to censorship or even bans on these devices [76], as happened with Google Glass in some public places. Past studies [4,76] showed that bystanders view a device more positively when they know it serves an assistive purpose or understand the environment in which it is used. Paradoxically, the desire of people who are visually impaired not to disclose their disability in public conflicts with the desire of bystanders to be informed about assistive devices, requiring researchers to consider how to balance these interests at the design stage. Profita suggested several ways to reconcile the two in conjunction with societal norms and policies: using more generic images or symbols of disability, building in features that can be turned on and off and disclosed at the user's discretion, and modifying the design or software of the assistive tools themselves [4].
In addition, existing research on wearable assistive devices for the visually impaired focuses more on replacing users' functional deficits, neglecting needs of people who are visually impaired that go beyond functionality, such as aesthetic and emotional needs. Although people with disabilities are increasingly present in social and professional life, research on the fashion-related motivations and demands of consumers with disabilities remains limited [77]. Past studies [78,79] found that visually impaired people show high interest in their visual appearance and endeavor to find products that go beyond technology-based usability to suit their preferences. The authors of [70] pointed to more specific needs, emphasizing that visually impaired people are very sensitive to the materials and finishes of objects and assess the aesthetic quality of a product's appearance and shape through touch. As a result, they may express dissatisfaction with a device's rough, non-smooth finish and with overt disclosure of disability information. In addition, they would like to be able to assess the level of contamination and manage the maintenance of their wearable items themselves, dispelling the assumption that visually impaired people neglect the upkeep of their personal wear. Profita noted that personalizing a device's appearance (i.e., bespoke aesthetics) helps counteract the stigma associated with assistive devices and inspires users to express their individuality by displaying their interests externally [4]. At the same time, understated designs, or looks that reflect mainstream fashion, are perceived as appealing to users.
In recent years, increasing research has focused on the aesthetic and emotional needs of the visually impaired, with more consideration given to the user's experience. The smart backpack based on visible light communication (VLC) introduced in [80] is a typical example; VLC technology is often applied in accessible infrastructure to assist people who are visually impaired with indoor navigation, as shown in Figure 14. The prototype was housed entirely in a backpack, avoiding both drawing the attention of bystanders and emphasizing the user's impairment. The smart backpack converts data carried in the light of the indoor lighting system into audio or haptic information that the visually impaired can perceive. Users request location information for different points of interest through gesture recognition, which, combined with obstacle detection and vibrotactile feedback on the backpack, enhances their mobility and safety in unfamiliar public places. Aziz et al. proposed an affective design model [81] specifically tailored to visually impaired users, combining a triangulation methodology (ITM) with expert assessment to ensure a holistic interactive experience that emphasizes affective interaction, emotion, meaningful delivery, confidence, social contact, cognitive engagement, and curiosity. Kim et al. proposed that products for visually impaired people should be designed with attention to social factors and consumers' emotions, attitudes, and behaviors, considering their impact when making decisions about form, materials, details, and other design factors [70]. They adopted an interdisciplinary, design-process-centred approach, aiming to integrate different perspectives, foster creativity, and provide holistic solutions while taking wider implications into account, including sustainability, social ethics, and equity. Ortiz-Escobar et al. also support the positive impact of interdisciplinary research, arguing that it reduces the methodological shortcomings observed in today's literature and helps researchers develop designs that better meet the social, physiological, cultural, and technological needs of target users [82]. At the same time, they criticized the currently fashionable approach, arguing that users should be involved throughout the design and development process rather than only in testing the final product.

4. Discussion

This study presents a bibliometric analysis of research on wearable assistive devices for the visually impaired using CiteSpace. It addresses the following questions: How many studies on wearable assistive devices for the visually impaired were published in WoS over the last decade? Which journals have had the greatest influence? Which authors have been the most active? Is there a core group of authors? Which studies are the most classic in the field? Which research topics and trends have been most prevalent over the last decade? Because of the long cycle of data collection and compilation, and the time required for new research to disseminate, bibliometric analysis can identify long-term research hotspots and trends but struggles to reflect emerging research and innovation in a timely manner. We therefore reviewed newly published literature guided by the keywords and clustering results of the bibliometric analysis and found that wearable assistive devices for the visually impaired based on sensory substitution technology have produced many recent innovations and discussions, which can be summarised under the following three directions: sensory substitution technology for wearable assistive devices, the application of smart materials, and the conflicting interests of the individual and society. Some of these issues require interdisciplinary research, and review articles remain scarce. Consequently, the second half of this study centres on these three directions, with the aim of providing inspiration and suggestions for future research.

4.1. Summary of the Results of the Bibliometric Analysis

Through a bibliometric analysis, we found that research on wearable assistive devices for the visually impaired over the last decade is cross-disciplinary and combines academia with practice. Growth in research from 2014 to 2018 was driven by the development of technologies such as intelligent navigation systems and obstacle detection. From 2018 to 2023, research enthusiasm declined but remained broadly stable, possibly due to gradual saturation of research in this area or a shift in focus to other directions. Collaborative networks in this field are fragmented, low in centrality, and mostly small-scale, and collaboration among teams is insufficient. Authoritative collaborators have not yet emerged, and research in this field remains in a discrete state. Keyword co-occurrence and clustering analyses reveal that research in this field is developing towards intelligent and wearable technologies.

4.2. Optimizing Haptic and Auditory Information Transmission While Multitasking

In recent years, wearable assistive devices for the visually impaired have made significant advances in haptic and auditory feedback technology, but some technical bottlenecks remain. Tactile feedback, especially vibrotactile feedback, can effectively convey navigation information and simple environmental cues. However, the information capacity of tactile devices is limited [83], and in complex situations, such as multi-tasking environments, effectively conveying information through limited sensory input remains a challenge [84]. Current vibrotactile devices mostly rely on a single feedback mode, which limits their adaptability to complex environments or tasks. There are also issues with comfort and learning adaptability: users need extended training to become accustomed to the feedback and interpret it correctly. Electromechanical haptic feedback devices have made progress in reducing cognitive load and improving perception. For example, devices based on pneumatic control or servomotors can exploit human proprioception to achieve more natural feedback [85], but they are typically power-hungry, noisy, large, and heavy, which limits their widespread use. In addition, the comfort and convenience [86] of such devices remain key design challenges. The application of smart materials, such as shape memory alloys and flexible sensors, has already provided some solutions. By using lighter, more flexible, and more efficient materials, future haptic feedback devices could achieve breakthroughs in size, power consumption, and comfort.
The following recommendations address the above problems and possible solutions: 1. Increase the capacity of haptic feedback and multimodal integration: given the limited information capacity of haptic feedback, multimodal fusion of haptic vibration with other forms of feedback should be explored to meet the needs of multitasking in complex environments. Existing research has shown that combining haptic feedback with visual and auditory information can improve the accuracy and efficiency of users' perception of environmental information [87]. Future research should consider the synergy between the senses when designing prototypes, reducing sensory conflict in multitasking situations and optimizing the effectiveness of information transfer. 2. Optimize device design to improve comfort and wearability: new materials, such as flexible materials and shape memory alloys, can address the bulk, noise, and power consumption of existing electromechanical haptic devices, making them lighter and more adaptable. User experience research should also pay more attention to the comfort, convenience, and long-term wearability of devices, helping to optimize designs through comprehensive evaluation. 3. Usability research in multitasking scenarios: most studies test the effectiveness of haptic or auditory feedback devices only in single-task scenarios and lack in-depth research on device performance in multi-task environments. Future research should extend to complex multitasking scenarios and combine psychophysical experiments to determine the optimal combination of feedback forms [88]. In addition, user experience testing should be conducted in real-world environments, covering cognitive load, attention allocation, ease of use, and overall satisfaction, to ensure the effectiveness and usability of devices in multi-task scenarios. 4. Investigate AI and adaptive feedback mechanisms: with the development of artificial intelligence, AI assistants in wearable assistive devices will become an important way to enhance device intelligence. Future research should combine AI voice assistants with adaptive feedback mechanisms so that devices can dynamically adjust their feedback methods [89] based on user behaviour and environmental changes, improving the level of personalized service [90]. In addition, customized designs based on user needs and preferences will help improve the learning adaptability of the device and user acceptance.
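Recommendation 4 above can be illustrated with a deliberately simple rule-based policy; all thresholds here are hypothetical, and a real adaptive system would tune them from user behaviour and context rather than hard-coding them:

```python
def choose_modality(ambient_db: float, hands_busy: bool,
                    cognitive_load: float) -> str:
    """Rule-based sketch of an adaptive feedback policy.

    cognitive_load ranges from 0.0 (idle) to 1.0 (fully occupied).
    The 75 dB and 0.7 cut-offs are assumptions for illustration,
    not values from the cited studies.
    """
    if ambient_db > 75:          # speech would be masked by street noise
        return "haptic"
    if cognitive_load > 0.7:     # keep messages short and tactile
        return "haptic"
    if hands_busy:               # e.g. holding a cane and a shopping bag
        return "audio"
    return "audio+haptic"        # redundant cues when capacity allows
```

Even this toy policy shows the core trade-off the section discusses: the device must pick the channel the user can currently attend to, falling back to redundant multimodal cues only when neither channel is saturated.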

4.3. Advancing the Cross-Disciplinary Application of Smart Materials in Wearable Devices

As the limitations of traditional electronic materials become increasingly apparent, smart materials offer new opportunities for the next generation of wearable assistive devices for the visually impaired, particularly in terms of mechanical flexibility, transparency, and integration [91,92], where they hold great advantages. In particular, smart materials such as the aforementioned shape memory alloys and electronic skin have shown great potential for haptic feedback and flexible sensing, improving wearer comfort and device adaptability. However, the practical application of smart materials still faces several technical challenges. First, interactions between the material and the skin [93] (e.g., sweat, oil) can affect the long-term stability and comfort [94] of the device. In addition, although electrostatic actuators and conductive textiles excel in comfort and flexibility, their response speed and accuracy [65] still need improvement. In terms of sustainability, although environmentally friendly polymer materials have been investigated, their cost and reliability in mass production and practical applications [95] require further verification.
The following research recommendations are therefore proposed: 1. Multi-functional integration and material optimization: future research should focus on optimizing material properties, such as conductivity, flexibility, and durability, and combine multifunctional integration to meet the needs of lightweight, high-performance devices. In particular, for haptic feedback devices, balancing comfort with high-precision feedback remains an urgent problem. 2. Interdisciplinary cooperation and innovative design: given the complexity of smart materials, interdisciplinary collaboration across materials science, mechanical engineering, electronics, and biomedicine is recommended to accelerate the application and iteration of new materials in wearable visual aids. 3. User experience and sustainability assessment: the development of smart materials should consider not only the functionality of the materials but also the long-term user experience of the device, such as durability, comfort, and adaptability to the skin. In addition, the production processes and costs of sustainable materials should be explored to reduce the environmental footprint of devices and ensure their affordability and feasibility.

4.4. Balancing the Interests of the Individual and Society

In response to public concerns about privacy and security with wearable assistive device cameras, this study suggests that devices consider using algorithms that can blur faces or encrypt data. Processing the data locally instead of sending it to the cloud allows the user to control whether facial recognition functions are turned on or off to balance the need of individuals to benefit from assistive devices with the need to protect the public’s privacy. Consideration also needs to be given to curbing the illegal collection of others’ private information through social norms and policies, and greater social awareness of new technologies is needed to mitigate negative public perceptions due to lack of understanding.
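Local, irreversible redaction can be as simple as pixelating the detected face region on-device before any frame is stored or transmitted. This sketch assumes a grayscale image held as a list of rows and a bounding box supplied by a separate face detector; it is an illustration of the principle, not a production privacy filter:

```python
def pixelate_region(img, x0, y0, x1, y1, block=8):
    """Coarsen a face bounding box in-place before a frame leaves the device.

    `img` is a grayscale image as a list of rows of integer pixels;
    (x0, y0)-(x1, y1) is the detector's bounding box (assumed inputs).
    Each `block` x `block` tile inside the box is replaced by its mean,
    discarding the detail needed to recover an identity.
    """
    for by in range(y0, y1, block):
        for bx in range(x0, x1, block):
            ys = range(by, min(by + block, y1))
            xs = range(bx, min(bx + block, x1))
            mean = sum(img[y][x] for y in ys for x in xs) // (len(ys) * len(xs))
            for y in ys:
                for x in xs:
                    img[y][x] = mean
    return img
```

Because averaging destroys information rather than hiding it behind a key, the redaction cannot be undone downstream, which is exactly the guarantee bystanders are asking for when data never leaves the wearable unprocessed.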
For people who are visually impaired and do not wish to disclose their disability, a desire that conflicts with bystanders' wish to be informed about assistive devices, this study suggests that disability information can be conveyed using more generic or symbolic graphics or text, with disclosure left to the user's discretion or shown only to specific groups (e.g., service personnel at a public facility) to avoid assumptions about the disability. Researchers can also make the appearance of the device as discreet as possible to avoid this problem, provided the device's assistive features are not actually designed in a way that could invade the privacy of others.
Wearable assistive devices for the visually impaired should fully consider their users' characteristics and needs; on the basis of usability and safety, they should appeal to the sense of touch, with aesthetic qualities their users can appreciate. In addition, thanks to the development of smart materials, wearable assistive devices for the visually impaired can be designed as lightweight, body-fitting clothing and accessories. As a result, they may acquire decorative and fashionable attributes that go beyond assistive functions. In the future, researchers may need to treat wearable assistive devices as part of a person's attire, taking into account fashion and aesthetic preferences, the capacity for independent daily management and maintenance, customization, and personalized expression. Physical and psychological differences between people who are visually impaired and those who are not can lead to significant differences in aesthetic and emotional needs, and even among people with visual impairment, needs differ with gender, age, and cultural background. As research on visually impaired people's motivations for aesthetic fashion consumption is limited, this study encourages researchers to supplement work in this area.
The adoption of wearable assistive devices is influenced by complex factors, including sociocultural considerations (e.g., stigma), personal values, and motivations associated with AT usage [4]. We found that the acceptance of new technologies or devices by people who are visually impaired may change over time. For example, before Google Glass was popular, the visually impaired preferred to use traditional devices, such as guide canes, to avoid attracting too much attention in public places. However, with the popularity of Google Glass, an increasing number of people are willing to use smart glasses, praising their performance and sophisticated and lightweight appearance, as if they were wearing a popular piece of technology rather than an assistive device for a disability. We also found that many studies only invite people with visual impairments to participate in testing the usability of the final product, and very few invite them to participate in the design and development processes. This situation may easily lead to the final product not meeting the real everyday needs of the visually impaired or being discarded because of social acceptance issues. This study suggests that it is necessary for researchers to consider the impact of current sociocultural factors or those predicted over the next few years; to invite people with visual impairments or researchers from other fields to participate in the design and development processes; and to increase testing of their social acceptance to avoid the design being affected by the limitations of single-domain knowledge and perceptions.

4.5. Two Future Developments

Summarizing the research and discussion of the three directions above, we believe that, in the future, the design of wearable assistive devices for the visually impaired based on sensory substitution technology will move towards comprehensive solutions that are lightweight, energy efficient, highly integrated, and highly intelligent, take full account of the aesthetic and emotional needs of the user, and are easy to maintain and manage. We see two directions with great potential that researchers should further explore and expand. The first is the low-cost pursuit of efficiency and independence; the second is the high-cost pursuit of high-quality services deeply coupled with the construction of barrier-free infrastructure. These two directions reflect the uneven development among regions and populations in terms of economy, technology, and accessibility infrastructure.
The low-cost, efficiency-seeking, stand-alone direction pursues the simplification and diffusion of wearable assistive devices for the visually impaired so that more people can afford devices that meet their basic daily needs. It calls for simpler designs, more economical materials, and the optimization of existing technologies to reduce production and usage costs. Specifically, a simple assistive device can be built around a user's existing smartphone, connecting via Bluetooth or other wireless technologies, using the phone's camera, sensors, and computing power, possibly combined with cloud processing, to interpret the collected environmental data, and guiding the user via haptic feedback or voice prompts through a headset. Such devices can be integrated into belt pouches, waistcoats, and jackets: the smartphone is easy to place and operate, and the user can comfortably and reliably perceive the haptic feedback. This allows an unobtrusive, everyday appearance that does not draw undue attention from onlookers. Haptic feedback devices can combine vibrating actuators with low-cost new material components (e.g., incorporating SMA wires into haptic actuators) to provide richer haptic information. They are inexpensive, provide easy-to-sense feedback, operate efficiently at low power, and are easily integrated into textiles.
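The computational core of such a low-cost device can be very small indeed. A parking-sensor-style mapping from obstacle distance to vibration pulse rate, sketched below with illustrative distance and timing ranges of our own choosing, is enough to drive a phone's vibration motor or a simple SMA actuator:

```python
def pulse_interval_ms(distance_m: float,
                      near: float = 0.3, far: float = 3.0,
                      fastest: int = 80, slowest: int = 1000) -> int:
    """Milliseconds between haptic pulses for a given obstacle distance.

    Closer obstacles pulse faster, like a car parking sensor. Linear
    interpolation between `fastest` (at `near`) and `slowest` (at
    `far`); all four range values are illustrative assumptions, not
    figures from a specific device.
    """
    d = min(max(distance_m, near), far)
    t = (d - near) / (far - near)        # 0.0 at `near`, 1.0 at `far`
    return int(fastest + t * (slowest - fastest))
```

Because the mapping is stateless and integer-valued, it runs on the cheapest microcontroller or entirely on the phone, keeping the bill of materials down, which is the whole point of this design direction.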
The high-cost direction, pursuing high-quality services deeply integrated with accessibility infrastructure, builds on existing mature devices and platforms. High-performance wearable assistive devices for the visually impaired will be integrated in depth with smart cities and accessible infrastructure to provide high-precision navigation and environmental sensing. Meanwhile, seamless integration with surrounding smart infrastructure (bus stops, traffic lights, indoor lighting systems, indoor navigation systems, etc.) can improve the mobility and safety of visually impaired people. Google Glass is an ideal platform, with highly integrated functions, an elegant and lightweight appearance, and high social acceptance. Its powerful sensors, computing power, and varied applications not only help people who are visually impaired complete basic tasks, such as navigation and obstacle detection, but can also pair AI-based voice assistants with a variety of wearable haptic feedback devices to provide real-time Internet access for shopping, learning, entertainment, and other diverse needs. As this direction pursues high-quality services, wearable haptic feedback devices can apply more smart materials and soft wearable technologies. While achieving high functional efficiency, the "fashion" aspect and the maintenance and management needs of the wearable device should be further considered, satisfying the higher aesthetic and emotional needs of the visually impaired.

5. Conclusions

Wearable assistive devices for the visually impaired are an important aspect of assistive technology and robotics research. This study is a bibliometric-based narrative review of the research on wearable assistive devices for the visually impaired over the past decade, and it provides researchers with insights into the current status of research and the trends and hotspots in this field. Wearable assistive devices based on sensory substitution technology are a prominent direction of research among the hotspots in this field, giving rise to numerous innovative interdisciplinary designs and studies. It was found that advances in sensory substitution technologies have provided more innovative feedback methods and feedback interfaces for wearable assistive devices for the visually impaired, offering promising solutions for achieving accessibility and independence in the daily lives of the visually impaired. The application of smart materials in this field has been particularly effective, helping to overcome the limitations of traditional electronic materials and enabling these wearable assistive devices to be thin, lightweight, low power, and comfortable to wear, fitting the body’s curves. However, acceptance by individuals and society is becoming an important issue for the adoption of wearable assistive devices for the visually impaired, and balancing the interests of both parties is key to solving this challenge. In turn, the following recommendations for the design of wearable assistive devices are proposed: (1) optimize the transmission of haptic and auditory information while multitasking; (2) advance research on smart materials and foster cross-disciplinary collaboration among experts; and (3) balance the interests of individuals and society. 
In addition, the following two possible directions for future development are proposed: low-cost pursuit of efficiency and independence, and high-cost pursuit of high-quality services that are deeply integrated with accessibility infrastructure.

Author Contributions

Conceptualization, X.Z. and X.X.; methodology, X.H., X.Z. and X.X.; software, Y.D. and X.H.; validation, Y.D., X.H. and W.L.; formal analysis, X.Z. and X.H.; writing—original draft preparation, Y.D., L.L. and X.H.; writing—review and editing, X.H. and X.X.; visualization, X.H., L.L. and W.L.; supervision, X.X.; project administration, X.Z. and X.X.; funding acquisition, X.Z. and X.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by grants from the Guangdong Philosophy and Social Science Foundation 2023 (GD23XYS016) and the Humanity Design and Engineering Research Team (263303306).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Bhowmick, A.; Hazarika, S.M. An insight into assistive technology for the visually impaired and blind people: State-of-the-art and future trends. J. Multimodal User Interfaces 2017, 11, 149–172. [Google Scholar] [CrossRef]
  2. Blindness and Vision Impairment. Available online: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment (accessed on 15 July 2024).
  3. Trillo, A.H.; Dickinson, C.M. The Impact of Visual and Nonvisual Factors on Quality of Life and Adaptation in Adults with Visual Impairment. Investig. Ophthalmol. Vis. Sci. 2012, 53, 4234–4241. [Google Scholar] [CrossRef] [PubMed]
  4. Profita, H.P. Designing Wearable Assistive Computing Devices to Support Social Acceptability and Personal Expression. Ph.D. Thesis, University of Colorado, Boulder, CO, USA, 2017. [Google Scholar]
  5. Chen, C.M. CiteSpace II: Detecting and visualizing emerging trends and transient patterns in scientific literature. J. Am. Soc. Inf. Sci. Technol. 2006, 57, 359–377. [Google Scholar] [CrossRef]
  6. Bourne, R.R.; Flaxman, S.R.; Braithwaite, T.; Cicinelli, M.V.; Das, A.; Jonas, J.B.; Keeffe, J.; Kempen, J.H.; Leasher, J.; Limburg, H. Magnitude, temporal trends, and projections of the global prevalence of blindness and distance and near vision impairment: A systematic review and meta-analysis. Lancet Glob. Health 2017, 5, e888–e897. [Google Scholar] [CrossRef]
  7. Aladrén, A.; López-Nicolás, G.; Puig, L.; Guerrero, J.J. Navigation Assistance for the Visually Impaired Using RGB-D Sensor with Range Expansion. IEEE Syst. J. 2016, 10, 922–932. [Google Scholar] [CrossRef]
  8. Elmannai, W.; Elleithy, K. Sensor-Based Assistive Devices for Visually-Impaired People: Current Status, Challenges, and Future Directions. Sensors 2017, 17, 565. [Google Scholar] [CrossRef]
  9. Bismark Kweku, A.A. Development of a Wearable Assistive Device for Navigation for the Visually Impaired with Command and Request Support. Ph.D. Thesis, Soka University, Tokyo, Japan, 2024. [Google Scholar]
  10. Almajdoub, R.A.; Shiba, O.S. An Assistant System for Blind To Avoid Obstacles Using Artificial Intelligence Techniques. Int. J. Eng. Inf. Technol. (IJEIT) 2024, 12, 226–238. [Google Scholar] [CrossRef]
  11. Panwar, M.; Dhankhar, A.; Rajoria, H.; Soreng, J.; Batsyas, R.; Kharangarh, P.R. Innovations in Flexible Sensory Devices for the Visually Impaired. ECS J. Solid State Sci. Technol. 2024, 13, 077011. [Google Scholar] [CrossRef]
  12. Lee, J.; Li, Y.; Bunarto, D.; Lee, E.; Wang, O.H.; Rodriguez, A.; Zhao, Y.; Tian, Y.; Froehlich, J.E. Towards AI-Powered AR for Enhancing Sports Playability for People with Low Vision: An Exploration of ARSports. In Proceedings of the 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Bellevue, WA, USA, 21–25 October 2024; pp. 228–233. [Google Scholar]
  13. Kumar, B.D.B.; Sai, B.U.; Karthik, S.S. AI-Integrated Smart Glasses for Enhancing Reading and Guidance Independence for the Visually Impaired. J. Trends Comput. Sci. Smart Technol. 2024, 6, 235–247. [Google Scholar]
  14. Song, Y.; Li, Z.; Li, G.; Wang, B.; Zhu, M.; Shi, P. Multi-sensory visual-auditory fusion of wearable navigation assistance for people with impaired vision. IEEE Trans. Autom. Sci. Eng. 2023. [Google Scholar] [CrossRef]
  15. Koutny, R. Exploring space: User interfaces for blind and visually impaired people for spatial and non-verbal information. In Proceedings of the International Conference on Computers Helping People with Special Needs, Linz, Austria, 8–12 July 2024; pp. 267–274. [Google Scholar]
  16. Duanmu, D.; Xu, T.; Li, X.; Cao, X.; Huang, W.; Hu, Y. Perceptual Feedback through Multisensory Fusion in Hand Function Rehabilitation by A Machine Learning Approach. In Proceedings of the 2024 IEEE International Conference on Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA), Xi’an, China, 14–16 June 2024; pp. 1–5. [Google Scholar]
  17. Ghose, S.; Roy, K.; Prevost, J.J. SoundEYE: AI-Driven Sensory Augmentation System for Visually Impaired Individuals through Natural Sound. In Proceedings of the 2024 19th Annual System of Systems Engineering Conference (SoSE), Tacoma, WA, USA, 23–26 June 2024; pp. 147–152. [Google Scholar]
  18. ST, A.; Amarnath, R.N.; Gopi, B.; Selvakumar, R.; Ganesh, E.; Sujatha, S. IoT-Embedded Smart Clothing with CNN for Improved Spatial Awareness in the Visually Impaired. In Proceedings of the 2024 Second International Conference on Intelligent Cyber Physical Systems and Internet of Things (ICoICI), Coimbatore, India, 28–30 August 2024; pp. 449–454. [Google Scholar]
  19. Agrimi, E.; Battaglini, C.; Bottari, D.; Gnecco, G.; Leporini, B. Game accessibility for visually impaired people: A review. Soft Comput. 2024, 28, 10475–10489. [Google Scholar] [CrossRef]
  20. Bach-y-Rita, P.; Kercel, S.W. Sensory substitution and the human-machine interface. Trends Cogn. Sci. 2003, 7, 541–546. [Google Scholar] [CrossRef] [PubMed]
  21. Bark, K.; Wheeler, J.; Shull, P.; Savall, J.; Cutkosky, M. Rotational Skin Stretch Feedback: A Wearable Haptic Display for Motion. IEEE Trans. Haptics 2010, 3, 166–176. [Google Scholar] [CrossRef]
  22. Fakhri, B.; Panchanathan, S. Haptics for Sensory Substitution; Springer International Publishing AG: Cham, Switzerland, 2020; pp. 89–115. [Google Scholar]
  23. Boljanić, T.; Baljić, M.; Kostić, M.; Barralon, P.; Došen, S.; Štrbac, M. Psychometric evaluation of high-resolution electrotactile interface for conveying 3D spatial information. Sci. Rep. 2024, 14, 19969. [Google Scholar] [CrossRef]
  24. Caulfield, M.; Forsyth, J.; Deportes, L.; Castaneda, D. Braille Learning using Haptic Feedback. In Proceedings of the 2024 Systems and Information Engineering Design Symposium (SIEDS), Charlottesville, VA, USA, 3 May 2024; pp. 460–465. [Google Scholar]
  25. Figueroa-Hernandez, A.G.; Perdomo-Vasquez, C.; Gomez-Escobar, D.; Galvis-Pedraza, H.; Medina-Castañeda, J.F.; González-Vargas, A.M. Haptic Interface for Remote Guidance of People with Visual Disabilities. In Proceedings of the Latin American Conference on Biomedical Engineering, Florianópolis, Brazil, 24–28 October 2022; pp. 681–689. [Google Scholar]
  26. Fei, F.; Xian, S.F.; Yang, R.N.; Wu, C.C.; Lu, X. A Wearable Visually Impaired Assistive System Based on Semantic Vision SLAM for Grasping Operation. Sensors 2024, 24, 3593. [Google Scholar] [CrossRef]
  27. Skulimowski, P.; Strumiłło, P.; Trygar, S.; Trygar, W. Haptic Display of Depth Images in an Electronic Travel Aid for the Blind: Technical Indoor Trials. In Proceedings of the Polish Conference on Biocybernetics and Biomedical Engineering, Lodz, Poland, 27–29 September 2023; pp. 443–453. [Google Scholar]
  28. Sultania, C.; Singhal, D.; Kabra, M.; Madurwar, A.; Pawar, S.; Rao, M. Wearable Haptic Braille Device for Enhancing Classroom Learning. In Proceedings of the IEEE Sensors Conference, Vienna, Austria, 29 October–1 November 2023. [Google Scholar]
  29. Lu, L. Learning Music Blind: Understanding the Application of Technology to Support BLV Music Learning. In Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS), Athens, Greece, 23–26 October 2022. [Google Scholar]
  30. Khan, A.; Khusro, S. An insight into smartphone-based assistive solutions for visually impaired and blind people: Issues, challenges and opportunities. Univers. Access Inf. Soc. 2021, 20, 265–298. [Google Scholar] [CrossRef]
  31. Shin, K.; McConville, R.; Metatla, O.; Chang, M.; Han, C.; Lee, J.; Roudaut, A. Outdoor localization using BLE RSSI and accessible pedestrian signals for the visually impaired at intersections. Sensors 2022, 22, 371. [Google Scholar] [CrossRef]
  32. Kayhan, O.; Samur, E. A Wearable Haptic Guidance System Based on Skin Stretch around the Waist for Visually-Impaired Runners. In Proceedings of the IEEE Haptics Symposium (HAPTICS), Santa Barbara, CA, USA, 21–24 March 2022. [Google Scholar]
  33. Zhang, X.; Pan, Z.; Song, Z.; Zhang, Y.; Li, W.; Ding, S. The aerial guide dog: A low-cognitive-load indoor electronic travel aid for visually impaired individuals. Sensors 2024, 24, 297. [Google Scholar] [CrossRef]
  34. Tan, H.; Chen, C.; Luo, X.; Zhang, J.; Seibold, C.; Yang, K.; Stiefelhagen, R. Flying guide dog: Walkable path discovery for the visually impaired utilizing drones and transformer-based semantic segmentation. In Proceedings of the 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 27–31 December 2021; pp. 1123–1128. [Google Scholar]
  35. He, L.; Wang, R.L.; Xu, X.H. PneuFetch: Supporting Blind and Visually Impaired People to Fetch Nearby Objects via Light Haptic Cues. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI), Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
  36. Chase, E.D.Z.; Siu, A.F.; Boadi-Agyemang, A.; Kim, G.S.H.; Gonzalez, E.J.; Follmer, S. PantoGuide: A Haptic and Audio Guidance System To Support Tactile Graphics Exploration. In Proceedings of the 22nd International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS), Virtual Event, 26–28 October 2020. [Google Scholar]
  37. Gandhi, P.; Chauhan, A. Sensory Vision Substitution using Tactile Stimulation. In Proceedings of the 6th International Conference for Convergence in Technology (I2CT), Pune, India, 2–4 April 2021. [Google Scholar]
  38. Alsuhibany, S.A. Vibration-Based Pattern Password Approach for Visually Impaired People. Comput. Syst. Sci. Eng. 2022, 40, 341–356. [Google Scholar] [CrossRef]
  39. Kamalraj, R.; Vasant, M.; Akoparna, B.; Haris, P.; Subiksha, J. Human-Centric Design and Machine Learning Integration in Smart Footwear for Visually Impaired Individuals. Int. J. Adv. Eng. Manag. 2024, 6, 581–587. [Google Scholar]
  40. Chouvardas, V.G.; Miliou, A.N.; Hatalis, M.K. Tactile displays: Overview and recent advances. Displays 2008, 29, 185–194. [Google Scholar] [CrossRef]
  41. Liu, T.; Fazli, P.; Jeong, H. Artificial Intelligence in Virtual Reality for Blind and Low Vision Individuals: Literature Review. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2024, 10711813241266832. [Google Scholar] [CrossRef]
  42. Barontini, F.; Catalano, M.G.; Pallottino, L.; Leporini, B.; Bianchi, M. Integrating wearable haptics and obstacle avoidance for the visually impaired in indoor navigation: A user-centered approach. IEEE Trans. Haptics 2020, 14, 109–122. [Google Scholar] [CrossRef] [PubMed]
  43. Ahmed, T. Towards the Design of Wearable Assistive Technologies to Address the Privacy and Security Concerns of People with Visual Impairments. Ph.D. Thesis, Indiana University, Bloomington, IN, USA, 2019. [Google Scholar]
  44. Okolo, G.I.; Althobaiti, T.; Ramzan, N. Assistive systems for visually impaired persons: Challenges and opportunities for navigation assistance. Sensors 2024, 24, 3572. [Google Scholar] [CrossRef] [PubMed]
  45. Ling, D.K.X. A Finger-Mounted Obstacle Detector for People with Visual Impairment. Master’s Thesis, Swinburne, Melbourne, Australia, 2018. [Google Scholar]
  46. Ford, M.J.; Ohm, Y.; Chin, K.; Majidi, C. Composites of functional polymers: Toward physical intelligence using flexible and soft materials. J. Mater. Res. 2022, 37, 2–24. [Google Scholar] [CrossRef]
  47. Shull, P.B.; Damian, D.D. Haptic wearables as sensory replacement, sensory augmentation and trainer—A review. J. Neuroeng. Rehabil. 2015, 12, 59. [Google Scholar] [CrossRef]
  48. Fernando, S.; Ndukwe, C.; Virdee, B.; Djemai, R. Image recognition tools for blind and visually impaired users: An emphasis on the design considerations. ACM Trans. Access. Comput. 2024. [Google Scholar] [CrossRef]
  49. Li, G.; Xu, J.; Li, Z.; Chen, C.; Kan, Z. Sensing and navigation of wearable assistance cognitive systems for the visually impaired. IEEE Trans. Cogn. Dev. Syst. 2022, 15, 122–133. [Google Scholar] [CrossRef]
  50. Scalvini, F.; Bordeau, C.; Ambard, M.; Migniot, C.; Dubois, J. Outdoor Navigation Assistive System Based on Robust and Real-Time Visual-Auditory Substitution Approach. Sensors 2024, 24, 166. [Google Scholar] [CrossRef]
  51. Sabarika, M.; Santhoshkumar, R.; Dharson, R.; Jayamani, S. Assistive Voice Guidance System for Blind Individuals using Deep Learning Techniques. In Proceedings of the 2024 2nd International Conference on Artificial Intelligence and Machine Learning Applications Theme: Healthcare and Internet of Things (AIMLA), Namakkal, India, 15–16 March 2024; pp. 1–4. [Google Scholar]
  52. Jayakumar, D. Voice Assisted Facial Emotion Recognition System for Blind Peoples with Tensorflow Model. In Proceedings of the 2024 IEEE International Students’ Conference on Electrical, Electronics and Computer Science (SCEECS), Bhopal, India, 24–25 February 2024; pp. 1–4. [Google Scholar]
  53. Zhao, X. Hearing the World: A Pilot Study Design on Spatial Audio for the Visually Impaired. In Proceedings of the 27th International Academic Mindtrek Conference, Tampere, Finland, 8–11 October 2024; pp. 244–248. [Google Scholar]
  54. Rao, S.U.; Ranganath, S.; Ashwin, T.; Reddy, G.R.M. A Google glass based real-time scene analysis for the visually impaired. IEEE Access 2021, 9, 166351–166369. [Google Scholar]
  55. Gupta, M.; Singh, M.; Chauhan, N.; Chauhan, A.S. A Novel Approach for Complete Aid to Blinds Using Voice Assisted Smart Glasses. In Proceedings of the 2023 International Conference on Sustainable Emerging Innovations in Engineering and Technology (ICSEIET), Ghaziabad, India, 14–15 September 2023; pp. 365–369. [Google Scholar]
  56. Bastola, A.; Gluck, A.; Brinkley, J. Feedback mechanism for blind and visually impaired: A review. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2023, 67, 1748–1754. [Google Scholar] [CrossRef]
  57. Galimberti, G. Auditory Feedback to compensate audible instructions to support people with visual impairment. In Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility, Online, 18–22 October 2021; pp. 1–3. [Google Scholar]
  58. van der Bie, J.; Ben Allouch, S.; Jaschinski, C. Communicating multimodal wayfinding messages for visually impaired people via wearables. In Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, Taipei, Taiwan, 1–4 October 2019; pp. 1–7. [Google Scholar]
  59. Ghodrat, S.; Sandhir, P.; Huisman, G. Exploring shape memory alloys in haptic wearables for visually impaired people. Front. Comput. Sci. 2023, 5, 1012565. [Google Scholar] [CrossRef]
  60. Yin, J.; Hinchet, R.; Shea, H.; Majidi, C. Wearable soft technologies for haptic sensing and feedback. Adv. Funct. Mater. 2021, 31, 2007428. [Google Scholar] [CrossRef]
  61. Franken, M. Smart Memory Alloy Actuated Slave System for Medical Robotics, with Haptic Feedback. DCT Report 2003. Master’s Thesis, Eindhoven University of Technology, Eindhoven, The Netherlands, 2003. [Google Scholar]
  62. Foo, W.Y.E. Dynamic Compression for Novel Haptic Interactions. Ph.D. Thesis, University of Minnesota, Minneapolis, MN, USA, 2020. [Google Scholar]
  63. Shen, J.J.; Chen, Y.W.; Sawada, H. A Wearable Assistive Device for Blind Pedestrians Using Real-Time Object Detection and Tactile Presentation. Sensors 2022, 22, 4537. [Google Scholar] [CrossRef]
  64. Chen, Y.; Shen, J.; Sawada, H. A wearable assistive system for the visually impaired using object detection, distance measurement and tactile presentation. Intell. Robot. 2023, 3, 420–435. [Google Scholar] [CrossRef]
  65. Peng, C.; Chen, Y.; Yang, B.; Jiang, Z.; Liu, Y.; Liu, Z.; Zhou, L.; Tang, L. Recent Advances of Soft Actuators in Smart Wearable Electronic-Textile. Adv. Mater. Technol. 2024, 9, 2400079. [Google Scholar] [CrossRef]
  66. Bae, J.-H.; Chang, S.-H. PVDF-based ferroelectric polymers and dielectric elastomers for sensor and actuator applications: A review. Funct. Compos. Struct. 2019, 1, 012003. [Google Scholar] [CrossRef]
  67. Torras, N.; Zinoviev, K.; Camargo, C.; Campo, E.M.; Campanella, H.; Esteve, J.; Marshall, J.; Terentjev, E.; Omastová, M.; Krupa, I. Tactile device based on opto-mechanical actuation of liquid crystal elastomers. Sens. Actuators A Phys. 2014, 208, 104–112. [Google Scholar] [CrossRef]
  68. Cai, Z. Wearable Pressure Sensors and Their Applications. In Proceedings of the 2024 IEEE 2nd International Conference on Image Processing and Computer Applications (ICIPCA), Shenyang, China, 28–30 June 2024; pp. 446–449. [Google Scholar]
  69. Liu, Y.H.; Wang, J.J.; Wang, H.Z.; Liu, S.; Wu, Y.C.; Hu, S.G.; Yu, Q.; Liu, Z.; Chen, T.P.; Yin, Y.; et al. Braille recognition by E-skin system based on binary memristive neural network. Sci. Rep. 2023, 13, 5437. [Google Scholar] [CrossRef]
  70. Kim, M.; Shin, H.; Jekal, M. Braille glove design toward interdisciplinary approach for visually impaired people: Developing independent sensing design with MXene and embroidery. Fash. Text. 2024, 11, 18. [Google Scholar] [CrossRef]
  71. Li, Z.Y.; Liu, Z.; Xu, S.M.; Zhang, K.J.; Zhao, D.Z.; Pi, Y.C.; Guan, X.; Peng, Z.C.; Zhong, Q.Z.; Zhong, J.W. Electrostatic Smart Textiles for Braille-To-Speech Translation. Adv. Mater. 2024, 10, 2313518. [Google Scholar] [CrossRef] [PubMed]
  72. Lou, M.N.; Abdalla, I.; Zhu, M.M.; Wei, X.D.; Yu, J.Y.; Li, Z.L.; Ding, B. Highly Wearable, Breathable, and Washable Sensing Textile for Human Motion and Pulse Monitoring. ACS Appl. Mater. Interfaces 2020, 12, 19965–19973. [Google Scholar] [CrossRef]
  73. Arbaud, R.; Najafi, M.; Gandarias, J.M.; Lorenzini, M.; Paul, U.C.; Zych, A.; Athanassiou, A.; Cataldi, P.; Ajoudani, A. Toward Sustainable Haptics: A Wearable Vibrotactile Solar-Powered System with Biodegradable Components. Adv. Mater. Technol. 2024, 9, 2301265. [Google Scholar] [CrossRef]
  74. Kelly, N.; Gilbert, S. The WEAR scale: Developing a measure of the social acceptability of a wearable device. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 2864–2871. [Google Scholar]
  75. Denning, T.; Dehlawi, Z.; Kohno, T. In situ with bystanders of augmented reality glasses: Perspectives on recording and privacy-mediating technologies. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 2377–2386. [Google Scholar]
  76. Profita, H.; Albaghli, R.; Findlater, L.; Jaeger, P.; Kane, S.K. The AT Effect: How Disability Affects the Perceived Social Acceptability of Head-Mounted Display Use. In Proceedings of the 34th Annual CHI Conference on Human Factors in Computing Systems (CHI4GOOD), San Jose, CA, USA, 7–12 May 2016; pp. 4884–4895. [Google Scholar]
  77. Chang, H.J.; Yurchisin, J.; Hodges, N.; Watchravesringkan, K.; Ackerman, T. An investigation of self-concept, clothing selection motivation, and life satisfaction among disabled consumers. Fam. Consum. Sci. Res. J. 2013, 42, 162–176. [Google Scholar] [CrossRef]
  78. Liu, G.; Ding, X.; Yu, C.; Gao, L.; Chi, X.; Shi, Y. “I Bought This for Me to Look More Ordinary” A Study of Blind People Doing Online Shopping. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–11. [Google Scholar]
  79. Wickens, C.D.; Gordon, S.E.; Liu, Y.; Lee, J. An Introduction to Human Factors Engineering; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2004; Volume 2. [Google Scholar]
  80. Căilean, A.-M.; Avătămăniței, S.-A.; Beguni, C. Design and Experimental Evaluation of a Visible Light Communications-Based Smart Backpack for Visually Impaired Persons’ Assistance. In Proceedings of the 2023 31st Telecommunications Forum (TELFOR), Belgrade, Serbia, 21–22 November 2023; pp. 1–4. [Google Scholar]
  81. Aziz, N.; Khalid, A.; Mutalib, A.A. Evaluation of Cohesive Affective Design Model for People with Visual Challenges Through Expert Review Method. TEM J. 2024, 13, 1432–1442. [Google Scholar] [CrossRef]
  82. Ortiz-Escobar, L.M.; Chavarria, M.A.; Schönenberger, K.; Hurst, S.; Stein, M.A.; Mugeere, A.; Rivas Velarde, M. Assessing the implementation of user-centred design standards on assistive technology for persons with visual impairments: A systematic review. Front. Rehabil. Sci. 2023, 4, 1238158. [Google Scholar] [CrossRef]
  83. Tapu, R.; Mocanu, B.; Zaharia, T. Wearable assistive devices for visually impaired: A state of the art survey. Pattern Recognit. Lett. 2020, 137, 37–52. [Google Scholar] [CrossRef]
  84. Sodnik, J.; Jakus, G.; Tomažič, S. Multiple spatial sounds in hierarchical menu navigation for visually impaired computer users. Int. J. Hum.-Comput. Stud. 2011, 69, 100–112. [Google Scholar] [CrossRef]
  85. Park, J.; Lee, Y.; Cho, S.; Choe, A.; Yeom, J.; Ro, Y.G.; Kim, J.; Kang, D.-H.; Lee, S.; Ko, H. Soft Sensors and Actuators for Wearable Human–Machine Interfaces. Chem. Rev. 2024, 124, 1464–1534. [Google Scholar] [CrossRef]
  86. Dos Santos, A.D.P.; Suzuki, A.H.G.; Medola, F.O.; Vaezipour, A. A systematic review of wearable devices for orientation and mobility of adults with visual impairment and blindness. IEEE Access 2021, 9, 162306–162324. [Google Scholar] [CrossRef]
  87. Maćkowski, M.; Brzoza, P.; Kawulok, M.; Meisel, R.; Spinczyk, D. Multimodal presentation of interactive audio-tactile graphics supporting the perception of visual information by blind people. ACM Trans. Multimed. Comput. Commun. Appl. 2023, 19, 1–22. [Google Scholar] [CrossRef]
  88. Stock, A.-K.; Gohil, K.; Huster, R.J.; Beste, C. On the effects of multimodal information integration in multitasking. Sci. Rep. 2017, 7, 4927. [Google Scholar] [CrossRef] [PubMed]
  89. OhnBar, E.; Kitani, K.; Asakawa, C. Personalized dynamics models for adaptive assistive navigation systems. In Proceedings of the Conference on Robot Learning, Zürich, Switzerland, 29–31 October 2018; pp. 16–39. [Google Scholar]
  90. Lee, J.; Kim, D.; Ryoo, H.-Y.; Shin, B.-S. Sustainable wearables: Wearable technology for enhancing the quality of human life. Sustainability 2016, 8, 466. [Google Scholar] [CrossRef]
  91. Lim, H.R.; Kim, H.S.; Qazi, R.; Kwon, Y.T.; Jeong, J.W.; Yeo, W.H. Advanced soft materials, sensor integrations, and applications of wearable flexible hybrid electronics in healthcare, energy, and environment. Adv. Mater. 2020, 32, 1901924. [Google Scholar] [CrossRef]
  92. Shi, J.; Liu, S.; Zhang, L.; Yang, B.; Shu, L.; Yang, Y.; Ren, M.; Wang, Y.; Chen, J.; Chen, W. Smart textile-integrated microelectronic systems for wearable applications. Adv. Mater. 2020, 32, 1901958. [Google Scholar] [CrossRef]
  93. Chortos, A.; Liu, J.; Bao, Z. Pursuing prosthetic electronic skin. Nat. Mater. 2016, 15, 937–950. [Google Scholar] [CrossRef]
  94. Chen, W.; Lin, J.; Ye, Z.; Wang, X.; Shen, J.; Wang, B. Customized surface adhesive and wettability properties of conformal electronic devices. Mater. Horiz. 2024, 11, 6289–6325. [Google Scholar] [CrossRef]
  95. Lee, E.K.; Lee, M.Y.; Park, C.H.; Lee, H.R.; Oh, J.H. Toward environmentally robust organic electronics: Approaches and applications. Adv. Mater. 2017, 29, 1703638. [Google Scholar] [CrossRef]
Figure 1. Annual publications on wearable assistive devices for the visually impaired in WoS from 2014 to 2024.
Figure 2. Co-citation journal map of the studied publications from 2014 to 2024.
Figure 3. Co-authorship map of studied publications from 2014 to 2024.
Figure 4. Reference co-citation map of the studied publications from 2014 to 2024.
Figure 5. Keyword co-occurrence map of the studied publications from 2014 to 2024.
Figure 6. Keyword clustering map for the studied publications from 2014 to 2024.
Figure 7. Timeline map of the studied publications from 2014 to 2024.
Figure 8. Haptic interface for remote guidance of visually impaired people [25].
Figure 9. Outdoor positioning of the BLE RSSI on accessible pedestrian signals at intersections for individuals who are visually impaired [31].
Figure 10. Wearable haptic guidance system based on lumbar skin stretching [32].
Figure 11. Aerial Guide Dog [33] and flying guide dog [34].
Figure 12. Google-Glass-based real-time scene analysis for people who are visually impaired [54].
Figure 13. A wearable assistive device for pedestrians who are blind using real-time object detection and tactile presentation [63].
Figure 14. VLC-based assistive system for people who are visually impaired [80].
Table 1. Top 10 co-citation journals of the studied publications from 2014 to 2024.
Rank | Number of Articles Published | Centrality | Year | Journal
1 | 205 | 0.03 | 2014 | Lecture Notes in Computer Science
2 | 168 | 0.07 | 2014 | Sensors
3 | 127 | 0.05 | 2014 | IEEE Conference on Computer Vision and Pattern Recognition Proceedings
4 | 85 | 0.11 | 2014 | IEEE Transactions on Systems, Man, and Cybernetics: Systems
5 | 76 | 0.06 | 2014 | IEEE International Conference on Robotics and Automation
6 | 75 | 0.02 | 2014 | IEEE Transactions on Pattern Analysis and Machine Intelligence
7 | 74 | 0.02 | 2019 | IEEE Access
8 | 66 | 0.04 | 2015 | IEEE Transactions on Neural Systems and Rehabilitation Engineering
9 | 64 | 0.06 | 2014 | Journal of Visual Impairment and Blindness
10 | 63 | 0.05 | 2015 | Public Library of Science ONE
Table 2. Top 10 authors of the studied publications from 2014 to 2024.
Rank | Number of Articles Published | Author
1 | 20 | Kaiwei Wang
2 | 18 | Kailun Yang
3 | 14 | Ruiqi Cheng
4 | 10 | Jinqiang Bai
5 | 8 | Weijian Hu
6 | 5 | Ningbo Long
7 | 5 | Leah Findlater
8 | 5 | Liang-Bi Chen
9 | 5 | Wan-Jung Chang
10 | 4 | Lee Stearns
Table 3. Top 10 co-cited references of the studied publications from 2014 to 2024.
Rank | Freq | Centrality | Year | Title
1 | 35 | 0.02 | 2017 | Magnitude, Temporal Trends, and Projections of the Global Prevalence of Blindness and Distance and Near Vision Impairment: A Systematic Review and Meta-Analysis
2 | 33 | 0.16 | 2016 | Navigation Assistance for the Visually Impaired Using RGB-D Sensor With Range Expansion
3 | 28 | 0.05 | 2017 | Sensor-Based Assistive Devices for Visually Impaired People: Current Status, Challenges, and Future Directions
4 | 24 | 0.06 | 2017 | Enabling Independent Navigation for Visually Impaired People Through a Wearable Vision-Based Feedback System
5 | 20 | 0.11 | 2017 | Smart Guiding Glasses for Visually Impaired People in Indoor Environments
6 | 19 | 0.08 | 2018 | Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device
7 | 18 | 0.01 | 2018 | Unifying Terrain Awareness for the Visually Impaired Through Real-Time Semantic Segmentation
8 | 18 | 0.09 | 2016 | Expanding the Detection of Traversable Area With RealSense for the Visually Impaired
9 | 17 | 0.09 | 2021 | World Health Organization, 2021, Blindness and Vision Impairment
10 | 16 | 0.04 | 2019 | Vision-Based Mobile Indoor Assistive Navigation Aid for Blind People
Table 4. Top 15 keywords by frequency for the publications studied from 2014 to 2024.
Rank | Count | Centrality | Year | Keywords
1 | 151 | 0.3 | 2014 | visually impaired
2 | 73 | 0.21 | 2014 | computer science
3 | 70 | 0.39 | 2014 | assistive technology
4 | 48 | 0.21 | 2014 | instruments and instrumentation
5 | 31 | 0.1 | 2015 | computer vision
6 | 28 | 0.06 | 2015 | wearable device
7 | 26 | 0.08 | 2014 | obstacle detection
8 | 22 | 0.11 | 2015 | deep learning
9 | 21 | 0.09 | 2014 | object recognition
10 | 18 | 0.08 | 2014 | obstacle avoidance
11 | 17 | 0.14 | 2018 | materials science
12 | 17 | 0.09 | 2016 | ultrasonic sensor
13 | 15 | 0.09 | 2016 | assistive device
14 | 15 | 0.02 | 2014 | object detection
15 | 13 | 0.09 | 2016 | wearable system
Table 5. Top 15 keywords by centrality for the studied publications from 2014 to 2024.
Rank | Centrality | Count | Year | Keywords
1 | 0.39 | 70 | 2014 | assistive technology
2 | 0.3 | 151 | 2014 | visually impaired
3 | 0.21 | 73 | 2014 | computer science
4 | 0.21 | 48 | 2014 | instruments and instrumentation
5 | 0.14 | 17 | 2018 | materials science
6 | 0.11 | 22 | 2015 | deep learning
7 | 0.1 | 17 | 2015 | low vision
8 | 0.1 | 31 | 2015 | computer vision
9 | 0.09 | 21 | 2014 | object recognition
10 | 0.09 | 17 | 2016 | ultrasonic sensor
11 | 0.09 | 15 | 2016 | assistive device
12 | 0.09 | 13 | 2016 | wearable system
13 | 0.08 | 26 | 2014 | obstacle detection
14 | 0.08 | 18 | 2014 | obstacle avoidance
15 | 0.08 | 12 | 2015 | wearable technology
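The frequency and centrality figures reported in Tables 4 and 5 are standard keyword co-occurrence network metrics. As an illustration only (the review itself used CiteSpace, and the toy keyword lists below are hypothetical, not drawn from the studied corpus), the raw counts underlying such tables can be sketched with the standard library:

```python
from collections import Counter
from itertools import combinations

# Hypothetical per-paper author-keyword lists (not from the studied corpus).
papers = [
    ["visually impaired", "assistive technology", "computer vision"],
    ["visually impaired", "wearable device", "obstacle detection"],
    ["assistive technology", "wearable device", "deep learning"],
    ["visually impaired", "assistive technology", "wearable device"],
]

# Keyword frequency: the basis of the "Count" column in Table 4.
freq = Counter(kw for kws in papers for kw in kws)

# Co-occurrence edge weights: the links drawn in a map like Figure 5.
# Betweenness centrality (Table 5) is then computed on this weighted graph.
cooc = Counter()
for kws in papers:
    for pair in combinations(sorted(set(kws)), 2):
        cooc[pair] += 1

print(freq.most_common(3))
print(cooc.most_common(3))
```

In CiteSpace, betweenness centrality on this co-occurrence graph identifies keywords that bridge otherwise separate research clusters, which is why "assistive technology" can outrank the more frequent "visually impaired" in Table 5.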
Table 6. Classification of keyword clusters for the studied publications from 2014 to 2024.
Table 6. Classification of keyword clusters for the studied publications from 2014 to 2024.
Cluster ID | Size | Silhouette | Year | Top Terms (LSI)
0 | 46 | 0.796 | 2018 | computer science; object recognition; assistive technology; real-time systems; impaired people/smart glasses; obstacle avoidance; assistive devices; collision avoidance; legged locomotion
1 | 31 | 0.946 | 2017 | assistive technology; computer science; wearable computers; navigation device; wearable system/electronic travel aids; computer vision; visual impairment; patent analysis; vision impairment
2 | 25 | 0.892 | 2018 | wearable technology; sport assistance; augmented reality; impaired people; navigation aids/symptom monitoring; medical informatics; cross-section interview; health data; mhealth technology
3 | 23 | 0.921 | 2017 | assistive technology; computer science; wearable device; obstacle detection; electronic travel aids/visual impairment; information technology; daily living; visual odometry; stair detection
4 | 22 | 0.903 | 2017 | object detection; haar transformation; positional analysis; cmos camera; object recognition/visual impairment; convolutional neural; wearable assistive system; ground segmentation; three-dimensional displays
5 | 20 | 0.87 | 2017 | obstacle detection; wearable device; assistive device; Bluetooth module; navigation aid/computer science; mobility aid; blind assistive; feature fusion; multilayer gru
6 | 20 | 0.79 | 2018 | ultrasonic sensor; audio feedback; visual impairment; object recognition; smart system/object recognition; raspberry pi; computer science; blind people; navigation device
7 | 19 | 0.919 | 2018 | wearable system; computer vision; navigational aid; indoor navigation; radar detection/handicapped aids; radar detection; signal processing algorithm; millimeter wave radar; frequency-modulated continuous wave radar system
8 | 17 | 0.776 | 2018 | human-computer interaction; computer science; wearable computing; augmented vision; sensory augmentation/visual impairments; virtual reality; auditory feedback; assistive wearable devices; collaborative design
9 | 17 | 0.851 | 2022 | materials science; other topics; braille typing system; current signal; blindness assistance/fall detection; indoor navigation monitoring; anti-collision system; blind people support; non-contact triboelectric sensor
10 | 13 | 0.925 | 2019 | machine learning; wireless sensor networks; heuristic algorithms; sensing systems; assistive technology/positioning systems; visual impairment; user acceptance; vehicles opportunities; cognitive map
11 | 13 | 0.807 | 2019 | deep learning; computer science; object detection; mobile edge; wireless communication/assistive technology; crosswalk detection; pedestrian detection; crossing light detection; dead reckoning
12 | 11 | 0.893 | 2023 | deep learning; industrial robots; impedance measurement; surface impedance; terrain classification/legged locomotion; current measurement; humanoid robots; mono-filament issues; deep learning
13 | 5 | 1 | 2022 | bio-inspired navigation; visual localization; navigation assistive devices; artificial place cells; artificial grid cells; artificial head direction cells
14 | 4 | 1 | 2023 | viola jones algorithm; haar cascades; opencv; py-yolov5 algorithm; onnx model; deep neural networks; eigen faces; principal component analysis; azure maps; octa-polar segmentation; dimensional ratio; similarity; py-tesseract; paddle ocr
15 | 3 | 1 | 2017 | wvns; a combinatorial planner; aiming-tracking mechanism; autoregressive model; dynamic weighted a*
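The silhouette column in Table 6 measures cluster cohesion versus separation: each member scores (b − a)/max(a, b), where a is its mean distance to its own cluster and b its smallest mean distance to another cluster, so values near 1 (as for most clusters above) indicate well-separated clusters. The sketch below is a generic re-implementation of that formula on toy 1-D data, not CiteSpace's computation, which operates on cluster-membership similarity rather than numeric coordinates.

```python
def mean_silhouette(points, labels):
    """Mean silhouette over all points (1-D data, absolute-difference
    distance). Assumes every cluster has at least two members."""
    n = len(points)

    def dist(i, j):
        return abs(points[i] - points[j])

    scores = []
    for i in range(n):
        # a: mean distance to the other members of i's own cluster
        same = [j for j in range(n) if j != i and labels[j] == labels[i]]
        a = sum(dist(i, j) for j in same) / len(same)
        # b: smallest mean distance from i to any other cluster
        b = min(
            sum(dist(i, j) for j in range(n) if labels[j] == other)
            / sum(1 for j in range(n) if labels[j] == other)
            for other in set(labels) if other != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / n

# Two tight, well-separated clusters score close to 1; shuffling the
# labels across the gap drags the score down.
points = [0.0, 0.2, 0.4, 5.0, 5.2]
good = mean_silhouette(points, [0, 0, 0, 1, 1])
mixed = mean_silhouette(points, [0, 1, 0, 1, 0])
```

A common reading convention (which the paper's clusters satisfy) is that silhouette above 0.7 marks a highly credible cluster.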

Share and Cite

MDPI and ACS Style

Zhang, X.; Huang, X.; Ding, Y.; Long, L.; Li, W.; Xu, X. Advancements in Smart Wearable Mobility Aids for Visual Impairments: A Bibliometric Narrative Review. Sensors 2024, 24, 7986. https://doi.org/10.3390/s24247986
