
Applications of Smart Glasses in Applied Sciences: A Systematic Review

Department of Energy Resources Engineering, Pukyong National University, Busan 48513, Korea
Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(11), 4956;
Submission received: 18 April 2021 / Revised: 18 May 2021 / Accepted: 26 May 2021 / Published: 27 May 2021


The aim of this study is to review academic papers on the applications of smart glasses. Among 82 surveyed papers, 57 were selected through filtering. The papers were published from January 2014 to October 2020. Four research questions were set up using the systematic review method, and conclusions were drawn focusing on the research trends by year and application field; product and operating system; sensors depending on the application purpose; and data visualization, processing, and transfer methods. It was found that the most popular commercial smart glass products are Android-based Google products. In addition, smart glasses are most often used in the healthcare field, particularly for clinical and surgical assistance or for assisting mentally or physically disabled persons. For visual data acquisition and transfer, 90% of the studies used a camera sensor. Smart glasses have mainly been used to visualize data based on augmented reality rather than mixed reality. The results of this review indicate that research related to smart glasses is steadily increasing and that technological research into the development of smart glasses is being actively conducted.

1. Introduction

With the development of information and communication technology, various forms of wearable devices that can replace smartphones are receiving attention. Wearable devices are defined as “all devices capable of computing that can be worn on the body, including for applications that entail computing functions” [1]. Wearable devices can be used for diverse purposes through the installation of applications available on a mobile operating system (OS), thereby providing various functionalities in addition to those related to fashion and health.
Smart glasses are a type of wearable device that can be worn on the face, and they meet the original objective of enabling clearer vision in addition to functioning as a computer. Since Google released “Google Glass” in 2012, companies such as Sony, Microsoft, and Epson have launched their own smart glass products. Figure 1 shows an example of smart glasses, which provide the desired information to users through a display in the form of external glasses or binoculars. They support wireless communication technologies, such as Bluetooth and Wi-Fi, and can be used to search and share information in real time through an internet connection. In addition, by providing a location tracking function through the Global Positioning System (GPS), it is possible to develop various applications based on location information. As an interface for communication between smart glasses and users, a touch button or a natural language command processing method based on voice recognition is used. By using a camera mounted in front of the device, it is possible to acquire photographs or video data of the surrounding environment in real time [2].
In [3], the concept of virtual reality (VR) was introduced, and in [4], a head-mounted display (HMD) was first presented. Smart glasses are a technology based on optical HMDs (OHMDs), which place a transparent display at eye level, allowing the user to view the digital world and the physical world simultaneously [5]. Unlike smartphones and other wearable devices, smart glasses enable users to conduct tasks without extensive physical effort; for example, it is not necessary to use one's hands or shift one's gaze repeatedly to interact with the device [6]. In addition, smart glasses are used in a variety of ways through application development; examples include information visualization, image data analysis, data processing, navigation, information transmission and sharing, and risk detection and warning [7,8,9,10,11,12].
A number of studies have been conducted on smart glass technologies in various fields, and various review papers have evaluated the suitability of smart glasses for research purposes. Mitrasinovic et al. [13] surveyed relevant studies using specific keywords and search engines to assess the suitability of smart glasses in the medical field and analyzed and evaluated three specific products. Klinker et al. [14] focused on the technologies used in smart glasses applied to various services, and Hofmann et al. [15] focused on the literature dealing with the ethical issues of smart glasses. In addition, Lee and Hui [16] analyzed the products, technologies, sensors, and other factors considered in prior studies after intensively investigating the input methods employed in the literature on commercialized smart glasses. Klinker et al. [17] evaluated the suitability of using smart glasses in the medical field and explored the corresponding merits and demerits for healthcare professionals. However, applications of smart glasses across all fields of the applied sciences have yet to be reviewed systematically.
The purpose of this study is to examine the current state of research based on smart glasses since 2014 through secondary research, grasp the research trends of smart glasses, and present the direction of future research. A total of 57 research papers were analyzed with regard to their research trends by year and application fields; product and operating system; sensors depending on the application purpose; and data visualization, processing, and transfer methods.

2. Methods

A systematic literature review is a means to identify, evaluate, and interpret all available research relevant to a particular research question, topic area, or phenomenon of interest. It consists of five main stages. In the first stage, research questions (RQs) were constructed to examine the current state of research using smart glasses. In the second stage, a search method was designed for collecting articles that can provide appropriate answers to these RQs. In the third stage, three exclusion criteria were established to select studies suitable for this research. In the fourth stage, the abstracts of the selected papers were read to confirm whether each study could provide information suitable for the purpose of this study. Finally, the data were extracted, and the related information was classified and structured.
This study constructed four RQs as follows:
  • RQ1: What are the research trends of smart glasses by year and application fields?
  • RQ2: Which product and operating system are widely used in smart glass applications?
  • RQ3: Which sensors are mainly used depending on the purpose of application of the smart glasses?
  • RQ4: Which data visualization, processing, and transfer methods are widely used in such applications?
To understand the current state of research on the use of smart glasses, the RQs were set up to suit the purpose of this study. RQ1 addresses the status of research by year across seven fields: computer science, healthcare, education, industry, service, social science, and agriculture. RQ2 identifies the smart glass products and operating systems most commonly used in prior studies. Regardless of the research field, RQ3 analyzes the purpose of using smart glasses and classifies the predominantly used sensors by purpose. RQ4 categorizes the data visualization and processing methods and the communication technologies used.

2.1. Search Method

To cover the subject of this review without neglecting any research using smart glasses, we derived suitable search keywords. In this study, “smart glasses” and “head mounted display (HMD)” were selected as the search keywords, and three search engines, i.e., Web of Science, Scopus, and Google Scholar, were used to find documents containing these keywords in their titles and abstracts. Papers published between 1 January 2014 and 31 October 2020 focusing on new applications of smart glasses were obtained. A total of 82 candidate papers were secured using the search method described above.

2.2. Selection Criteria

Articles that met the conditions for review were selected using three filters. The first filter excluded papers that were difficult to secure or linguistically inaccessible: eight studies whose content could not be followed were ruled out (they could not be secured through library visits, files could not be obtained from the authors, or the papers were written in Chinese). The second filter excluded three research items whose full texts were unavailable for detailed examination. The third filter excluded 14 review-type papers covering multiple products or various studies. Applying these three filters yielded the 57 final papers (Figure 2).
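The three-filter selection process above can be sketched as a small pipeline. The records below are hypothetical placeholders constructed only to reproduce the reported counts (82 candidates; 8, 3, and 14 exclusions; 57 selected), not the actual dataset:

```python
# Hypothetical sketch of the three-stage paper-filtering pipeline.
# Counts mirror those reported in the text: 82 candidates -> 57 selected.

def filter_papers(papers):
    # Filter 1: drop papers that could not be secured or were
    # linguistically inaccessible (8 excluded).
    papers = [p for p in papers if not p["hard_to_secure"]]
    # Filter 2: drop papers whose full text was unavailable (3 excluded).
    papers = [p for p in papers if p["full_text_available"]]
    # Filter 3: drop review-type papers (14 excluded).
    papers = [p for p in papers if p["type"] != "review"]
    return papers

# Hypothetical candidate set with the reported exclusion counts.
candidates = (
    [{"hard_to_secure": True,  "full_text_available": True,  "type": "research"}] * 8
  + [{"hard_to_secure": False, "full_text_available": False, "type": "research"}] * 3
  + [{"hard_to_secure": False, "full_text_available": True,  "type": "review"}] * 14
  + [{"hard_to_secure": False, "full_text_available": True,  "type": "research"}] * 57
)

selected = filter_papers(candidates)
print(len(candidates), "->", len(selected))  # 82 -> 57
```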

3. Results

3.1. Smart Glasses Research Trend

In this study, the number of research papers was determined by year of publication to examine the research trends of smart glasses. Figure 3 shows the number of papers published from January 2014 to October 2020. Research related to smart glasses, which had been steadily increasing since 2014, declined in 2017. Following the release of Google Glass Enterprise Edition in 2017, the number of published papers rose sharply and continued to increase through 2020. Among the 57 papers, 43 were published after 2017, and 74% of the papers in the surveyed period appeared during its last 3 years.
To analyze the areas in which smart glasses are actively researched, the 57 filtered studies were classified into 7 categories: computer science, healthcare, education, industry, service, social science, and agriculture. During the first stage, we classified them based on seven fields of application, and in the second stage, computer science was classified into human–computer interaction (HCI), object recognition, and visualization, and healthcare was classified into clinical and surgical assistance, disability support, and therapy. In addition, education was divided into nursing, physical training, and physics, and industry was divided into maintenance and safety. Service was divided into culture and tourism and e-commerce, and social science was divided into information security, marketing, and psychology. Finally, agriculture was further categorized as work support (Figure 4).
Of the 57 studies, 21 (36%) belonged to healthcare, making it the most active of the 7 areas in the first classification stage. Computer science accounted for 11 studies and social science for 7. Industry had six studies and service four (Figure 5). In the second stage, the subfields in which smart glasses were employed and the corresponding ratios were analyzed (Figure 6).
The computer science field covers HCI [18,19,20,21,22,23], the interaction technology between humans and computers; object recognition [24,25], a camera-based technology for recognizing what the user sees; and visualization [8,26,27], which visually presents data to users (Table 1). Several research cases exist in the field of visualization. Riedlinger et al. [27] compared Google's Tango tablet with Microsoft's HoloLens smart glasses in the context of visualizing building information modeling data. Sixteen participants solved four tasks using the two devices, and user evaluations of interior design visualization and modeling data visualization were analyzed. The analysis found that most users preferred the tablet, which enables one screen to be shared among multiple people. Although the smart glasses provided additional hands-free functionality and stability, the tablet left a more positive impression on users because it avoided the feeling of being isolated in a virtual world. Figure 7 shows an augmented reality (AR) view of a space for socializing, prepared for the comparison between the tablet and smart glasses. Among the subfields of computer science, HCI had the highest proportion of studies at 55%, followed by visualization at 27% and object recognition at 18%. This detailed classification indicates that research and development of technologies facilitating interactions between smart glasses and users is being actively pursued to improve convenience and efficiency; in particular, interest in research on various interactions between users and devices appears to be high.
The field of healthcare is subdivided into clinical and surgical assistance [7,28,29,30,31,32,33,34,35], i.e., technologies that help medical institution workers using smart glasses; disability support technologies [10,36,37,38,39,40,41,42,43] assisting physically and mentally handicapped people; and therapy [44,45,46], which effectively supports the treatment of patients (Table 2). The following research case belongs to clinical and surgical assistance. van Doormaal et al. [29] evaluated the validity and accuracy of holographic neuronavigation (HN) using AR smart glasses. They programmed a neuronavigation system and evaluated its accuracy and feasibility in an operating room. The fiducial registration error of conventional neuronavigation (CN FRE) was measured in a plastic head model with marked points, and HN and CN FRE were measured in three patients (Figure 8). Although the accuracy of holographic navigation using commercially available smart glasses did not reach a clinically acceptable level in the measurement results, the authors judged that the accuracy can be improved and that AR neuronavigation has significant potential. Clinical and surgical assistance and disability support each accounted for 9 of the 21 healthcare papers (43%), and three studies were conducted in the therapy field. These results indicate that smart glasses are applied evenly across the healthcare field and stably assist users; however, because it is difficult to employ smart glasses for direct evaluation, further technological development is required.
The industry field includes technical research on maintenance [47,48], safety [11,49], and work support [50,51] (Table 3). In the safety field, the following study has been conducted: Baek et al. [11] developed a smart glass-based wearable personal proximity warning system (PWS) for the safety of pedestrians at construction and mining sites. The smart glasses receive a signal transmitted by a Bluetooth beacon attached to a heavy machine or vehicle and provide a visual warning to the wearer based on the signal strength (Figure 9). Regardless of the direction in which the pedestrian was looking, all warnings were issued normally at distances of 10 m or more. In addition, under the same conditions, the workload indices of 10 subjects were evaluated using a smartphone-based PWS and a smart glass-based PWS; subjects reported lower mental, temporal, and physical demand when using the smart glass-based PWS. The evaluation results showed that a PWS based on smart glasses can free both hands of pedestrians, improve work efficiency, and help enhance the safety of pedestrians at construction and mining sites. Beyond this study in the safety field, two studies each were conducted on maintenance and work support; detailed analysis of the industry field thus shows that research is not biased toward any of the three subfields.
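The signal-strength-based warning logic of the PWS described above can be approximated with the standard log-distance path-loss model. The calibration constants used here (a measured power of −59 dBm at 1 m and a path-loss exponent of 2.0) are illustrative assumptions, not values from the cited study:

```python
# Sketch of RSSI-based proximity warning, assuming a log-distance
# path-loss model: RSSI = P_1m - 10 * n * log10(d).

def estimate_distance(rssi_dbm, measured_power_dbm=-59.0, path_loss_exp=2.0):
    """Estimate beacon distance (m) from received signal strength."""
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def proximity_warning(rssi_dbm, warn_distance_m=10.0):
    """Return True if the estimated distance to the machine-mounted
    beacon falls within the warning radius."""
    return estimate_distance(rssi_dbm) <= warn_distance_m

# A strong signal (machine nearby) triggers a warning; a weak one does not.
print(proximity_warning(-65.0))  # ~2 m away -> True
print(proximity_warning(-85.0))  # ~20 m away -> False
```

In practice, RSSI is noisy, so a deployed system would smooth successive readings (e.g., a moving average) before thresholding; the sketch omits this for clarity.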
The service category was divided into culture and tourism [9,52,53], which enhances the appreciation of artwork through vivid presentation methods using smart glasses, and e-commerce [54], the purchasing of products over the internet (Table 4). Research has been conducted on museum visitors in the field of culture and tourism. To explore the use of smart glasses in museums, Marco et al. [53] designed a glassware prototype that can provide visitors with a new interpretation experience and tested its effects through field experiments. They implemented the prototype and tested the interaction between the glasses and users with 12 visitors of the robot gallery at the MIT Museum, who were observed and interviewed during the experiment. Analysis of the qualitative data collected through the interviews made clear that the use of smart glasses enables a better appreciation of art. Culture and tourism thus accounted for 75% of the research in the service field; research on smart glasses in e-commerce remains inactive by comparison, owing to security issues.
The education section was divided into nursing [55], for the education of medical workers using AR; physical training [56], such as cycling training; a general category [57] on the future informatization of experimental education; and physics [58,59,60,61], which teaches physical theories effectively using visual methods (Table 5). The following study was conducted in the field of physics. Strzys et al. [61] conducted experiments to improve the understanding of physical concepts by directly showing invisible physical quantities through AR-based learning (Figure 10). In a metal heat conduction experiment with 59 subjects, the temperature of the object was visualized in color in AR form directly on the actual object. A questionnaire survey found that this approach improved learners' understanding of basic physics concepts, indicating that complex experiments can be easily understood through AR. Four of the seven education studies corresponded to the physics field; the visual representation of otherwise unobservable physical phenomena was judged to be the most useful application of smart glasses for learning physics.
In social science, a study was conducted on user information security associated with the use of smart glasses [62], and marketing and psychology studies analyzed the factors that influence consumers' social perception of smart glasses and device selection [63,64,65,66,67,68]. In the field of social sciences (Table 6), six studies (86% of the seven papers) focused on marketing and psychology, indicating that consumer recognition of smart glasses is increasing as their commercialization progresses.
In the field of agriculture, research has been conducted to improve the performance of livestock farming operations [69]. Caria et al. [69] conducted experiments on how smart glasses can support knowledge work in an agricultural environment. Sixteen participants performed a test in milking parlors, reading, identifying, and grouping various types of content through smart glasses (Figure 11). A questionnaire was administered to evaluate the workload imposed by the device and its ease of use. Smart glasses were found to provide a positive opportunity for livestock management in terms of animal data consultation and information evaluation, and they can help improve human cognition and usefulness in agricultural fields. However, only a few studies have been conducted in this regard; therefore, more active research is needed compared with other fields.

3.2. Product and Operating System

A wide variety of smart glasses were used in the 57 studies. The smart glasses used in the research were categorized to identify the most used products (Figure 12). In most studies, a single product was used, whereas some studies compared multiple products. Google Glass was used the most, in 16 studies, followed by Microsoft HoloLens in 15 studies. Epson's Moverio series was used in eight studies, making it the third most popular among the commercialized products. Beyond commercially available smart glasses, there were four cases in which researchers built their own smart glasses to suit the research purpose. In addition, eight papers used smart glasses but did not include specific product information.
Table 7 presents the results of an analysis of the characteristics and technical levels of the top 4 most used products among the 12 smart glasses considered. The most widely used, Google Glass Enterprise Edition 2, was released with a focus on utility in industrial fields, such as aviation and medicine. Based on the Qualcomm Snapdragon XR1 platform, it focuses on maximizing battery performance and increasing artificial intelligence processing power while maintaining the shape of common glasses. This product uses a light pipe method to solve the distortion and resolution reduction caused by transmitting digital images in the optical see-through method. Sound is transmitted through bone conduction speakers and ear sets, and the device can be operated by voice and through a touch pad on the right side of the main body. It has been used to improve the training environment for new workers in industrial fields and to provide patient information in the medical field.
As the product name implies, Microsoft's HoloLens uses a waveguide-based holographic optical see-through method to present digital images. The optical see-through method suffers from problems such as a limited viewing angle; to overcome them, this product has a viewing angle more than twice as wide as that of its predecessor. To this end, a microelectromechanical system with scanning technology using lasers and mirrors was applied. This product achieves an MR environment that allows interactions similar to touching an actual object, based on a depth sensor equipped with artificial intelligence; it is applicable not only to industrial sites, such as construction and maintenance sites, but also to operating rooms.
Epson's Moverio BT-350 is a smart glasses device that presents digital images through a reflection method based on a waveguide optical see-through approach. Epson independently developed light-guide technology that transmits light generated by a micro-display through a 1-cm-thick waveguide to the eye via total internal reflection. Unlike Google Glass and HoloLens, the BT-350 is operated through a controller connected to the main unit by a wire, which can cause discomfort.
Vuzix initially sold the M100 to a small number of companies and customers and has since steadily launched the M300, M400, and M4000. Unlike previous products, this device can be mounted on the frame of glasses or attached to a headband. Like Google Glass, digital images are observed through a screen in front of either the right or left eye. The device recognizes hand and finger movements through motion sensors, which are used to control it. A waveguide made of plastic or glass guides the image between two surfaces such that it is projected onto the eye along the interior of the lens, and it is offered with two fields of view, i.e., 28° and 40°.
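The practical difference between the 28° and 40° fields of view quoted above can be illustrated with basic trigonometry. The 2 m virtual-image distance in this sketch is an arbitrary assumption chosen for comparison, not a Vuzix specification:

```python
import math

def apparent_width(fov_deg, distance_m):
    """Width (m) subtended by a display field of view of fov_deg
    at a virtual-image distance of distance_m."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg / 2.0))

# Compare the 28-degree and 40-degree fields of view at an assumed
# 2 m virtual-image distance.
for fov in (28.0, 40.0):
    print(f"{fov:.0f} deg -> {apparent_width(fov, 2.0):.2f} m")
```

At 2 m, the 40° optic spans roughly 1.46 m versus about 1.0 m for the 28° optic, i.e., about 46% more horizontal extent for the wider field of view.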
An OS is software that acts as an intermediary between users and hardware, managing system hardware and running applications. Like smartphones and laptops, smart glasses run an OS. The OSs of the smart glasses considered in this research were analyzed to understand the current state of OS use in smart glass research; the results show that the OS differs by smart glass product (Figure 13). Thirty studies (53%, more than half of the total) used the Android OS, and fifteen employed Windows. Six papers used smart glasses only as simple devices for information acquisition and recording and did not mention an OS, and seven further studies did not specify which OS was used. In one case, both the Android and Windows OSs were employed.

3.3. Sensors Depending on the Application Purpose

In this study, smart glasses were found to be used in various fields and for one or more purposes, which may differ even within a field. Therefore, the research was categorized by the purpose of using the smart glasses, independent of the field of use (Figure 14). Because the primary role of smart glasses is visual, most research aims to deliver information into the user's line of sight: 32 studies visualized information obtained from smart glasses, and 29 transmitted such information, together accounting for 65% of all purposes of use. In addition, nine studies aimed to notify users of dangers using acquired data, and eight aimed at information sharing.
Various sensors built into smart glasses were utilized depending on the research aim. Therefore, the sensors used in the research were classified, and the most frequently applied types were analyzed. Figure 15 shows the frequency of the sensors used as a percentage. Most of the studies (45) used camera sensors, which best suit the original role of smart glasses. Because smart glasses lack an input device such as a keyboard, data entry is inconvenient; a microphone was therefore used in eight studies to address the input problem. Sensors needed to compute motion (gyroscope, accelerometer, GPS, and motion sensors) were also used, and sound/audio and vibration sensors were used to convey information through sound and vibration.
Sensor use was further categorized by purpose (Figure 16). As the classification results show, a camera was used for every research purpose. Among the 32 studies aimed at visualization, 30 used camera sensors; studies aimed at visualization also utilized the widest range of other sensors, such as gyroscopes, accelerometers, eye trackers, and microphones, compared to studies with other purposes. In addition, 36 of the 39 studies aimed at information transmission used camera sensors to obtain and transmit information, together with multiple other sensors, such as GPS, motion sensors, and infrared devices.

3.4. Data Visualization, Processing, and Transfer Methods

Smart glasses visualize information using augmented reality (AR) or mixed reality (MR) methods. AR enhances reality and provides additional information based on the real world, showing a virtual image against the background of the real world. MR, in turn, is a technology that places virtual objects in a real space, combining the real-world overlay of AR with the constructed environments of virtual reality; in other words, it presents an environment in which added computer graphics can interact with the actual surroundings. Although MR resembles AR in being based on reality, it is differentiated by this interaction with reality [74,75]. Most smart glasses therefore use AR or MR technology to obtain or present information. According to the classification results, 52 of the 57 studies considered (91.2%) used AR technology, and only 2 used MR. Two further studies used neither AR nor MR, applying smart glasses only for data acquisition (Table 8).
The data used with smart glasses are varied, including visual, numerical, and positional data. The data-processing methods were classified as cloud, local, or none. In 23 of the studies (40%), the data were saved on the device and sent to the cloud without being processed directly. By contrast, in 47% of the studies, the acquired data were processed locally on the device; local processing was thus more common than cloud processing. When smart glasses were used only for display without data acquisition, no data processing occurred: five studies (9%) processed no data, and in two studies (4%) the exact method was unknown (Figure 17).
Wireless technology is required to transfer data and was therefore used in the studies that involved data transfer; the communication technology differed by product. Of the 57 studies, 35 used devices supporting Wi-Fi, and 26 used Bluetooth to transfer data. Two studies using independently produced devices processed data over a cable connection. In five cases, the data were processed on the device without being shared, and no communication technology was used.

4. Discussion

Systematic reviews were conducted to examine the research trends of smart glasses and analyze them based on the research aim. The answers to the RQs obtained through the analysis are summarized as follows:
  • RQ 1: Research related to smart glasses has been actively conducted since the launch of Google Glass Enterprise Edition in 2017. Smart glasses have been most researched in the healthcare field, focusing on clinical and surgical assistance for healthcare professionals or on assisting persons with mental or physical disabilities.
  • RQ 2: Google Glass products were used the most, followed by Microsoft’s HoloLens. Google Glass is an Android-based product, and HoloLens is a Windows-based product. Thus, Android OS is used the most, followed by Windows.
  • RQ 3: Most studies aimed to represent information visually using smart glasses, and transmitting data acquired through smart glasses was the next most frequent purpose. A camera was the most commonly used sensor.
  • RQ 4: Most of the acquired data appeared in the form of AR. The acquired data are mainly processed in the device, and Wi-Fi is the most frequently used data transmission method.
The advantages of smart glasses are summarized as follows:
  • Cognitive behavior through an optical see-through display allows users to better recognize the context of the information they receive, providing a more immersive experience than mobile devices [76,77].
  • Users can view digital images from the front without switching their line of sight to their smartphones.
  • Visual and head direction sensor technology that can confirm the direction of the user’s interest provides information based on the more elaborate head orientation and position, as well as a more immersive and personalized experience.
  • Owing to wireless interaction between the device and the user (e.g., voice recognition and air motion), both hands can be used freely without buttons or touchpads.
  • There is no need to remember difficult or complicated processes in the field, and it is possible to learn accurately and quickly through virtual videos provided in front of users, improving work efficiency [53].
  • Efficiency can be improved through these advantages.
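The hands-free, voice-driven interaction noted above ultimately reduces to mapping recognized utterances to device actions. A minimal Python sketch of such a command dispatcher follows; the command names and action labels are hypothetical, not taken from any surveyed system.

```python
def make_dispatcher(actions):
    """Build a dispatcher mapping recognized voice commands to callbacks.

    Unknown or unrecognized commands are ignored (return None) rather
    than raising, since speech recognizers frequently emit noise.
    """
    def dispatch(utterance):
        handler = actions.get(utterance.strip().lower())
        return handler() if handler else None
    return dispatch

# Hypothetical commands for a smart glasses app.
dispatch = make_dispatcher({
    "take photo": lambda: "camera.capture",
    "start recording": lambda: "camera.record",
    "navigate home": lambda: "nav.home",
})

print(dispatch("Take Photo"))  # camera.capture
```

On real devices, the `utterance` string would come from the platform speech recognizer rather than being typed, but the normalization-and-lookup step is the same.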
Although smart glasses are equipped with various components, such as sensors and batteries, they still fall short in terms of weight and design, and further studies are needed before widespread commercialization. For smart glasses to be commercialized, related laws and regulations concerning price and privacy invasion must also be discussed. Several limitations of smart glasses are summarized as follows:
  • Cameras and sensors such as GPS attached to smart glasses carry potential risks of privacy breaches.
  • Because sensors and batteries are added to improve functionality, effort is required to achieve a universally wearable smart glass design.
  • Appropriate interaction technology (e.g., speech recognition or mid-air gestures) must be chosen according to the user environment.
  • Smart glasses can be used for reading visual media and long-text articles, and could technically complement works of fiction, such as novels.
  • Technologies that adapt the existing content of mobile apps into dynamic forms suitable for AR must be continually developed so that smartphone apps can be used successfully on smart glasses [52].
  • Next-generation smart glass applications must adopt specific strategies for tailoring content to specific types of smart glasses.

5. Conclusions

There has been increasing interest in the use of smart glasses in everyday areas, such as daily life, education, and games. In response to these trends, there is also considerable interest in applying smart glasses to specialized fields such as industry, agriculture, and medicine. The effective use of smart glasses requires the development of customized technologies and product strategies, such as systems and applications tailored to smart glass-based services. To keep both hands free while achieving technical precision and accuracy without buttons, input methods must expand beyond voice commands to include mid-air gestures and finger taps detected by sensors built into the glasses.
In the future, more stable and effective use will become possible as problems are addressed through continuous technology development, including performance improvement, weight reduction, design and interface refinement, battery life extension, and security enhancements. In particular, introducing smart glasses at outdoor sites is expected to improve the productivity and safety of field operations.

Author Contributions

Conceptualization, Y.C.; methodology, Y.C.; software, D.K.; validation, D.K.; formal analysis, D.K.; investigation, Y.C.; resources, Y.C.; data curation, D.K.; writing—original draft preparation, D.K.; writing—review and editing, Y.C.; visualization, D.K.; supervision, Y.C.; project administration, Y.C.; funding acquisition, Y.C. Both authors have read and agreed to the published version of the manuscript.


Funding

This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korea Government’s Ministry of Trade, Industry and Energy (project no. 20206110100030).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.


References

  1. Kang, J.Y. Study on the Content Design for Wearable Device—Focus on User Centered Wearable Infotainment Design. J. Digit. Des. 2015, 15, 325–333. [Google Scholar]
  2. Michalski, R.S.; Carbonell, J.G.; Mitchell, T.M. Machine learning: An artificial intelligence approach. Artif. Intell. 1985, 25, 236–238. [Google Scholar] [CrossRef]
  3. Sutherland, I.E. The Ultimate Display. In Proceedings of the IFIP Congress, New York, NY, USA, 24–28 May 1965; pp. 506–508. [Google Scholar]
  4. Sutherland, I.E. A head-mounted three dimensional display. In Proceedings of the AFIPS 68, San Francisco, CA, USA, 9–11 December 1968. [Google Scholar]
  5. Kress, B.; Starner, T. A Review of Head-Mounted Displays (HMD) Technologies and Applications for Consumer Electronics. In Photonic Applications for Aerospace, Commercial, and Harsh Environments IV; International Society for Optics and Photonics: Bellingham, WA, USA, 2013; Volume 8720, p. 87200A. [Google Scholar]
  6. Due, B.L. The Future of Smart Glasses: An Essay about Challenges and Possibilities with Smart Glasses; Centre of Interaction Research and Communication Design, University of Copenhagen: København, Denmark, 2014; Volume 1, pp. 1–21. [Google Scholar]
  7. Seifabadi, R.; Li, M.; Long, D.; Xu, S.; Wood, B.J. Accuracy Study of Smartglasses/Smartphone AR Systems for Percutaneous Needle Interventions. In Proceedings of the SPIE Medical Imaging, Houston, TX, USA, 16 March 2020; Volume 11315. [Google Scholar] [CrossRef]
  8. Aiordǎchioae, A.; Vatavu, R.-D. Life-Tags: A Smartglasses-Based System for Recording and Abstracting Life with Tag Clouds. ACM Hum. Comput. Interact. 2019, 3, 1–22. [Google Scholar] [CrossRef]
  9. Han, D.-I.D.; Tom Dieck, M.C.; Jung, T. Augmented Reality Smart Glasses (ARSG) Visitor Adoption in Cultural Tourism. Leis. Stud. 2019, 38, 618–633. [Google Scholar] [CrossRef]
  10. Miller, A.; Malasig, J.; Castro, B.; Hanson, V.L.; Nicolau, H.; Brandão, A. The Use of Smart Glasses for Lecture Comprehension by Deaf and Hard of Hearing Students. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA ’17, Denver, CO, USA, 6–11 May 2017; pp. 1909–1915. [Google Scholar] [CrossRef] [Green Version]
  11. Baek, J.; Choi, Y. Smart Glasses-Based Personnel Proximity Warning System for Improving Pedestrian Safety in Construction and Mining Sites. Int. J. Environ. Res. Public Health 2020, 17, 1422. [Google Scholar] [CrossRef] [Green Version]
  12. Aiordachioae, A.; Schipor, O.-A.; Vatavu, R.-D. An Inventory of Voice Input Commands for Users with Visual Impairments and Assistive Smartglasses Applications. In Proceedings of the 2020 International Conference on Development and Application Systems (DAS), Suceava, Romania, 21–23 May 2020; pp. 146–150. [Google Scholar] [CrossRef]
  13. Mitrasinovic, S.; Camacho, E.; Trivedi, N.; Logan, J.; Campbell, C.; Zilinyi, R.; Lieber, B.; Bruce, E.; Taylor, B.; Martineau, D. Clinical and Surgical Applications of Smart Glasses. Technol. Health Care 2015, 23, 381–401. [Google Scholar] [CrossRef]
  14. Klinker, K.; Berkemeier, L.; Zobel, B.; Wüller, H.; Huck, V.; Wiesche, M.; Remmers, H.; Thomas, O.; Krcmar, H. Structure for Innovations: A Use Case Taxonomy for Smart Glasses in Service Processes. In Proceedings of the Multikonferenz Wirtschaftsinformatik 2018, Lüneburg, Germany, 6–9 March 2018; pp. 1599–1610. [Google Scholar]
  15. Hofmann, B.; Haustein, D.; Landeweerd, L. Smart-Glasses: Exposing and Elucidating the Ethical Issues. Sci. Eng. Ethics 2017, 23, 701–721. [Google Scholar] [CrossRef]
  16. Lee, L.-H.; Hui, P. Interaction Methods for Smart Glasses: A Survey. IEEE Access 2018, 6, 28712–28732. [Google Scholar] [CrossRef]
  17. Klinker, K.; Obermaier, J.; Wiesche, M. Conceptualizing passive trust: The case of smart glasses in healthcare. In Proceedings of the 27th European Conference on Information Systems (ECIS), Stockholm & Uppsala, Sweden, 8–14 June 2019; p. 12. [Google Scholar]
  18. Belkacem, I.; Pecci, I.; Martin, B.; Faiola, A. TEXTile: Eyes-Free Text Input on Smart Glasses Using Touch Enabled Textile on the Forearm. In Human-Computer Interaction—INTERACT 2019; Lamas, D., Loizides, F., Nacke, L., Petrie, H., Winckler, M., Zaphiris, P., Eds.; Computer Science, Lecture Notes; Springer International Publishing: Cham, Switzerland, 2019; pp. 351–371. [Google Scholar] [CrossRef]
  19. Zhang, L.; Li, X.-Y.; Huang, W.; Liu, K.; Zong, S.; Jian, X.; Feng, P.; Jung, T.; Liu, Y. It Starts with Igaze: Visual Attention Driven Networking with Smart Glasses. In Proceedings of the 20th Annual International Conference on Mobile Computing and Networking, Maui, HI, USA, 7–11 September 2014; pp. 91–102. [Google Scholar]
  20. Lee, L.H.; Yung Lam, K.; Yau, Y.P.; Braud, T.; Hui, P. HIBEY: Hide the Keyboard in Augmented Reality. In Proceedings of the 2019 IEEE International Conference on Pervasive Computing and Communications (PerCom), Kyoto, Japan, 11–15 March 2019. [Google Scholar] [CrossRef]
  21. Lee, L.H.; Braud, T.; Bijarbooneh, F.H.; Hui, P. Tipoint: Detecting Fingertip for Mid-Air Interaction on Computational Resource Constrained Smartglasses. In Proceedings of the 23rd ACM Annual International Symposium on Wearable Computers (ISWC2019), London, UK, 11–13 September 2019; pp. 118–122. [Google Scholar] [CrossRef]
  22. Lee, L.H.; Braud, T.; Lam, K.Y.; Yau, Y.P.; Hui, P. From Seen to Unseen: Designing Keyboard-Less Interfaces for Text Entry on the Constrained Screen Real Estate of Augmented Reality Headsets. Pervasive Mob. Comput. 2020, 64, 101148. [Google Scholar] [CrossRef]
  23. Lee, L.H.; Braud, T.; Bijarbooneh, F.H.; Hui, P. UbiPoint: Towards Non-Intrusive Mid-Air Interaction for Hardware Constrained Smart Glasses. In Proceedings of the 11th ACM Multimedia Systems Conference, Istanbul, Turkey, 8–11 June 2020; pp. 190–201. [Google Scholar] [CrossRef]
  24. Park, K.-B.; Kim, M.; Choi, S.H.; Lee, J.Y. Deep Learning-Based Smart Task Assistance in Wearable Augmented Reality. Robot. Comput. Integr. Manuf. 2020, 63, 101887. [Google Scholar] [CrossRef]
  25. Chauhan, J.; Asghar, H.J.; Mahanti, A.; Kaafar, M.A. Gesture-Based Continuous Authentication for Wearable Devices: The Smart Glasses Use Case. In Applied Cryptography and Network Security; Manulis, M., Sadeghi, A.-R., Schneider, S., Eds.; Computer Science, Lecture Notes; Springer International Publishing: Cham, Switzerland, 2016; pp. 648–665. [Google Scholar] [CrossRef]
  26. Pamparau, C.I. A System for Hierarchical Browsing of Mixed Reality Content in Smart Spaces. In Proceedings of the 2020 International Conference on Development and Application Systems (DAS), Suceava, Romania, 21–23 May 2020; pp. 194–197. [Google Scholar] [CrossRef]
  27. Riedlinger, U.; Oppermann, L.; Prinz, W. Tango vs. Hololens: A Comparison of Collaborative Indoor AR Visualisations Using Hand-Held and Hands-Free Devices. Multimodal Technol. Interact. 2019, 3, 23. [Google Scholar] [CrossRef] [Green Version]
  28. Yong, M.; Pauwels, J.; Kozak, F.K.; Chadha, N.K. Application of Augmented Reality to Surgical Practice: A Pilot Study Using the ODG R7 Smartglasses. Clin. Otolaryngol. 2020, 45, 130–134. [Google Scholar] [CrossRef]
  29. van Doormaal, T.P.C.; van Doormaal, J.A.M.; Mensink, T. Clinical Accuracy of Holographic Navigation Using Point-Based Registration on Augmented-Reality Glasses. Oper. Neurosurg. 2019, 17, 588–593. [Google Scholar] [CrossRef] [Green Version]
  30. Salisbury, J.P.; Keshav, N.U.; Sossong, A.D.; Sahin, N.T. Concussion Assessment with Smartglasses: Validation Study of Balance Measurement toward a Lightweight, Multimodal, Field-Ready Platform. JMIR mHealth uHealth 2018, 6, e15. [Google Scholar] [CrossRef]
  31. Maruyama, K.; Watanabe, E.; Kin, T.; Saito, K.; Kumakiri, A.; Noguchi, A.; Nagane, M.; Shiokawa, Y. Smart Glasses for Neurosurgical Navigation by Augmented Reality. Oper. Neurosurg. 2018, 15, 551–556. [Google Scholar] [CrossRef]
  32. Klueber, S.; Wolf, E.; Grundgeiger, T.; Brecknell, B.; Mohamed, I.; Sanderson, P. Supporting Multiple Patient Monitoring with Head-Worn Displays and Spearcons. Appl. Ergon. 2019, 78, 86–96. [Google Scholar] [CrossRef]
  33. García-Cruz, E.; Bretonnet, A.; Alcaraz, A. Testing Smart Glasses in Urology: Clinical and Surgical Potential Applications. Actas Urológicas Españolas 2018, 42, 207–211. [Google Scholar] [CrossRef]
  34. Ruminski, J.; Bujnowski, A.; Kocejko, T.; Andrushevich, A.; Biallas, M.; Kistler, R. The Data Exchange between Smart Glasses and Healthcare Information Systems Using the HL7 FHIR Standard. In Proceedings of the 2016 9th International Conference on Human System Interactions (HSI), Portsmouth, UK, 6–8 July 2016; pp. 525–531. [Google Scholar]
  35. Nag, A.; Haber, N.; Voss, C.; Tamura, S.; Daniels, J.; Ma, J.; Chiang, B.; Ramachandran, S.; Schwartz, J.; Winograd, T. Toward Continuous Social Phenotyping: Analyzing Gaze Patterns in an Emotion Recognition Task for Children with Autism through Wearable Smart Glasses. J. Med. Internet Res. 2020, 22, e13810. [Google Scholar] [CrossRef]
  36. Rowe, F. A Hazard Detection and Tracking System for People with Peripheral Vision Loss Using Smart Glasses and Augmented Reality. Int. J. Adv. Comput. Sci. Appl. 2019, 10, 1–9. [Google Scholar]
  37. Meza-de-Luna, M.E.; Terven, J.R.; Raducanu, B.; Salas, J. A Social-Aware Assistant to Support Individuals with Visual Impairments during Social Interaction: A Systematic Requirements Analysis. Int. J. Hum. Comput. Stud. 2019, 122, 50–60. [Google Scholar] [CrossRef]
  38. Schipor, O.; Aiordăchioae, A. Engineering Details of a Smartglasses Application for Users with Visual Impairments. In Proceedings of the 2020 International Conference on Development and Application Systems (DAS), Suceava, Romania, 21–23 May 2020; pp. 157–161. [Google Scholar] [CrossRef]
  39. Chang, W.; Chen, L.; Hsu, C.; Chen, J.; Yang, T.; Lin, C. MedGlasses: A Wearable Smart-Glasses-Based Drug Pill Recognition System Using Deep Learning for Visually Impaired Chronic Patients. IEEE Access 2020, 8, 17013–17024. [Google Scholar] [CrossRef]
  40. Lausegger, G.; Spitzer, M.; Ebner, M. OmniColor—A Smart Glasses App to Support Colorblind People. Int. J. Interact. Mob. Technol. 2017, 11, 161–177. [Google Scholar] [CrossRef] [Green Version]
  41. Janssen, S.; Bolte, B.; Nonnekes, J.; Bittner, M.; Bloem, B.R.; Heida, T.; Zhao, Y.; van Wezel, R.J.A. Usability of Three-Dimensional Augmented Visual Cues Delivered by Smart Glasses on (Freezing of) Gait in Parkinson’s Disease. Front. Neurol. 2017, 8, 279. [Google Scholar] [CrossRef] [Green Version]
  42. Sandnes, F.E. What Do Low-Vision Users Really Want from Smart Glasses? Faces, Text and Perhaps No Glasses at All. In Computers Helping People with Special Needs; Miesenberger, K., Bühler, C., Penaz, P., Eds.; Computer Science, Lecture Notes; Springer International Publishing: Cham, Switzerland, 2016; pp. 187–194. [Google Scholar] [CrossRef]
  43. Ruminski, J.; Smiatacz, M.; Bujnowski, A.; Andrushevich, A.; Biallas, M.; Kistler, R. Interactions with Recognized Patients Using Smart Glasses. In Proceedings of the 2015 8th International Conference on Human System Interaction (HSI), Warsaw, Poland, 25–27 June 2015; pp. 187–194. [Google Scholar]
  44. Machado, E.; Carrillo, I.; Saldana, D.; Chen, F.; Chen, L. An Assistive Augmented Reality-Based Smartglasses Solution for Individuals with Autism Spectrum Disorder. In Proceedings of the 2019 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech), Fukuoka, Japan, 5–8 August 2019; pp. 245–249. [Google Scholar]
  45. Liu, R.; Salisbury, J.P.; Vahabzadeh, A.; Sahin, N.T. Feasibility of an Autism-Focused Augmented Reality Smartglasses System for Social Communication and Behavioral Coaching. Front. Pediatr. 2017, 5, 145. [Google Scholar] [CrossRef] [Green Version]
  46. Vahabzadeh, A.; Keshav, N.U.; Abdus-Sabur, R.; Huey, K.; Liu, R.; Sahin, N.T. Improved Socio-Emotional and Behavioral Functioning in Students with Autism Following School-Based Smartglasses Intervention: Multi-Stage Feasibility and Controlled Efficacy Study. Behav. Sci. 2018, 8, 85. [Google Scholar] [CrossRef] [Green Version]
  47. Wolfartsberger, J.; Zenisek, J.; Wild, N. Data-Driven Maintenance: Combining Predictive Maintenance and Mixed Reality-Supported Remote Assistance. Procedia Manuf. 2020, 45, 307–312. [Google Scholar] [CrossRef]
  48. Siltanen, S.; Heinonen, H. Scalable and Responsive Information for Industrial Maintenance Work: Developing XR Support on Smart Glasses for Maintenance Technicians. In Proceedings of the 23rd International Conference on Academic Mindtrek, AcademicMindtrek ’20, Tampere, Finland, 29–30 January 2020; pp. 100–109. [Google Scholar] [CrossRef]
  49. Chang, W.-J.; Chen, L.-B.; Chiou, Y.-Z. Design and Implementation of a Drowsiness-Fatigue-Detection System Based on Wearable Smart Glasses to Increase Road Safety. IEEE Trans. Consum. Electron. 2018, 64, 461–469. [Google Scholar] [CrossRef]
  50. Kirks, T.; Jost, J.; Uhlott, T.; Püth, J.; Jakobs, M. Evaluation of the Application of Smart Glasses for Decentralized Control Systems in Logistics. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 4470–4476. [Google Scholar]
  51. Gensterblum, C. An Analysis of Smart Glasses and Commercial Drones’ Ability to Become Disruptive Technologies in the Construction Industry. Ph.D. Thesis, Kalamazoo College, Kalamazoo, MI, USA, 1 January 2020. [Google Scholar]
  52. tom Dieck, M.C.; Jung, T.; Han, D.I. Mapping Requirements for the Wearable Smart Glasses Augmented Reality Museum Application. J. Hosp. Tour. Technol. 2016, 7, 230–253. [Google Scholar] [CrossRef]
  53. Mason, M. The MIT Museum Glassware Prototype: Visitor Experience Exploration for Designing Smart Glasses. J. Comput. Cult. Herit. 2016, 9, 12:1–12:28. [Google Scholar] [CrossRef]
  54. Ho, C.C.; Tseng, B.-Y.; Ho, M.-C. Typingless Ticketing Device Input by Automatic License Plate Recognition Smartglasses. ICIC Express Lett. Part B Appl. 2018, 9, 325–330. [Google Scholar]
  55. Kopetz, J.P.; Wessel, D.; Jochems, N. User-Centered Development of Smart Glasses Support for Skills Training in Nursing Education. i-com 2019, 18, 287–299. [Google Scholar] [CrossRef]
  56. Berkemeier, L.; Menzel, L.; Remark, F.; Thomas, O. Acceptance by Design: Towards an Acceptable Smart Glasses-Based Information System Based on the Example of Cycling Training. In Proceedings of the Multikonferenz Wirtschaftsinformatik, Lüneburg, Germany, 6–9 March 2018; p. 12. [Google Scholar]
  57. Cao, Y.; Tang, Y.; Xie, Y. A Novel Augmented Reality Guidance System for Future Informatization Experimental Teaching. In Proceedings of the 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), Wollongong, Australia, 4–7 December 2019; pp. 900–905. [Google Scholar] [CrossRef]
  58. Spitzer, M.; Nanic, I.; Ebner, M. Distance Learning and Assistance Using Smart Glasses. Educ. Sci. 2018, 8, 21. [Google Scholar] [CrossRef] [Green Version]
  59. Kapp, S.; Thees, M.; Strzys, M.P.; Beil, F.; Kuhn, J.; Amiraslanov, O.; Javaheri, H.; Lukowicz, P.; Lauer, F.; Rheinländer, C.; et al. Augmenting Kirchhoff’s Laws: Using Augmented Reality and Smartglasses to Enhance Conceptual Electrical Experiments for High School Students. Phys. Teach. 2018, 57, 52–53. [Google Scholar] [CrossRef]
  60. Thees, M.; Kapp, S.; Strzys, M.P.; Beil, F.; Lukowicz, P.; Kuhn, J. Effects of Augmented Reality on Learning and Cognitive Load in University Physics Laboratory Courses. Comput. Hum. Behav. 2020, 108, 106316. [Google Scholar] [CrossRef]
  61. Strzys, M.P.; Kapp, S.; Thees, M.; Klein, P.; Lukowicz, P.; Knierim, P.; Schmidt, A.; Kuhn, J. Physics Holo.Lab Learning Experience: Using Smartglasses for Augmented Reality Labwork to Foster the Concepts of Heat Conduction. Eur. J. Phys. 2018, 39, 035703. [Google Scholar] [CrossRef] [Green Version]
  62. Rauschnabel, P.A.; He, J.; Ro, Y.K. Antecedents to the Adoption of Augmented Reality Smart Glasses: A Closer Look at Privacy Risks. J. Bus. Res. 2018, 92, 374–384. [Google Scholar] [CrossRef]
  63. Adapa, A.; Nah, F.F.-H.; Hall, R.H.; Siau, K.; Smith, S.N. Factors Influencing the Adoption of Smart Wearable Devices. Int. J. Hum. Comput. Interact. 2018, 34, 399–409. [Google Scholar] [CrossRef]
  64. Rauschnabel, P.A.; Brem, A.; Ivens, B.S. Who Will Buy Smart Glasses? Empirical Results of Two Pre-Market-Entry Studies on the Role of Personality in Individual Awareness and Intended Adoption of Google Glass Wearables. Comput. Hum. Behav. 2015, 49, 635–647. [Google Scholar] [CrossRef]
  65. Rauschnabel, P.A.; Hein, D.W.E.; He, J.; Ro, Y.K.; Rawashdeh, S.; Krulikowski, B. Fashion or Technology? A Fashnology Perspective on the Perception and Adoption of Augmented Reality Smart Glasses. i-com 2016, 15, 179–194. [Google Scholar] [CrossRef]
  66. Rallapalli, S.; Ganesan, A.; Chintalapudi, K.; Padmanabhan, V.N.; Qiu, L. Enabling Physical Analytics in Retail Stores Using Smart Glasses. In Proceedings of the 20th Annual International Conference on Mobile Computing and Networking, Maui, HI, USA, 7–11 September 2014; pp. 115–126. [Google Scholar]
  67. Hoogsteen, K.M.P.; Osinga, S.A.; Steenbekkers, B.L.P.A.; Szpiro, S.F.A. Functionality versus Inconspicuousness: Attitudes of People with Low Vision towards OST Smart Glasses. In Proceedings of the 22nd International ACM SIGACCESS Conference on Computers and Accessibility, Virtual Event, Greece, 26–28 October 2020; pp. 1–4. [Google Scholar]
  68. Ok, A.E.; Basoglu, N.A.; Daim, T. Exploring the Design Factors of Smart Glasses. In Proceedings of the 2015 Portland International Conference on Management of Engineering and Technology (PICMET), Portland, OR, USA, 2–6 August 2015; pp. 1657–1664. [Google Scholar]
  69. Caria, M.; Todde, G.; Sara, G.; Piras, M.; Pazzona, A. Performance and Usability of Smartglasses for Augmented Reality in Precision Livestock Farming Operations. Appl. Sci. 2020, 10, 2318. [Google Scholar] [CrossRef] [Green Version]
  70. Vuzix. Available online: (accessed on 29 March 2021).
  71. Google Enterprise Edition Help. Available online: (accessed on 29 March 2021).
  72. Epson. Available online: (accessed on 29 March 2021).
  73. Microsoft. Available online: (accessed on 29 March 2021).
  74. Gintautas, V.; Hübler, A.W. Experimental Evidence for Mixed Reality States in an Interreality System. Phys. Rev. E 2007, 75, 057201. [Google Scholar] [CrossRef] [Green Version]
  75. Milgram, P.; Kishino, F. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, E77-D, 1321–1329. [Google Scholar]
  76. Kiyokawa, K. An Introduction to Head-Mounted Displays for Augmented Reality. In Emerging Technologies of Augmented Reality: Interfaces and Design; IGI Global: Hershey, PA, USA, 2007; pp. 43–63. [Google Scholar]
  77. Mason, M. The dimensions of the mobile visitor experience: Thinking beyond the technology design. Int. J. Incl. Mus. 2012, 5, 51–72. [Google Scholar] [CrossRef]
Figure 1. Example of smart glasses (Moverio BT-350, Epson, Suwa, Japan).
Figure 2. Flow chart of selection criteria.
Figure 3. Publications per year. Number of articles per year between January 2014 and October 2020.
Figure 4. Structure of applied research fields and subfields.
Figure 5. Number of studies by application field.
Figure 6. Detailed distribution of smart glass applications.
Figure 7. Location of user study: renovated office area with available building information modeling data regarding the building itself as well as the furniture. Top, real furniture after the renovation; bottom, augmented reality visualization during construction [27].
Figure 8. (A) MRI patient 1. Left-sided mesiotemporal epidermoid. (B) Hologram patient 1 including MRI [29].
Figure 9. Conceptual view of the personal proximity warning system (PWS) comprising a Bluetooth low energy (BLE) receiver unit and BLE transmitter units [11].
Figure 10. Experimental setup (rod with PVC insulation) and user wearing a HoloLens [61].
Figure 11. (a) Sheep information in text and graphic format and (b) a participant reading information regarding an animal on GlassUp F4 smart glasses (F4SG) while milking [69].
Figure 12. Frequency of use by smart glasses model.
Figure 13. Frequency of use of operating systems.
Figure 14. Number of studies based on research purpose.
Figure 15. Frequency and ratio of each sensor used in the study.
Figure 16. Use of sensors according to the purpose of using smart glasses.
Figure 17. Number and ratio of studies by data-processing method of smart glasses.
Table 1. Summary of smart glass applications in the field of computer science.
Sub-Field | References | Year | Aim of Study
Interaction (Human–Computer Interaction) | Belkacem et al. [18] | 2019 | Inputting text into smart glasses using a touch-enabled textile
  | Zhang et al. [19] | 2014 | Exploring the user’s visual interest through smart glasses
  | Lee et al. [20] | 2019 | Development of technology to enter text through mid-air fingertip detection without a keyboard
  | Lee et al. [21] | 2019 |
  | Lee et al. [22] | 2020 |
  | Lee et al. [23] | 2020 |
Object recognition | Park et al. [24] | 2020 | Verification of effective object detection and segmentation through deep learning-based wearable AR technology
  | Chauhan et al. [25] | 2016 | Verification of the biometric accuracy of smart glasses
Visualization | Pamparau [26] | 2020 | Development of technology to visualize digital content
  | Riedlinger et al. [27] | 2019 | Comparison of visualization on hands-free (smart glasses) versus hand-held (tablet) devices
  | Aiordǎchioae et al. [8] | 2019 | Forming tag clouds from tags extracted from images recorded with the smart glasses camera
Table 2. Summary of smart glass applications in the field of healthcare.
Sub-Field | References | Year | Aim of Study
Clinical and surgical assistance | Yong et al. [28] | 2020 | Enhancing the learning experience for trainees practicing microsurgery techniques
  | van Doormaal et al. [29] | 2019 | Determining the feasibility and accuracy of holographic neuronavigation (HN) using AR smart glasses
  | Salisbury et al. [30] | 2018 | Objective measurement of concussion-related disorders through smart glasses
  | Maruyama et al. [31] | 2018 | Assessing the accuracy of visualizing 3D graphics in neurosurgical operations
  | Klueber et al. [32] | 2019 | Multiple patient monitoring
  | García-Cruz et al. [33] | 2018 | Evaluating the potential of smart glasses in urology
  | Ruminski et al. [34] | 2016 | Data exchange between smart glasses and healthcare information systems
  | Seifabadi et al. [7] | 2020 | Verification of the accuracy of smart glasses AR systems in percutaneous needle interventions
  | Nag et al. [35] | 2020 | Analysis of emotion recognition in children with autism
Disability support | Rowe [36] | 2019 | Assisting people with peripheral vision loss through a hazard detection and tracking system
  | Meza-de-Luna et al. [37] | 2019 | Supporting social interaction (conversation) for people with visual impairments
  | Schipor et al. [38] | 2020 | Improving convenience for visually impaired users through voice command input and control
  | Chang et al. [39] | 2020 | Development of a system to improve the safety of visually impaired patients when taking medication
  | Lausegger et al. [40] | 2017 | Visual aid for people with color blindness
  | Miller et al. [10] | 2017 | Improving lecture comprehension for deaf and hard of hearing students
  | Janssen et al. [41] | 2017 | Validation of the gait-assisting effect in Parkinson’s disease patients
  | Sandnes [42] | 2016 | Investigating the features that people with low vision need in smart glasses
  | Ruminski et al. [43] | 2015 | Identifying the potential use of smart glasses in medical activities
Therapy | Machado et al. [44] | 2019 | Assisting people with autism in acquiring daily life skills
  | Liu et al. [45] | 2017 | Applying a smart glasses system for behavioral coaching in autism spectrum disorder
  | Vahabzadeh et al. [46] | 2018 | Improving socio-emotional and behavioral functioning of students with autism through a smart glasses intervention
Table 3. Summary of smart glass applications in the industry field.
Sub-Field | References | Year | Aim of Study
Maintenance | Wolfartsberger et al. [47] | 2020 | Development of technology to support maintenance work on the job site
  | Siltanen and Heinonen [48] | 2020 |
Safety | Chang et al. [49] | 2018 | Design and implementation of a drowsiness-fatigue-detection system to improve road safety
  | Baek and Choi [11] | 2020 | Development of a smart glasses-based proximity warning system for pedestrians at mining sites
Work support | Kirks et al. [50] | 2019 | Development of a decentralized control system based on smart glasses
  | Gensterblum [51] | 2020 | Examining whether commercial drones and smart glasses can become disruptive technologies in the construction industry
Table 4. Summary of smart glass applications in the service field.
Sub-Field | References | Year | Aim of Study
Culture and tourism | Han et al. [9] | 2019 | Providing a framework for adopting smart glasses in cultural tourism
  | tom Dieck et al. [52] | 2016 | Improving the understanding of art and analyzing the effects of using smart glasses in museums
  | Mason [53] | 2016 |
E-commerce | Ho et al. [54] | 2018 | Ticketing without manual input by automatically recognizing license plates using smart glasses
Table 5. Summary of smart glass applications in the field of education.
Sub-Field | References | Year | Aim of Study
Nursing | Kopetz et al. [55] | 2019 | User-centered development of smart glasses support for skills training in nursing education
Physical training | Berkemeier et al. [56] | 2018 | Consumer perception survey for application to cycling training
General | Cao et al. [57] | 2019 | Development of a new augmented reality guidance system for future informatized experimental teaching
Physics | Spitzer et al. [58] | 2018 | Distance learning and assistance using smart glasses
  | Kapp et al. [59] | 2018 | Enhancing the understanding of physics experiments through smart glasses
  | Thees et al. [60] | 2020 |
  | Strzys et al. [61] | 2018 |
Table 6. Summary of smart glass applications in the field of social science.

Sub-Field | References | Year | Aim of Study
Information security | Rauschnabel et al. [62] | 2018 | Research on consumer perception of smart glasses and personal information risk
Marketing and Psychology | Adapa et al. [63] | 2018 | Investigation of factors influencing the selection of smart wearable devices
 | Rauschnabel et al. [64] | 2015 |
 | Rauschnabel et al. [65] | 2016 | Consumer perception of smart glasses as a fashion item
 | Rallapalli et al. [66] | 2014 | Physical analysis of (indoor) retail stores using smart glasses
 | Hoogsteen et al. [67] | 2020 | Perception of smart glasses by persons with visual impairments
 | Ok et al. [68] | 2015 | Analysis of harmfulness and social perception in purchasing smart glasses
Table 7. Specifications of Vuzix's Blade [70], Google's Glass Enterprise Edition 2 [71], Epson's Moverio BT-350 [72], and Microsoft's HoloLens 2 [73].

Item | Google Glass Enterprise Edition 2 | HoloLens 2 | Moverio BT-350 | Blade M100
Storage | 32 GB | 64 GB | 48 GB | 8 GB
CPU | Qualcomm XR1, 1.7 GHz quad-core | Qualcomm Snapdragon 850 | Intel Atom X5 (1.44 GHz quad-core) | Quad-core ARM
Battery | 800 mA·h | 16,500 mA·h | 2950 mA·h | 550 mA·h
Camera | 8-megapixel camera | 8-megapixel camera, 1080p video recording | 5-megapixel camera | 5-megapixel camera, 1080p video recording
OS | Android Open Source Project 8.1 (Oreo) | Windows 10 | Android 5.1 | Android 4.0.4 (ICS)
Weight | ~46 g | 566 g | 151 g | 372 g
Sensors | 6-axis gyroscope | Azure Kinect sensor, 6-DoF (degrees of freedom) | 3-axis gyroscope | 3-axis gyroscope, 3-axis accelerometer, 3-axis magnetometer, pressure sensor
Optical see-through method | Curved mirror (+ reflective waveguide): Google light pipe | Holographic waveguide | Reflective waveguide: Epson light guide | Waveguide optics
Table 8. Number of studies based on data visualization method and their percentage.

Type | Number of Studies | References
AR | 52 (91.2%) | [7,8,9,10,11,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,38,39,40,41,42,43,44,45,46,48,50,51,52,53,54,55,56,57,58,60,62,63,64,65,66,67,68,69]
MR | 2 (3.5%) | [47,59]
N/A | 3 (5.3%) | [37,49,61]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Kim, D.; Choi, Y. Applications of Smart Glasses in Applied Sciences: A Systematic Review. Appl. Sci. 2021, 11, 4956.
