Article

Opportunities and Challenges of Smartglass-Assisted Interactive Telementoring

Division of Computer Engineering, Hanshin University, Osan-si 18101, Gyeonggi-do, Korea
Appl. Syst. Innov. 2021, 4(3), 56; https://doi.org/10.3390/asi4030056
Submission received: 1 July 2021 / Revised: 29 July 2021 / Accepted: 13 August 2021 / Published: 21 August 2021
(This article belongs to the Special Issue Advanced Virtual Reality Technologies and Their Applications)

Abstract

The widespread adoption of wearables, extended reality, and metaverses has accelerated the diverse configurations of remote collaboration and telementoring systems. This paper explores the opportunities and challenges of interactive telementoring, especially for wearers of smartglasses. Recent relevant studies are reviewed to derive the needs and trends of telementoring technology. Based on this analysis, we define the characteristics of smartglass-assisted interactive telementoring. To further illustrate this specialized use case of telementoring, we present five illustrative and descriptive scenarios. We expect this use case to support various telementoring applications beyond medical and surgical telementoring, while harmoniously fostering cooperation using the smart devices of mentors and mentees at different scales for collocated, distributed, and remote collaboration.

1. Introduction

The number of computing devices that people encounter in their daily lives is increasing. Beyond the typical form factor of a desktop computer, computing devices are becoming smaller, more abundant, and ubiquitously available. The vision of “ubiquitous computing” is more apparent today due to the myriad of connected everyday appliances known as the Internet-of-Things (IoT). Both ubiquitous computing and IoT emphasize environment-centric sensing and service provision through ambient intelligence. Wearable computing, on the other hand, complements this environment-centric view with an intimate first-person view realized by always-on, always-accessible, and always-connected wearables. Rapid advancements in ubiquitous and wearable computing have established a viable infrastructure for connecting people over long distances and neutralizing the barriers of physical location. The recent COVID-19 pandemic has accelerated the use of this type of technology [1,2] in the forms of remote collaboration, working from home, teleconferencing, online education, and metaverses enriched with various configurations of digital twins (DTs) [3], augmented reality (AR), virtual reality (VR), mixed reality (MR), and extended reality (XR).
There are numerous reports and surveys supporting the widespread adoption of wearables, XR, and metaverses in the foreseeable future. According to BCG and Mordor Intelligence, XR markets are growing and will reach 30.7 billion USD in 2021 and 300 billion USD by 2024, respectively. Another report by IDC forecast that consumer spending on AR and VR technology would be approximately 6.36 billion USD, accounting for 53% of global AR and VR spending in 2020. A survey published in February 2021 by ARtillery Intelligence highlights that enterprise AR glasses hardware and software revenue will rise to 2.05 billion USD in 2021 and 11.99 billion USD by 2024, respectively. XR industry experts forecast that such immersive technologies will have a great impact on the healthcare and medical sector [4].
In this paper, we explore the opportunities and challenges of smartglass-assisted interactive telementoring and its applications. Specifically, our contribution is threefold, as follows:
  • First, we elicit requirements and trends by reviewing recent telementoring studies;
  • Second, we define the term “smartglass-assisted interactive telementoring (SAIT)” to refer to a specialized use case of telementoring and elaborate on its distinguishing characteristics;
  • Third, we identify opportunities and challenges of SAIT applications through five illustrative and descriptive scenarios.

2. Related Work

In this section, we review recent attempts at remote collaboration [5] and XR collaboration [6] using wearable displays and various interaction styles [7,8,9] to elicit requirements for effective telementoring. Rather than adopting the systematic review or meta-review approaches elaborated in [10,11], we focus specifically on recent remote collaboration and telementoring studies from 2015 to 2021.
Schäfer et al. presented a comprehensive survey on synchronous remote collaboration systems based on AR/VR/MR technology [5]. They identified the category of “remote experts” to include collaboration systems involving a local and a remote user [5], a category that shares fundamental concepts and application areas with XR collaboration [6] as well as with our soon-to-be-defined smartglass-assisted interactive telementoring approach.

2.1. Wearable Displays

Remote collaboration attempts to remove physical barriers between remote workers. One such technique is sharing a remote worker’s view and presenting it on displays at the local site. Wearable displays such as head-mounted displays (HMDs) worn by collaborators enable the sharing of other users’ views while naturally supporting hands-free actions. Wearable displays present additional information to the wearer, a characteristic that has enabled various applications. Wearable displays can be categorized as optical see-through (OST) or video see-through (VST) displays, each with its own trade-offs. In OST HMDs, the wearer naturally sees the real world through half-transparent mirrors placed in front of the wearer’s eyes, which optically reflect a combination of real and computer-generated images to the eyes without processing latency or delays [12]. In VST HMDs, the real world is first captured with video cameras with a wider field of view (FoV) mounted on the HMD, and the computer-generated images are then electronically combined with the captured video [12].
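To make the VST compositing step concrete, the following is a minimal sketch in Python, assuming OpenCV (cv2) and a desktop webcam as stand-ins for an HMD’s forward-facing camera; the green rectangle is an illustrative placeholder for rendered content, not any vendor’s rendering pipeline.

```python
import cv2
import numpy as np

# Hypothetical stand-in for the HMD's forward-facing camera (assumes a webcam at index 0).
cap = cv2.VideoCapture(0)

# Placeholder "computer-generated" layer: a filled green annotation box.
ret, frame = cap.read()
overlay = np.zeros_like(frame)
cv2.rectangle(overlay, (100, 100), (300, 220), (0, 255, 0), thickness=-1)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    # VST compositing: electronically blend the rendered layer over the captured video.
    composited = cv2.addWeighted(frame, 1.0, overlay, 0.5, 0)
    cv2.imshow("VST preview", composited)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

An OST display would instead render only the overlay layer, leaving the real-world light path untouched, which is why OST avoids the capture-and-redisplay latency noted above.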
Google Glass, launched in 2013, demonstrated a viable use of OST HMDs through various situated applications in information visualization and augmented reality using images and instructions. Since then, commercial smartglasses from Microsoft [13,14,15,16,17,18,19,20,21], Google [22,23], HTC [7,8], Vuzix [24], Epson [25,26] and Brother [27] have integrated a wearable display with a sufficiently powerful computing unit that provides connectivity and a multi-modal user interface. For example, the latest OST HMDs such as HoloLens 2 and Magic Leap One provide high-resolution displays (i.e., up to 2048 × 1080), a wide FoV (i.e., 52 and 50 degrees, respectively), and tracking and control capabilities. For a comprehensive review of HMDs for medical use, readers are referred to [28].

2.2. Medical and Surgical Telementoring

Telementoring is most beneficial when local and remote users have different degrees of knowledge, experience, and expertise. Promising application domains for telementoring (satisfying the conditions of remoteness and mentoring, as shown in Table 1) are the medical, surgical, and healthcare domains, in which there exists a sufficient knowledge and experience gap between two groups of people (i.e., mentors and mentees). Medical, and often surgical, telementoring is a well-examined application area that exploits these conditions to enhance or improve the overall performance of the involved users, as shown in Table 1.
Ballantyne stated that telementoring “permits an expert surgeon, who remains in their own hospital, to instruct a novice in a remote location on how to perform a new operation or use a new surgical technology [29].” El-Sabawi and Magee III explained that “surgical telementoring involves the use of information technology to provide real-time guidance and technical assistance in performing surgical procedures from an expert physician in a different geographical location. Similar to traditional mentoring, it plays a dual role of educating and providing care at the same time [30].” Semsar et al. observed that “these remote work practices entail a local worker performing a physical task supported by a remote expert who monitors the work and gives instructions when needed—sometimes referred to as telementoring [31].” From these views and previous medical and surgical telementoring studies, two types of telementoring can be identified. The first type shares the view of a mentor to teach mentees, who can learn from the mentor’s direct, real-time practice via learning-by-example or demonstration strategies. The second type shares the view of a mentee, so that mentors can give appropriate advice and deliver feedback in real time for training and instruction purposes. For example, guidelines can be provided to surgeons to train them to perform new operations [29,32]. Despite the difference in who takes the initiative, both types of telementoring can benefit from high-definition videoconferencing, wearable technology, robotic telementoring platforms, and augmented reality [30]. For a comprehensive review of surgical, medical, and healthcare telementoring studies, readers are referred to [10,11].

2.3. Interaction Styles of Telementoring

Telementoring systems have employed numerous interaction techniques, often dictated by the devices and displays used (i.e., mobile devices, wearables, and large-screen displays). Schäfer et al. identified nine common interactive elements in remote collaboration systems: 3D object manipulation, media sharing, AR annotations, 2D drawing, AR viewport sharing, mid-air drawing in 3D, hand gestures, shared gaze awareness, and conveying facial expression [5]. In our review of recent telementoring studies, we have similarly identified four main interaction styles: touch-based, voice-based, gesture-based, and telestration user interfaces.

2.3.1. Touch-Based User Interface

A touch-based user interface (UI) is a common and direct interaction style used in many telementoring systems [16,17,18,19,33]. Most users are already familiar with touchscreens on smartphones and tablets, so touch-based UIs require little training time. However, in the medical and surgical domains, the use of hands is strictly restricted or prohibited to maintain hygiene. In other domains, touch-based UIs for wearables are popular and considered the default. Smartglasses such as Google Glass, HoloLens, and Vuzix Blade all provide on-device touch-based UIs. Figure 1 illustrates touch-based interaction styles in which the wearer controls the device by touching it (i.e., a touchpad).
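As a minimal sketch of how an on-device touchpad might map raw touch events to commands, consider the following Python fragment; the event format, thresholds, and command names are assumptions for illustration, not any vendor’s actual API.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float  # normalized touchpad coordinate, 0..1 (assumed format)
    y: float
    t: float  # timestamp in seconds

def classify(down: TouchEvent, up: TouchEvent,
             swipe_dist: float = 0.3, tap_time: float = 0.25) -> str:
    """Classify a touch-down/touch-up pair as a tap, long press, or swipe."""
    dx, dt = up.x - down.x, up.t - down.t
    if abs(dx) >= swipe_dist:
        return "swipe_forward" if dx > 0 else "swipe_back"
    if dt <= tap_time:
        return "tap"         # e.g., select the highlighted menu item
    return "long_press"      # e.g., open a context menu

# Example: a quick forward swipe to advance instruction slides.
print(classify(TouchEvent(0.2, 0.5, 0.0), TouchEvent(0.7, 0.5, 0.15)))
```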

2.3.2. Voice-Based User Interface

A voice-based UI is another common style of interaction found in telementoring systems. Voice-based UIs are used between mentors and mentees to provide timely feedback and verbally communicate using conversations to request appropriate guidance. Unlike touch-based UIs, voice-based UIs can be implemented entirely hands-free, avoiding hygienic concerns for medical and surgical telementoring. The use of naturally spoken language is intuitive and highly expressive. Moreover, the performance of speech recognizers is greatly improved recently with artificial intelligence. Nonetheless, situations and environments involving many speakers, noisy workplaces, microphone quality, and the requirements of large vocabulary pose implementation and deployment issues. Often, a microphone is integrated with the device used for telementoring (i.e., smartglasses and notebooks). Previous telementoring studies using smartglasses have provided variations of voice-based UIs [13,14,16,17,18,19,23,31,42]. Figure 2 illustrates hands-free and voice-based interaction styles.
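One possible shape of such a hands-free command loop is sketched below in Python, assuming the open-source SpeechRecognition package; the small command vocabulary is an illustrative assumption, chosen because restricted vocabularies sidestep the large-vocabulary issues noted above.

```python
import speech_recognition as sr

# Illustrative, restricted command vocabulary for a smartglass wearer.
COMMANDS = {"share view", "stop sharing", "take photo", "call mentor"}

recognizer = sr.Recognizer()

def listen_for_command():
    """Capture one utterance hands-free and match it against known commands."""
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # mitigate noisy workplaces
        audio = recognizer.listen(source)
    try:
        text = recognizer.recognize_google(audio).lower()
    except (sr.UnknownValueError, sr.RequestError):  # unintelligible or no service
        return None
    return text if text in COMMANDS else None

command = listen_for_command()
if command:
    print(f"Executing: {command}")
```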

2.3.3. Gesture-Based User Interface

A gesture-based UI mostly refers to exploiting hand gestures in telementoring systems. Commercial HMDs such as the Oculus Quest 2 and Microsoft HoloLens 2 provide hand tracking via computer vision or separate controllers. Furthermore, specialized hardware and software for data gloves and suits from Manus, BeBop Sensors, Feel the Same, and Teslasuit can be adopted to implement gesture-based UIs. Similar to voice-based UIs, gesture-based UIs can be used in a hands-free and touch-less manner. On the downside, gesture-based UIs are less expressive and often require additional, costly sensors, processing units, and heavy computation. In our review, several studies incorporated gesture-based UIs in their systems [22,36,42]. Figure 3 demonstrates typical touch-less, gesture-based interaction styles.
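As a minimal sketch of the kind of processing behind hand-gesture UIs, the following Python fragment detects a pinch from hypothetical tracked fingertip positions; the joint names and the 2 cm threshold are assumptions for illustration, not the output format of any particular tracker.

```python
import math

# Hypothetical tracked joint positions (in meters) from a hand-tracking API.
hand = {
    "thumb_tip": (0.01, 0.02, 0.30),
    "index_tip": (0.02, 0.03, 0.31),
}

def is_pinch(joints: dict, threshold: float = 0.02) -> bool:
    """Detect a pinch: thumb and index fingertips closer than ~2 cm."""
    return math.dist(joints["thumb_tip"], joints["index_tip"]) < threshold

if is_pinch(hand):
    print("Pinch detected: select annotation target")
```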

2.3.4. Telestration User Interface

A telestrator is a device that allows its operator to draw sketches over still or video images. In many telementoring systems, similar functionalities of pointing, annotating, and drawing are called telestration, as depicted in Figure 4. Telestration enables mentors to visually instruct and quickly point out areas of interest. Conversely, telestration from mentees can be used to highlight problematic areas. Several studies in our review also used some form of telestration in their systems [14,15,16,17,18,19,20,21,26,31,37,38,39,40], often combined with voice-based UIs [14,16,17,18,19,20,31,39].
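A minimal sketch of how telestration data might be exchanged is shown below: strokes are serialized in normalized coordinates so the receiving device can redraw them over video at any resolution. The message schema is an assumption for illustration, not a standard protocol.

```python
import json

def encode_stroke(points, color=(0, 0, 255)):
    """Mentor side: serialize a telestration stroke in normalized coordinates."""
    return json.dumps({"type": "stroke", "color": list(color),
                       "points": [{"x": x, "y": y} for x, y in points]})

def decode_and_scale(message, width, height):
    """Mentee side: map normalized points onto the local video resolution."""
    stroke = json.loads(message)
    return [(int(p["x"] * width), int(p["y"] * height))
            for p in stroke["points"]]

# Mentor circles an area of interest; the mentee's device renders at 1920x1080.
msg = encode_stroke([(0.40, 0.40), (0.45, 0.38), (0.50, 0.40)])
print(decode_and_scale(msg, 1920, 1080))
```

Normalizing coordinates keeps annotations aligned even when mentor and mentee devices have different display resolutions, which matters in the heterogeneous device mixes shown in Table 2.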

2.3.5. Other Types of User Interfaces

There are other interaction styles such as tangible UIs [25], foot pedals [27] and avatars [21]. We think that foot pedals, as depicted in Figure 5, could go well with the aforementioned types of UIs (touch-based, voice-based, and gesture-based) and complement their shortcomings. For example, foot pedals could replace on-device, touch-based UIs in medical and surgical settings, and could be used to confirm correctly recognized gestures or cancel incorrectly recognized ones, as in the sketch below. Table 2 identifies telementoring systems with smartglasses and the interaction styles discussed in this section.
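The following Python sketch illustrates this confirm/cancel pattern, assuming a hypothetical two-button pedal that delivers simple press events.

```python
PENDING = None  # gesture awaiting hands-free pedal confirmation

def on_gesture_recognized(gesture: str) -> None:
    """Hold a recognized gesture until the wearer confirms it via the pedal."""
    global PENDING
    PENDING = gesture
    print(f"Recognized '{gesture}'. Left pedal confirms, right pedal cancels.")

def on_pedal(button: str) -> None:
    """Resolve the pending gesture on a pedal press (button names assumed)."""
    global PENDING
    if PENDING is None:
        return
    if button == "left":
        print(f"Confirmed: executing '{PENDING}'")
    else:
        print(f"Cancelled: discarding '{PENDING}'")
    PENDING = None

on_gesture_recognized("zoom_in")
on_pedal("left")
```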

3. Smartglass-Assisted Interactive Telementoring

In our review of recent remote collaboration and telementoring studies in the previous section, we identified studies that use smartglasses and satisfy the telementoring requirements, as shown in Table 2. Using these selected studies, we define a specialized use case of telementoring, which we call “smartglass-assisted interactive telementoring (SAIT)”. As the term suggests, smartglasses are the required display hardware, unlike traditional medical and surgical telementoring, which allows alternative displays [31]. More specifically, the SAIT has the following characteristics beyond typical remote collaboration or telementoring in general.
  • Mentors and Mentees of Different Levels. The SAIT involves two groups of users categorized as mentors and mentees. Mentors are the more knowledgeable and experienced group with professional or technical expertise. Mentees are the less knowledgeable and experienced group who would benefit or improve their performance with the help received from their mentors.
  • Sharing of First-person and Eye-level View via Smartglass. Either or both mentors and mentees share their eye-level view captured by their HMDs with sufficient FoV to provide enough details and video quality for the intended telementoring tasks.
  • Extended Use Cases with Bi-directional Interaction. The SAIT aims to cover flexible and casual use cases beyond medical and surgical telementoring scenarios. In the SAIT, mentors and mentees are preferably on the go, situated, and hands-free, exploiting multiple interaction channels for various occasions.
To elaborate on these characteristics of the SAIT applications, five illustrative and descriptive scenarios are presented in the next section.

4. Five Scenarios for Smartglass-Assisted Interactive Telementoring

As we have discussed in Section 2, many remote collaboration and telementoring systems have been developed in the healthcare sector. In a survey of XR industry experts, training simulations in the healthcare sector are the favored major application [4]. Table 3, Table 4, Table 5 and Table 6 show XR industry experts’ views on promising XR applications in various domains, as reported in the 2020 Augmented and Virtual Reality Survey Report [4]. Training simulations and assisted surgeries are the top applications of immersive technologies in the healthcare sector, as shown in Table 3. Immersive teaching experiences and soft skills development are expected to be major applications of XR technologies in the education sector, as shown in Table 4. In the manufacturing sector, real-time remote assistance and workforce training are expected to be major XR applications, as shown in Table 5. Improvements in navigation solutions and urban planning are the top applications of immersive technologies in smart cities, as shown in Table 6.
Based on various XR applications across different domains, we have chosen five specific scenarios to elaborate on the concept and characteristics of the SAIT. The selected telementoring scenarios include the SAIT for paramedics, maintenance, culinary education and certification, hardware engineers, and foreigner navigation.

4.1. SAIT for Paramedics Scenario

  • Mentors and Mentees of Different Levels. This SAIT scenario involves dispatched paramedics as mentees and more experienced mentors at the medical center. The paramedics are trained to perform various skills in emergency settings (e.g., airway management, breathing, circulation, cardiac arrest, intravenous infusion, patient assessment, wound management) and to provide emergency medications. Mentors with more medical expertise can guide and instruct less-experienced mentees (e.g., newly employed paramedics or those treating exceptional cases).
  • Sharing of First-person and Eye-level View via Smartglass. The mentees can allow their views captured by HMDs to be shared when they need assistance. Then, the mentors can intervene to assess a patient’s condition visually or walk through medical treatment together with the mentee. Figure 6 illustrates dispatched paramedic mentees in this SAIT scenario.
  • Extended Use Cases with Bi-directional Interaction. In this scenario, voice-based UIs are predominantly used for real-time communication between mentors and mentees.

4.2. SAIT for Maintenance Scenario

  • Mentors and Mentees of Different Levels. This SAIT scenario involves on-site workers as mentees and remote experts as mentors. As illustrated in Figure 7, working conditions on the site may be too narrow and confined to hold multiple workers physically. In this scenario, mentors with more experience and expertise support and assist mentees to carry out maintenance or repairing tasks successfully.
  • Sharing of First-person and Eye-level View via Smartglass. The mentees share their views captured by HMDs with remote expert mentors. Then the mentors can intervene to assess the equipment and cables visually, as well as walk the mentees through complex procedures.
  • Extended Use Cases with Bi-directional Interaction. In this scenario, voice-based UIs are used for real-time communication between mentors and mentees. Furthermore, telestration UIs make communication clearer and allow problematic areas of interest to be identified. For example, the mentor can highlight parts to be examined or replaced.

4.3. SAIT for Culinary Education and Certification Scenario

  • Mentors and Mentees of Different Levels. This scenario involves mentees who are interested in culinary education or obtaining a culinary certification. It is helpful for mentees to be trained by experienced chefs when learning dishes that require new recipes or techniques. For example, a barista trainee can learn from winners of the World Barista Championships or from renowned chefs, as illustrated in Figure 8.
  • Sharing of First-person and Eye-level View via Smartglass. The mentees’ eye-level view captured by their HMDs can be shared with mentors when the mentees are assessed for certification. Another use case is that the mentor’s view captured by their HMD is shared with multiple mentees to educate and train them via demonstration strategies.
  • Extended Use Cases with Bi-directional Interaction. In this scenario, voice-based UIs are used to communicate between mentors and mentees in real-time. Furthermore, telestration UIs can be used to draw attention to specific areas of interest. For example, the mentor can highlight where to cut.

4.4. SAIT for Hardware Engineers Scenario

  • Mentors and Mentees of Different Levels. This scenario enables mentees to receive help from expert hardware engineers. For example, teachers as mentors can teach school courses that require hardware engineering as shown in Figure 9. Remote mentors can also provide support for Do-It-Yourself (DIY) mentees.
  • Sharing of First-person and Eye-level View via Smartglass. It is often difficult for mentees to identify hardware problems or explain them clearly. In this scenario, mentees share the eye-level view captured by their HMD smartglasses with mentors to receive supportive feedback.
  • Extended Use Cases with Bi-directional Interaction. In this scenario, voice-based UIs are used to communicate between mentors and mentees in real-time. Furthermore, telestration UI can be used to draw attention to specific areas of interest. For example, the mentor can highlight where to connect or dismantle.

4.5. SAIT for Foreigner Navigation Scenario

  • Mentors and Mentees of Different Levels. This scenario involves foreign travelers as mentees who do not have much information on where and how to find their destinations, as illustrated in Figure 10. It would be helpful if local experts could virtually accompany them to guide their navigation. For this purpose, local residents can play the role of mentors to help the mentees in their travels.
  • Sharing of First-person and Eye-level View via Smartglass. The mentees’ eye-level view captured by their HMDs is shared with mentors. Since the mentors are familiar with the local site, they can give useful information such as where to find good restaurants or which public transportation to use.
  • Extended Use Cases with Bi-directional Interaction. In this scenario, voice-based UIs are used to communicate between mentors and mentees in real-time. Furthermore, telestration UIs can be used to draw attention to specific areas to explore and navigate. For example, the mentor can highlight and recommend popular landmarks to visit.

5. Opportunities and Challenges of SAIT

In this section, we discuss the foreseeable opportunities and challenges of the proposed use case of the SAIT.

5.1. Opportunities

As reviewed in our related work section, a great number of remote collaboration and telementoring studies focus on the medical, surgical, and healthcare domains. We presented a specialized use case of telementoring, the SAIT, with three defining characteristics: mentors and mentees of different levels, sharing of a first-person, eye-level view via smartglasses, and extended use cases with bi-directional interaction. We believe that there are more application domains to be explored for remote collaboration and telementoring. In an attempt to go beyond typical medical and surgical applications, we presented five illustrative and descriptive SAIT scenarios that are more casual, tied to the use of smartglasses, and spread across the domains of healthcare, education, manufacturing, and smart cities, as shown in Table 3, Table 4, Table 5 and Table 6.
Several interaction styles such as touch-based UIs [33,43], voice-based UIs [23] and telestration UIs for the SAIT are already mature (i.e., supported by Microsoft HoloLens 2, as shown in Table 2), while novel interaction styles are emerging. For example, gesture-based UIs of finer granularity, such as HaptX Gloves, and platforms exploiting all five channels of human senses (i.e., the Virtuix Omni treadmill) are being developed. Facebook Reality Labs envisions a contextually aware, AI-powered interface for AR glasses that uses electromyography signals to perform an “intelligent click” and wristband haptics for feedback. Facebook also patented an artificial reality hat as an alternative to traditional AR headsets and smartglasses. Apple announced AssistiveTouch for Apple Watch, which can detect subtle differences in muscle movement and tendon activity. Moreover, the widespread adoption of IoT and edge computing [44] enables flexible, adaptive, and seamless orchestration among smart devices and objects [44,45,46,47].

5.2. Technical and Societal Challenges

For our presented SAIT to be successful in XR telementoring, there are many technical as well as societal challenges to overcome. Many XR industry experts have also identified urgent areas of improvement [4], as shown in Figure 11.
Since the SAIT requires both mentors and mentees to use smartglasses, various challenges of smartglasses have been identified and are being addressed [48,49]. For example, to support the SAIT on everyday occasions, HMDs should be lighter. To provide a clear and crisp view captured by the HMD’s camera, a wider FoV and higher resolution are preferred. Such specifications and performance measures of smartglasses, including latency, content reproducibility, repeatability, synchronization, and warnings, should be enforced through guidelines and standards. For example, there are international standards for VR and AR covering environment safety (IEEE P2048.5), immersive user interfaces (IEEE P2048.6), eyewear displays (IEC TR 63145-1-1:2018), and VR content (ISO/IEC TR 23842:2020).
Furthermore, XR experiences in our scenarios are limited, since most informational content (i.e., live video streams and annotations) is hardly considered XR content. Easy-to-use XR authoring tools for developers and service providers are desirable to provide multiple AR, VR, and XR applications [50] and experiences in the SAIT scenarios. Moreover, establishing sufficient network bandwidth for high-resolution videos is also important to share the view and contents [51] of mentors or mentees without interruptions and delays. As more interaction data between mentors and mentees are accumulated, we can benefit from deep learning approaches [52,53,54] that enhance various steps involved in the XR pipeline (i.e., registration, tracking, rendering and interaction, task guidance [55,56]).
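As a back-of-the-envelope sketch of such bandwidth planning, the following estimate uses a bits-per-pixel heuristic common for H.264-class codecs; the 0.1 bits-per-pixel factor is an assumed rule of thumb, not a measured value, and real requirements vary with codec and scene content.

```python
def required_bitrate_mbps(width, height, fps, bits_per_pixel=0.1):
    """Estimate compressed video bitrate with a bits-per-pixel heuristic."""
    return width * height * fps * bits_per_pixel / 1e6

# Sharing a mentee's 1080p30 eye-level view needs roughly ~6 Mbps...
print(f"1080p30: ~{required_bitrate_mbps(1920, 1080, 30):.1f} Mbps")
# ...while a 4K60 stream needs on the order of ~50 Mbps.
print(f"4K60:    ~{required_bitrate_mbps(3840, 2160, 60):.1f} Mbps")
```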
There are also societal challenges regarding the SAIT. The security, safety, and privacy of the involved users should be carefully managed. As witnessed in the medical and surgical sectors, the use of cameras in the operating room is very sensitive and controversial. In most countries, the installation of and monitoring with closed-circuit television (CCTV) are strictly regulated or prohibited. For such sensitive and intimate scenarios, mentors or mentees could be disallowed from using smartglasses at all. A general societal consensus and regulations on the need for and social acceptance of smartglasses in the discussed SAIT scenarios are required.
Table 7, Table 8, Table 9, Table 10 and Table 11 summarize various technical and societal issues aligned with the areas of improvement [4] in the discussed SAIT scenarios.

6. Conclusions

In this paper, we explored XR concepts of remote collaboration and telementoring, focusing on the use of smartglasses. To elicit the requirements and trends of telementoring, we reviewed recent studies from 2015 to 2021. Based on this literature review, we defined the three characteristics that the SAIT (smartglass-assisted interactive telementoring) features. To elaborate on this specialized use case of telementoring, we presented five illustrative and descriptive scenarios covering paramedics, maintenance, culinary education and certification, hardware engineers, and foreigner navigation. Lastly, we identified the opportunities and the technical and societal challenges to overcome for successfully realizing the SAIT.
Our current approach has limitations and shortcomings that deserve further study. First, our approach was not a systematic review or meta-review in nature, as other researchers have conducted [10,11]. We investigated relevant studies mostly using commercially available smartglasses, which might have missed custom-designed prototypical approaches presented earlier than 2015. Indeed, our approach only covered high-fidelity smartglasses with stable hardware designs, whereas there could be low-fidelity approaches to surgical telementoring applications [10,11] that we have not included in our analysis. Another limitation of our work is that our newly proposed SAIT scenarios are situated outdoors and involve multiple simultaneous user interactions. This complexity calls for carefully orchestrated security and privacy measures in wearable computing as well as XR settings, which we could not cover within the scope of this paper.
Nonetheless, we expect our proposed SAIT to have merit in supporting various telementoring applications beyond medical and surgical telementoring, while harmoniously fostering cooperation using the smart devices of mentors and mentees at different scales for collocated, distributed, and remote collaboration.

Funding

This work was supported by a Hanshin University Research Grant.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Acknowledgments

The author thanks undergraduate researchers of Hanshin HCI Lab., Siyeon Kim, Yewon Park, Haneol Oh, and Hong Ji Lim for fruitful discussion and ideation on smartglass-assisted interactive telementoring scenarios. Cartoonized photos used in figures are originally from Pexels with the free license to use (https://www.pexels.com/license) (accessed on 29 July 2021).

Conflicts of Interest

The author declares no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
IoT: Internet-of-Things
DT: Digital Twin
AR: Augmented Reality
VR: Virtual Reality
MR: Mixed Reality
XR: eXtended Reality
SAIT: Smartglass-Assisted Interactive Telementoring
HMD: Head-Mounted Display
OST: Optical See-Through
VST: Video See-Through
FoV: Field of View
UI: User Interface
DIY: Do-It-Yourself
CCTV: Closed-Circuit Television

References

  1. Majeed, A. Applications of machine learning and high-performance computing in the era of COVID-19. Appl. Syst. Innov. 2021, 4, 40. [Google Scholar] [CrossRef]
  2. Bala, L.; Kinross, J.; Martin, G.; Koizia, L.J.; Kooner, A.S.; Shimshon, G.J.; Hurkxkens, T.J.; Pratt, P.J.; Sam, A.H. A remote access mixed reality teaching ward round. Clin. Teach. 2021, 18, 386–390. [Google Scholar] [CrossRef] [PubMed]
  3. Singh, M.; Fuenmayor, E.; Hinchy, E.P.; Qiao, Y.; Murray, N.; Devine, D. Digital twin: Origin to future. Appl. Syst. Innov. 2021, 4, 36. [Google Scholar] [CrossRef]
  4. Perkins Coie LLP. 2020 Augmented and Virtual Reality Survey Report. Ind. Insights Into Future Immersive Technol. 2020, 4, 1–33. [Google Scholar]
  5. Schäfer, A.; Reis, G.; Stricker, D. A Survey on Synchronous Augmented, Virtual and Mixed Reality Remote Collaboration Systems. arXiv 2021, arXiv:2102.05998. Available online: https://arxiv.org/abs/2102.05998 (accessed on 29 July 2021).
  6. Lee, Y.; Yoo, B. XR collaboration beyond virtual reality: Work in the real world. J. Comput. Des. Eng. 2021, 8, 756–772. [Google Scholar] [CrossRef]
  7. Wang, P.; Zhang, S.; Bai, X.; Billinghurst, M.; Zhang, L.; Wang, S.; Han, D.; Lv, H.; Yan, Y. A gesture- and head-based multimodal interaction platform for MR remote collaboration. Int. J. Adv. Manuf. Technol. 2019, 105, 3031–3043. [Google Scholar] [CrossRef]
  8. Wang, P.; Bai, X.; Billinghurst, M.; Zhang, S.; He, W.; Han, D.; Wang, Y.; Min, H.; Lan, W.; Han, S. Using a head pointer or eye gaze: The effect of gaze on spatial AR remote collaboration for physical tasks. Interact. Comput. 2020, 32, 153–169. [Google Scholar] [CrossRef]
  9. Kim, H.; Young, J.; Medeiros, D.; Thompson, S.; Rhee, T. TeleGate: Immersive Multi-User Collaboration for Mixed Reality 360°Video. In Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Lisbon, Portugal, 21 March–1 April 2021. [Google Scholar] [CrossRef]
  10. Bui, D.T.; Barnett, T.; Hoang, H.T.; Chinthammit, W. Tele-mentoring using augmented reality technology in healthcare: A systematic review. Australas. J. Educ. Technol. 2021, 81–101. [Google Scholar] [CrossRef]
  11. Erridge, S.; Yeung, D.K.T.; Patel, H.R.H.; Purkayastha, S. Telementoring of Surgeons: A systematic review. Surg. Innov. 2018, 26, 95–111. [Google Scholar] [CrossRef]
  12. Rolland, J.P.; Fuchs, H. Optical versus video see-through head-mounted displays in medical visualization. Presence Teleoperators Virtual Environ. 2000, 9, 287–309. [Google Scholar] [CrossRef]
  13. Klinker, K.; Wiesche, M.; Krcmar, H. Digital transformation in health care: Augmented reality for hands-free service innovation. Inf. Syst. Front. 2020, 22, 1419–1431. [Google Scholar] [CrossRef] [Green Version]
  14. Hanna, M.G.; Ahmed, I.; Nine, J.; Prajapati, S.; Pantanowitz, L. Augmented reality technology using microsoft hololens in anatomic pathology. Arch. Pathol. Lab. Med. 2018, 142, 638–644. [Google Scholar] [CrossRef] [Green Version]
  15. Liu, P.; Li, C.; Xiao, C.; Zhang, Z.; Ma, J.; Gao, J.; Shao, P.; Valerio, I.; Pawlik, T.M.; Ding, C.; et al. A wearable augmented reality navigation system for surgical telementoring based on microsoft HoloLens. Ann. Biomed. Eng. 2020, 49, 287–298. [Google Scholar] [CrossRef] [PubMed]
  16. Rojas-Muñoz, E.; Andersen, D.; Cabrera, M.E.; Popescu, V.; Marley, S.; Zarzaur, B.; Mullis, B.; Wachs, J.P. Augmented reality as a medium for improved telementoring. Mil. Med. 2019, 184, 57–64. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Rojas-Muñoz, E.; Cabrera, M.E.; Lin, C.; Sánchez-Tamayo, N.; Andersen, D.; Popescu, V.; Anderson, K.; Zarzaur, B.; Mullis, B.; Wachs, J.P. Telementoring in leg fasciotomies via mixed-reality: Clinical evaluation of the STAR platform. Mil. Med. 2020, 185, 513–520. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Rojas-Muñoz, E.; Cabrera, M.E.; Lin, C.; Andersen, D.; Popescu, V.; Anderson, K.; Zarzaur, B.L.; Mullis, B.; Wachs, J.P. The system for telementoring with augmented reality (STAR): A head-mounted display to improve surgical coaching and confidence in remote areas. Surgery 2020, 167, 724–731. [Google Scholar] [CrossRef]
  19. Rojas-Muñoz, E.; Lin, C.; Sanchez-Tamayo, N.; Cabrera, M.E.; Andersen, D.; Popescu, V.; Barragan, J.A.; Zarzaur, B.; Murphy, P.; Anderson, K.; et al. Evaluation of an augmented reality platform for austere surgical telementoring: A randomized controlled crossover study in cricothyroidotomies. NPJ Digit. Med. 2020, 3, 75. [Google Scholar] [CrossRef] [PubMed]
  20. Mitsuno, D.; Hirota, Y.; Akamatsu, J.; Kino, H.; Okamoto, T.; Ueda, K. Telementoring demonstration in craniofacial surgery with HoloLens, Skype, and three-layer facial models. J. Craniofac. Surg. 2019, 30, 28–32. [Google Scholar] [CrossRef]
  21. Gasques, D.; Johnson, J.G.; Sharkey, T.; Feng, Y.; Wang, R.; Xu, Z.R.; Zavala, E.; Zhang, Y.; Xie, W.; Zhang, X.; et al. ARTEMIS: A Collaborative Mixed-Reality System for Immersive Surgical Telementoring. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Online Virtual Conference, 8–13 May 2021. [Google Scholar] [CrossRef]
  22. Kopetz, J.P.; Wessel, D.; Jochems, N. User-centered development of smart glasses support for skills training in nursing education. i-com 2019, 18, 287–299. [Google Scholar] [CrossRef]
  23. Yoon, H.; Kim, S.K.; Lee, Y.; Choi, J. Google glass-supported cooperative training for health professionals: A case study based on using remote desktop virtual support. J. Multidiscip. Healthc. 2021, 2021, 1451–1462. [Google Scholar] [CrossRef]
  24. Kim, S.; Nussbaum, M.A.; Gabbard, J.L. Influences of augmented reality head-worn display type and user interface design on performance and usability in simulated warehouse order picking. Appl. Ergon. 2019, 74, 186–193. [Google Scholar] [CrossRef]
  25. Rochlen, L.R.; Levine, R.; Tait, A.R. First-person point-of-view–augmented reality for central line insertion training. Simul. Healthc. 2017, 12, 57–62. [Google Scholar] [CrossRef] [Green Version]
  26. Rio, M.D.; Meloni, V.; Frexia, F.; Cabras, F.; Tumbarello, R.; Montis, S.; Marini, A.; Zanetti, G. Augmented Reality for Supporting Real Time Telementoring: An Exploratory Study Applied to Ultrasonography. In Proceedings of the 2nd International Conference on Medical and Health Informatics, Tsukuba, Japan, 8–10 June 2018; pp. 218–222. [Google Scholar] [CrossRef]
  27. Huang, C.Y.; Thomas, J.B.; Alismail, A.; Cohen, A.; Almutairi, W.; Daher, N.S.; Terry, M.H.; Tan, L.D. The use of augmented reality glasses in central line simulation: “See one, simulate many, do one competently, and teach everyone”. Adv. Med. Educ. Pract. 2018, 2018, 357–363. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  28. Barteit, S.; Lanfermann, L.; Bärnighausen, T.; Neuhann, F.; Beiersmann, C. Augmented, mixed, and virtual reality-based head-mounted devices for medical education: Systematic review. JMIR Serious Games 2021, 9, e29080. [Google Scholar] [CrossRef] [PubMed]
  29. Ballantyne, G. Robotic surgery, telerobotic surgery, telepresence, and telementoring. Surg. Endosc. 2002, 16, 1389–1402. [Google Scholar] [CrossRef] [PubMed]
  30. El-Sabawi, B.; Magee, W., III. The evolution of surgical telementoring: Current applications and future directions. Ann. Transl. Med. 2016, 4, 391. [Google Scholar] [CrossRef] [Green Version]
  31. Semsar, A.; McGowan, H.; Feng, Y.; Zahiri, H.R.; Park, A.; Kleinsmith, A.; Mentis, H.M. Quality of and attention to instructions in telementoring. Proc. ACM Hum. Comput. Interact. 2020, 4, 1–21. [Google Scholar] [CrossRef]
  32. Greenberg, J.A.; Schwarz, E.; Paige, J.; Dort, J.; Bachman, S. At-home hands-on surgical training during COVID19: Proof of concept using a virtual telementoring platform. Surg. Endosc. 2021, 35, 1963–1969. [Google Scholar] [CrossRef]
  33. Kim, S.K.; Lee, Y.; Yoon, H.; Choi, J. Adaptation of extended reality smart glasses for core nursing skill training among undergraduate nursing students: Usability and feasibility study. J. Med. Internet Res. 2021, 23, 1438–8871. [Google Scholar] [CrossRef] [PubMed]
  34. Aebersold, M.; Voepel-Lewis, T.; Cherara, L.; Weber, M.; Khouri, C.; Levine, R.; Tait, A.R. Interactive anatomy-augmented virtual simulation training. Clin. Simul. Nurs. 2018, 15, 34–41. [Google Scholar] [CrossRef]
  35. Lacy, A.M.; Bravo, R.; Otero-Piñeiro, A.M.; Pena, R.; De Lacy, F.B.; Menchaca, R.; Balibrea, J.M. 5G-assisted telementored surgery. Br. J. Surg. 2019, 106, 1576–1579. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Meijer, H.A.W.; Margallo, J.A.S.; Margallo, F.M.S.; Goslings, J.C.; Schijven, M.P. Wearable technology in an international telementoring setting during surgery: A feasibility study. BMJ Innov. 2017, 3, 189–195. [Google Scholar] [CrossRef]
  37. Andersen, D.; Popescu, V.; Cabrera, M.E.; Shanghavi, A.; Gomez, G.; Marley, S.; Mullis, B.; Wachs, J. Avoiding focus shifts in surgical telementoring using an augmented reality transparent display. Stud. Health Technol. Inform. 2016, 220, 9–14. [Google Scholar] [CrossRef] [PubMed]
  38. Andersen, D.; Popescu, V.; Cabrera, M.E.; Shanghavi, A.; Gomez, G.; Marley, S.; Mullis, B.; Wachs, J.P. Medical telementoring using an augmented reality transparent display. Surgery 2016, 159, 1646–1653. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  39. Mentis, H.M.; Feng, Y.; Semsar, A.; Ponsky, T.A. Remotely Shaping the View in Surgical Telementoring. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar] [CrossRef]
  40. Lin, C.; Rojas-Munoz, E.; Cabrera, M.E.; Sanchez-Tamayo, N.; Andersen, D.; Popescu, V.; Noguera, J.A.B.; Zarzaur, B.; Murphy, P.; Anderson, K.; et al. How About the Mentor? Effective Workspace Visualization in AR Telementoring. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020. [Google Scholar] [CrossRef]
  41. Hassan, A.; Ghafoor, M.; Tariq, S.A.; Zia, T.; Ahmad, W. High efficiency video coding (HEVC)–based surgical telementoring system using shallow convolutional neural network. J. Digit. Imaging 2019, 32, 1027–1043. [Google Scholar] [CrossRef]
  42. Giuseppe, S.; Fridolin, W.; Peter, S. The GhostHands UX: Telementoring with hands-on augmented reality instruction. Ambient. Intell. Smart Environ. 2015, 19, 236–243. [Google Scholar] [CrossRef]
  43. Yoon, H.; Park, S.H. A non-touchscreen tactile wearable interface as an alternative to touchscreen-based wearable devices. Sensors 2020, 20, 1275. [Google Scholar] [CrossRef] [Green Version]
  44. Khan, L.U.; Yaqoob, I.; Tran, N.H.; Kazmi, S.M.A.; Dang, T.N.; Hong, C.S. Edge-computing-enabled smart cities: A comprehensive survey. IEEE Internet Things J. 2020, 7, 10200–10232. [Google Scholar] [CrossRef] [Green Version]
  45. Ng, P.C.; She, J.; Jeon, K.E.; Baldauf, M. When smart devices interact with pervasive screens. ACM Trans. Multimed. Comput. Commun. Appl. 2017, 13, 1–23. [Google Scholar] [CrossRef]
  46. Houben, S.; Marquardt, N.; Vermeulen, J.; Klokmose, C.; Schöning, J.; Reiterer, H.; Holz, C. Opportunities and challenges for cross-device interactions in the wild. Interactions 2017, 24, 58–63. [Google Scholar] [CrossRef]
  47. Brudy, F.; Holz, C.; Rädle, R.; Wu, C.J.; Houben, S.; Klokmose, C.N.; Marquardt, N. Cross-Device Taxonomy. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK, 4–9 May 2019. [Google Scholar] [CrossRef] [Green Version]
  48. Danielsson, O.; Holm, M.; Syberfeldt, A. Augmented reality smart glasses in industrial assembly: Current status and future challenges. J. Ind. Inf. Integr. 2020, 20, 100175. [Google Scholar] [CrossRef]
  49. Rejeb, A.; Keogh, J.G.; Leong, G.K.; Treiblmaier, H. Potentials and challenges of augmented reality smart glasses in logistics and supply chain management: A systematic literature review. Int. J. Prod. Res. 2021, 59, 3747–3776. [Google Scholar] [CrossRef]
  50. Lebeck, K.; Kohno, T.; Roesner, F. Enabling Multiple Applications to Simultaneously Augment Reality. In Proceedings of the 20th International Workshop on Mobile Computing Systems and Applications, Santa Cruz, CA, USA, 27–28 February 2019. [Google Scholar] [CrossRef]
  51. Ruth, K.; Kohno, T.; Roesner, F. Secure Multi-User Content Sharing for Augmented Reality Applications. In Proceedings of the 28th USENIX Conference on Security Symposium, Santa Clara, CA, USA, 14–16 August 2019. [Google Scholar]
  52. Lampropoulos, G.; Keramopoulos, E.; Diamantaras, K. Enhancing the functionality of augmented reality using deep learning, semantic web and knowledge graphs: A review. Vis. Inform. 2020, 4, 32–42. [Google Scholar] [CrossRef]
  53. Wang, S.; Zargar, S.A.; Yuan, F.G. Augmented reality for enhanced visual inspection through knowledge-based deep learning. Struct. Health Monit. 2020, 20, 426–442. [Google Scholar] [CrossRef]
  54. Tai, Y.; Gao, B.; Li, Q.; Yu, Z.; Zhu, C.; Chang, V. Trustworthy and intelligent COVID-19 diagnostic IoMT through XR and deep learning-based clinic data access. IEEE Internet Things J. 2021. [Google Scholar] [CrossRef]
  55. Park, K.B.; Choi, S.H.; Kim, M.; Lee, J.Y. Deep learning-based mobile augmented reality for task assistance using 3D spatial mapping and snapshot-based RGB-D data. Comput. Ind. Eng. 2020, 146, 106585. [Google Scholar] [CrossRef]
  56. Park, K.B.; Kim, M.; Choi, S.H.; Lee, J.Y. Deep learning-based smart task assistance in wearable augmented reality. Robot. Comput. Integr. Manuf. 2020, 63, 101887. [Google Scholar] [CrossRef]
Figure 1. Illustrations of touch-based interaction styles.
Figure 2. Illustrations of voice-based interaction styles.
Figure 3. Illustrations of gesture-based interaction styles.
Figure 4. Illustrations of telestration interaction styles.
Figure 5. Illustrations of an interaction style using foot pedals.
Figure 6. An illustration of SAIT for paramedics scenario.
Figure 7. An illustration of SAIT for maintenance scenario.
Figure 8. An illustration of SAIT for culinary education and certification scenario.
Figure 9. An illustration of SAIT for hardware engineer scenario.
Figure 10. An illustration of SAIT for foreigner navigation scenario.
Figure 11. Areas of improvement in XR/AR/VR/MR technology, source: [4].
Table 1. Post-2015 Remote Collaboration and Telementoring Studies.

Reference | Remoteness | Mentoring | Domain
Kim et al. [33] | No | No | Nursing Education
Klinker et al. [13] | No | No | Medical
Kopetz et al. [22] | No | No | Nursing Education
Rochlen et al. [25] | No | No | Medical
Huang et al. [27] | No | No | Medical
Aebersold et al. [34] | No | No | Medical
Hanna et al. [14] | Yes | Yes | Medical
Yoon et al. [23] | Yes | Yes | Medical
Lacy et al. [35] | Yes | Yes | Medical
Rio et al. [26] | Yes | Yes | Medical
Greenberg et al. [32] | Yes | Yes | Medical
Liu et al. [15] | Yes | Yes | Medical
Meijer et al. [36] | Yes | Yes | Medical
Andersen et al. [37,38] | Yes | Yes | Medical
STAR [16,17,18,19] | Yes | Yes | Medical
Mitsuno et al. [20] | Yes | Yes | Medical
Mentis et al. [39] | Yes | Yes | Medical, Industrial
Lin et al. [40] | Yes | Yes | Medical
Gasques et al. [21] | Yes | Yes | Medical
Hassan et al. [41] | Yes | Yes | Medical
Giuseppe et al. [42] | Yes | Yes | Industrial
Semsar et al. [31] | Yes | Yes | Medical
Wang et al. [7] | Yes | Yes | Industrial
Wang et al. [8] | Yes | Yes | Physical Task
Table 2. Telementoring Studies with Smartglasses and Interaction Styles.

Reference | Telementoring | Smartglass | Interaction Styles
Kim et al. [33] | No | Vuzix Blade | Touch-based
Klinker et al. [13] | No | MS HoloLens | Voice-based
Kopetz et al. [22] | No | Google Glass | Gesture-based
Rochlen et al. [25] | No | Epson Moverio BT-200 | Tangible UI
Huang et al. [27] | No | Brother AiRScouter WWD-200B | Wireless Two-button Foot Pedal
Aebersold et al. [34] | No | N/A | Video, Quiz
Hanna et al. [14] | Yes | MS HoloLens | Voice-based, Telestration
Yoon et al. [23] | Yes | Glass Enterprise Edition 2 | Voice-based
Lacy et al. [35] | Yes | N/A | Voice-based
Rio et al. [26] | Yes | Epson Moverio BT-200 | Telestration
Greenberg et al. [32] | Yes | N/A | Voice-based
Liu et al. [15] | Yes | MS HoloLens | Telestration
Meijer et al. [36] | Yes | N/A | Gesture-based
Andersen et al. [37,38] | Yes | N/A | Touch-based, Telestration
STAR [16,17,18,19] | Yes | MS HoloLens | Touch-based, Voice-based, Telestration
Mitsuno et al. [20] | Yes | MS HoloLens | Voice-based, Telestration
Mentis et al. [39] | Yes | N/A | Voice-based, Telestration
Lin et al. [40] | Yes | MS HoloLens | Telestration
Gasques et al. [21] | Yes | MS HoloLens, HTC Vive Pro | Telestration, Avatar
Hassan et al. [41] | Yes | N/A | N/A
Giuseppe et al. [42] | Yes | Head-mounted Camera | Voice-based, Gesture-based
Semsar et al. [31] | Yes | N/A | Voice-based, Telestration
Wang et al. [7] | Yes | HTC Vive HMD | Gesture-based, Head Pointer, Telestration
Wang et al. [8] | Yes | HTC Vive HMD | Gesture-based, Head Pointer, Eye Gaze, Telestration
Table 3. Top 7 XR Applications in Healthcare, source: [4].

Top 7 XR Applications in Healthcare | Share of Respondents
Training simulations | 68%
Assisted surgeries | 56%
Studying diseases like cancer in 3D | 53%
Addressing visual disorders | 46%
Pain management | 43%
Emergency navigation | 42%
Assessing and addressing mental health conditions | 40%
Table 4. Top 5 XR Applications in Education, source: [4].

Top 5 XR Applications in Education | Share of Respondents
Immersive teaching experiences | 66%
Soft skills development | 57%
Build interactive 3D models for learning | 55%
Exploratory expeditions | 49%
Recreation/simulation of past experiences for new learners | 40%
Table 5. Top 6 XR Applications in Manufacturing, source: [4].

Top 6 XR Applications in Manufacturing | Share of Respondents
Offering real-time remote assistance | 75%
Workforce training | 48%
Reduction of assembly errors | 45%
Improvement of supply chain management | 45%
Remote inspection and maintenance | 43%
Prevention of accidents | 40%
Table 6. Top 5 XR Applications in Smart Cities, source: [4].

Top 5 XR Applications in Smart Cities | Share of Respondents
Navigation | 66%
Urban planning | 57%
Smart traffic signals and camera monitoring | 55%
Smart parking | 49%
Smart building management systems | 40%
Table 7. User experience issues in SAIT scenarios.

SAIT Scenarios | User Experience Issues
Paramedics | Capturing a mentee’s view in sufficient detail and FoV for identifying patient conditions
Maintenance | Capturing a mentee’s view in sufficient detail and FoV for identifying equipment conditions and problems
Culinary Education & Certification | Providing mentees with an unobtrusive and naturalistic UX
Hardware Engineers | Enhancing and enlarging a mentee’s view in sufficient detail for working with small parts
Foreigner Navigation | Providing accurate localization and culture-awareness to mentors
Table 8. Privacy and security issues in SAIT scenarios.

SAIT Scenarios | Privacy and Security Issues
Paramedics | Keeping a mentee’s view of patients private and secure
Maintenance | Keeping a mentee’s view of confidential and private facilities private and secure
Culinary Education & Certification | Keeping training and certification results private and secure
Hardware Engineers | No significant issues
Foreigner Navigation | No significant issues
Table 9. Seamless connectivity issues in SAIT scenarios.

SAIT Scenarios | Seamless Connectivity Issues
Paramedics | Preparing stable and sufficient network bandwidth for high-quality audio/video (A/V) streaming in outdoor and unprepared environments, for identifying patient conditions
Maintenance | Preparing stable and sufficient network bandwidth for high-quality A/V streaming in outdoor and unprepared environments, for identifying equipment conditions and problems
Culinary Education & Certification | Preparing stable and sufficient network bandwidth for high-quality A/V streaming and recording in indoor environments
Hardware Engineers | No significant issues
Foreigner Navigation | Preparing a stable connection and accurate indoor/outdoor localization
Table 10. Error prevention issues in SAIT scenarios.

SAIT Scenarios | Error Prevention Issues
Paramedics | Presenting real-time audio feedback for mentor-mentee communication
Maintenance | Presenting real-time audio feedback, interactive instruction slides, and an official reference manual for mentor-mentee communication
Culinary Education & Certification | Keeping track of mentees’ mistakes for possible de-briefing afterward
Hardware Engineers | Presenting real-time audio feedback, interactive instruction slides, and an official reference manual for mentor-mentee communication
Foreigner Navigation | Providing navigation information in the mentee’s language
Table 11. Compatibility & interoperability issues in SAIT scenarios.

SAIT Scenarios | Compatibility & Interoperability Issues
Paramedics | Supporting mentees’ smartglasses and mentors’ differently configured computers
Maintenance | Supporting mentees’ smartglasses, mentors’ differently configured computers, and automatic equipment classification by deep learning
Culinary Education & Certification | Supporting applications and services across different education and certification organizations
Hardware Engineers | Supporting virtualized mentor assistance for pre-recorded FAQs and animated instruction videos
Foreigner Navigation | Supporting various modes of navigation such as driving, walking, cycling, and public transportation
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
