Virtual Reality (VR) Simulation and Augmented Reality (AR) Navigation in Orthognathic Surgery: A Case Report

Abstract: VR and AR technologies have gradually developed to the extent that they can assist operators in the surgical field. In this study, we present a case in which VR simulation was used for preoperative planning and AR navigation was applied to orthognathic surgery. The average difference between the preplanned data and the postoperative results was 3.00 mm, with a standard deviation of 1.44 mm. VR simulation can provide great advantages for 3D medical simulation, with accurate manipulation and immersiveness. AR navigation has great potential in medical applications; its advantages include displaying real-time augmented 3D models of patients. Moreover, it is easily applied in the surgical field, without complicated 3D simulations or 3D-printed surgical guides.


Introduction
Orthognathic surgery is a popular surgery performed by oral and maxillofacial surgeons. The role of new dental application technologies is focused on the early detection of pathologic lesions, with noninvasive screening for surgical applications, using three-dimensional (3D) simulation and navigation [1].
Nowadays, most types of orthognathic surgery are computer-assisted (i.e., computer simulation and fabrication of surgical guides for accurate surgery). Computer-assisted surgery includes surgical planning with simulation software and fabrication of customized surgical guides via 3D printing. However, the process of manufacturing surgical guides is time-consuming and involves additional costs. Furthermore, surgical guides cannot respond to various unexpected clinical situations [2]. A virtual reality (VR) medical simulation is generally composed of a virtual patient in a medical 3D model, with a reconstructed pathological lesion, surgical instruments, and tools for a VR interface [3]. Surgical simulation can play an important role in the way we approach surgical training and how we prepare for challenging cases in the operating room [4]. VR simulation has many advantages, such as immersiveness, easy simulation without learning the software, and direct handling of 3D models. These benefits can provide the clinician with fast and easy simulation for preoperative planning. Augmented reality (AR) is a technology that superimposes and displays a virtual three-dimensional model on the operator's field of view. AR can provide the operator with a 3D computer-generated model superimposed onto real objects, all in real time [5]. When used as a navigation system, it can provide immediate guidance by intuitively presenting necessary information to the surgeon, such as anatomical structures in deep areas, or by serving as accurate surgical guidance. As a result, it is expected that the accuracy of the operation will be improved and the operation time will be reduced. It can provide clearer information, improve safety, and lower risks [6]. As image recognition and tracking technologies gradually improve, a real patient can be tracked by combining a camera system with optical markers [7].
This study presents the application of VR planning and an AR-guided system for orthognathic surgery in a patient with a Class III dentofacial deformity.

Case Presentation
This study protocol was approved by the Institutional Ethics Committee (CUDHIRB 2007003). Written informed consent was obtained from the patient.
A 24-year-old male visited the department of oral and maxillofacial surgery for orthognathic surgery. Radiographic optical markers (Marker, Dio Co., Ltd.; Busan, Korea) were attached to the patient's face; a cone beam CT (CBCT) scan was acquired (Hitachi, Marunouchi, Japan), and facial scan data were obtained with a handheld scanner (Artec Space Spider; Artec Group; Luxembourg). Skin and skull 3D models were reconstructed with Mimics software (version 18.0, Materialise, Leuven, Belgium), and the facial scan and oral scan 3D models were imported. With a registration system (Mimics software), the facial scan model was registered to the 3D skin model generated from the CBCT, based on the optical markers (Figure 1).

VR simulation was performed in virtual space with the segmented and registered 3D models, controlled with a hand controller (Vive Pro, HTC, Taipei, Taiwan) while wearing a head-mounted display (HMD). The maxilla position was adjusted for posterior impaction and advancement, and the distal segment of the mandible was then moved to the optimal position, matched with the maxillary position (Figure 2).
In addition, the intermediate and final splints were designed for surgical application and then exported as STL files for 3D printing (Figure 3). The 3D models were implemented in an AR guide system built with Unity 3D software. We developed a four-point tracking algorithm, which was tested on a dental cast to evaluate tracking accuracy; the tracking accuracy was 0.55 mm on average. The four-point auto-tracking algorithm was applied for intraoperative AR display of the pre-planned 3D model with a 4K RGB camera (Figure 4).
To apply the AR navigation system, we first set the optical markers onto the patient's face; CT and face scanning were then carried out. The 3D models were registered to the real patient's face with the four-point tracking algorithm, registered manually during the operation. Finally, the AR navigation system automatically tracked the markers and projected the 3D model onto the actual patient's operating field. A 4K RGB camera acquired video of the actual patient, and a 4K display monitor showed the augmented 3D model in real time (Figure 5).
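The report does not detail the mathematics of the four-point registration, but the standard approach to aligning a preplanned 3D model with tracked marker positions is rigid point-based registration via the SVD-based Kabsch method. The sketch below is a generic illustration of that technique, not the authors' implementation; the function names and the use of NumPy are assumptions.

```python
import numpy as np

def rigid_register(marker_model, marker_patient):
    """Estimate the rigid transform (R, t) mapping four model-space marker
    coordinates onto the corresponding camera-tracked patient-space markers,
    using the SVD-based Kabsch method (hypothetical sketch, not the paper's code)."""
    src = np.asarray(marker_model, dtype=float)    # (4, 3) model coordinates
    dst = np.asarray(marker_patient, dtype=float)  # (4, 3) tracked coordinates
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def apply_transform(points, R, t):
    # Map model-space points (e.g., the planned osteotomy line) into patient space.
    return (np.asarray(points, dtype=float) @ R.T) + t
```

With at least three non-collinear markers this yields a unique rigid transform; a fourth marker, as used here, over-determines the fit and averages out per-marker tracking noise.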
For accurate surgery, a Le Fort I osteotomy was performed according to the pre-planned osteotomy line under AR navigation. After the osteotomy was completed, the intermediate splint was applied, and the maxilla position was then confirmed with AR navigation. After intermaxillary fixation, the maxilla was fixed with miniplates, and the final splint was then applied for setting the mandibular position. Finally, the positions of the repositioned maxilla and mandible were confirmed with AR navigation. Two weeks after the operation, CT and facial scans were taken for postoperative evaluation. The postoperative CT and facial scan data were registered to the preoperative 3D models using Mimics software. For comparison with the preoperative plan, reference points were set: the anterior nasal spine (ANS), posterior nasal spine (PNS), right maxillary first molar mesiobuccal cusp (Right MxM1), left maxillary first molar mesiobuccal cusp (Left MxM1), pogonion (Pog), right mandibular first molar mesiobuccal cusp (Right MnM1), and left mandibular first molar mesiobuccal cusp (Left MnM1). The differences at the reference points were measured as three-dimensional linear distances (Figure 6).
The differences between the preoperative plan and the postoperative 3D model were 1.18 mm at ANS, 4.22 mm at PNS, 2.05 mm at Right MxM1, 2.02 mm at Left MxM1, 2.34 mm at Pog, 4.32 mm at Right MnM1, and 4.88 mm at Left MnM1; the mean difference was 3.00 mm, with a standard deviation of 1.44 mm (Table 1).
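The summary statistics above follow directly from the seven per-landmark distances. A minimal sketch of the calculation (the `landmark_difference` helper is illustrative, showing the 3D linear distance used for each landmark pair; the sample standard deviation reproduces the reported value):

```python
import math

# 3D linear distance between a planned and an achieved landmark position.
def landmark_difference(planned, achieved):
    return math.dist(planned, achieved)  # Euclidean distance in mm

# Reported per-landmark differences (mm) between the preoperative plan
# and the postoperative 3D model (see Table 1).
diffs = [1.18, 4.22, 2.05, 2.02, 2.34, 4.32, 4.88]
#        ANS   PNS   RMxM1 LMxM1 Pog   RMnM1 LMnM1

n = len(diffs)
mean = sum(diffs) / n
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))  # sample SD
print(f"mean = {mean:.2f} mm, SD = {sd:.2f} mm")  # mean = 3.00 mm, SD = 1.44 mm
```

Note that the reported 1.44 mm corresponds to the sample (n − 1) standard deviation; the population formula would give about 1.33 mm.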

Discussion
Digital technology is used to perform operations with accuracy. Since the introduction of computer-assisted surgery, CBCT has been used to create surgical guides that apply to implant placement, tumor resection and reconstruction, and orthognathic surgery performed by oral and maxillofacial surgeons. Recently, orthognathic surgery has been performed as a type of computer-assisted surgery, including preoperative planning by computer simulation and fabrication of surgical guides by 3D printing technology. Previous studies showed a clinically acceptable error in the range of 0.18 to 1.7 mm compared to preoperative planning data and postoperative results [8][9][10][11][12].
Figure 6. The pre-planned 3D model and postoperative 3D model were fused, and the differences were measured at bony landmarks.

VR simulation is an attractive modality because it has been shown to improve operative accuracy, efficiency, and outcomes [13]. Fushima and Kobayashi suggested a mixed reality-based system that synchronizes the movement of dental models in the real world with a 3D mesh model in the virtual world for orthognathic surgery [14]. Wang et al. applied a VR-based simulation system for mandibular angle reduction surgery [15]. In this study, we could easily make a plan via VR simulation. VR simulation showed great potential for surgical planning in terms of convenience, intuitiveness, and immersiveness compared to a 2D monitor display and mouse input interface. AR navigation surgery was introduced along with technological developments in AR tracking accuracy. An AR guide for implant placement showed errors of 0.53, 0.50, 0.46, and 0.48 mm [16]. In another study, freehand placement based on the surgeon's experience showed an error of 1.63 mm, compared with 1.25 mm when an AR guide was used [17]. Marker tracking with AR technology showed an accuracy of 1.03 mm at maximum and 0.71 mm on average, with a standard deviation of 0.27 mm [18]. Studies suggest that AR systems are becoming comparable to traditional navigation techniques in precision, with sufficient safety for routine clinical practice [6].
The deviation in our findings was somewhat higher than in other AR-based navigation surgeries, especially for the mandible position. This deviation is considered a simulation error, in that the preoperative simulation could not reproduce the exact mandibular movement in terms of the condyle position. In addition, since we attempted to introduce AR navigation as a simple device in the surgical process, more accurate results could be obtained if depth cameras and marker trackers were used.
When the AR navigation system was applied during the operation, we experienced a small delay compared to a traditional navigation system. This likely happens because (i) the 4K camera captures the scene and sends it, along with a large amount of data, to the computer; and (ii) the computer must process the data, integrating the 3D models for display on the monitor. This problem can be mitigated by using a higher-quality GPU.
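The two delay sources identified above (capture/transfer and processing/rendering) can be separated empirically by timing each stage of the pipeline per frame. This is a hypothetical profiling harness, not part of the described system; the stage names and callback structure are assumptions for illustration.

```python
import time

def profile_frame(capture, process, display):
    """Time each stage of a capture -> process -> display AR pipeline for one
    frame, to locate the bottleneck (hypothetical sketch)."""
    timings = {}

    t0 = time.perf_counter()
    frame = capture()                     # e.g., grab one 4K frame from the camera
    timings["capture"] = time.perf_counter() - t0

    t0 = time.perf_counter()
    augmented = process(frame)            # e.g., track markers, overlay the 3D model
    timings["process"] = time.perf_counter() - t0

    t0 = time.perf_counter()
    display(augmented)                    # e.g., push the composited frame to the monitor
    timings["display"] = time.perf_counter() - t0

    return timings
```

If the `process` stage dominates, a faster GPU helps, as suggested above; if `capture` dominates, the camera interface or resolution is the limiting factor.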
There are many important structures in the oral and maxillofacial areas, including nerves and blood vessels [19]. To preserve these anatomical structures, it is necessary to improve accuracy in the surgical field. When AR navigation is applied to the face with optical markers, metal objects such as oral and nose piercings can influence the tracking system [20]. Therefore, it is necessary to remove all facial piercings during the operation. AR technology is appropriate for this purpose and well-suited to the currently preferred minimally invasive philosophy in maxillofacial surgery [19]. Applications of AR navigation systems in maxillofacial surgery have been extended to orthognathic surgery, tumor surgery, temporomandibular joint motion analysis, foreign body removal, osteotomy, minimally invasive biopsy, prosthetic surgery, and dental implantation [21]. One of its chief attractions is providing information on deep-tissue structures during the operation, allowing surgery to be less invasive [5]. For these reasons, orthognathic surgery is one of the fields in which AR is most widely applied [19].
Recently, VR and AR have created a synergistic effect in producing excellent therapeutic tools [22]. VR-based surgical simulation is often combined with AR navigation to guide the operation with useful information, such as the patient's anatomy and/or preoperative planning [23]. In the future, VR and AR will develop to implement more realistic, immersive, and interactive simulations [23].
AR in the medical field is quite useful for displaying dynamic navigation, despite some software and hardware limitations [16]. Simulated images are generated faster and more realistically owing to developments in computing power [24]. In the future, AR will likely serve as a navigation system for surgeries, working in symbiosis with surgeons and allowing them to perform accurate operations [6]. Although this is a single case report, the VR simulation and AR navigation systems showed potential for application in orthognathic surgery. This suggests that surgical plans could be made using VR and that more convenient and accurate surgeries could be performed using AR navigation systems. This study has some limitations. First, as a case report, it does not allow statistical analysis. Second, the equipment, such as the tracking camera and computer, consisted of common items in our office; a higher-quality tracking system and computer would provide better results in terms of performance and accuracy.

Conclusions
In this study, preoperative planning for orthognathic surgery was performed in a virtual space. VR simulation provided easy and intuitive handling of the 3D models for adjusting the maxilla and mandible. Orthognathic surgery was performed with an AR navigation system, which provided the operator with the preoperatively planned 3D model superimposed onto the actual surgical site. The differences between the preoperative plan and the postoperative 3D model were 1.18 mm at ANS, 4.22 mm at PNS, 2.05 mm at Right MxM1, 2.02 mm at Left MxM1, 2.34 mm at Pog, 4.32 mm at Right MnM1, and 4.88 mm at Left MnM1; the mean was 3.00 mm and the standard deviation was 1.44 mm. The deviation in our findings was somewhat high. This difference is considered a depth problem; more accurate results can be obtained when depth cameras and optical markers are used. In the future, we plan to apply haptic-enabled VR simulation, which should provide great advantages for 3D medical simulation, with accurate manipulation and immersiveness. AR navigation has great potential for medical applications; its advantages include displaying a real-time augmented 3D model of the patient, and it is easily applied in the surgical field, without complicated 3D simulations or 3D-printed surgical guides.