Article

Practical Application of Augmented/Mixed Reality Technologies in Surgery of Abdominal Cancer Patients

by Vladimir M. Ivanov 1,*, Anton M. Krivtsov 1, Sergey V. Strelkov 1, Anton Yu. Smirnov 1, Roman Yu. Shipov 1, Vladimir G. Grebenkov 2, Valery N. Rumyantsev 2, Igor S. Gheleznyak 3, Dmitry A. Surov 2, Michail S. Korzhuk 2,4 and Valery S. Koskin 5
1 Higher School of Theoretical Mechanics and Mathematical Physics, Peter the Great Saint Petersburg Polytechnic University, 195251 St. Petersburg, Russia
2 Department & Clinic of Naval Surgery, Military Medical Academy Named after S. M. Kirov, Academic Lebedev Street 6, 194044 St. Petersburg, Russia
3 Department & Clinic of Roentgenology & Radiology, Military Medical Academy Named after S. M. Kirov, Academic Lebedev Street 6, 194044 St. Petersburg, Russia
4 Department of General Surgery, Omsk State Medical University, ul. Lenina, 12, 644099 Omsk, Russia
5 Department & Clinic of Military Field Surgery, Military Medical Academy Named after S. M. Kirov, Academic Lebedev Street 6, 194044 St. Petersburg, Russia
* Author to whom correspondence should be addressed.
J. Imaging 2022, 8(7), 183; https://doi.org/10.3390/jimaging8070183
Submission received: 11 April 2022 / Revised: 19 June 2022 / Accepted: 22 June 2022 / Published: 30 June 2022
(This article belongs to the Topic Augmented and Mixed Reality)

Abstract

The technology of augmented and mixed reality (AR/MR) is useful in various areas of modern surgery. We considered the use of augmented and mixed reality technologies as a method of preoperative planning and intraoperative navigation in abdominal cancer patients. Practical use of AR/MR raises a range of questions that demand suitable solutions. The difficulties and obstacles we encountered in the practical use of AR/MR are presented, along with the ways we chose to overcome them. The most demonstrative case is covered in detail. The three-dimensional anatomical model obtained from the CT scan needed to be rigidly attached to the patient’s body, and therefore an invasive approach was developed, using an orthopedic pin fixed to the pelvic bones. The pin is used both as an X-ray contrast marker and as a marker for augmented reality. This solution made it possible not only to visualize the anatomical structures of the patient and the border zone of the tumor, but also to change the position of the patient during the operation. In addition, a noninvasive (skin-based) marking method was developed that allows the application of mixed and augmented reality during the operation. Both techniques were used (8 clinical cases) for preoperative planning and intraoperative navigation, which allowed surgeons to verify the radicality of the operation, to have visual control of all anatomical structures near the zone of interest, and to reduce the time of surgical intervention, thereby reducing the complication rate and improving the rehabilitation period.

1. Introduction

Abdominal cancer has a high incidence, and many patients subsequently have indications for surgical treatment. Among the most important considerations are the zone of the primary tumor, its volume, and the direction and depth of invasion. Moreover, the radicality of the performed surgical intervention affects the risk of recurrence and the need for reoperation [1,2]. All this determines the need for accurate preoperative and intraoperative visualization of both the tumor itself and the anatomical structures surrounding it, taking into account their topography and the individual characteristics of the patient, especially if there was a previous surgical intervention in this anatomical zone [3,4].
As such, the issue of the development and implementation of modern methods of preoperative and intraoperative tumor and patient anatomy imaging, in order to select an effective surgical approach, remains open and requires a comprehensive discussion.
The technology of augmented and mixed reality (AR/MR) has been used in medicine for quite a long time. The current stage of medical use of AR/MR is characterized by an irreversible transition from an educational and training tool [5,6,7,8] to a surgical instrument used before and during surgery [9,10,11,12,13] or other medical procedures [14]. This transition is not uniform across different areas of surgery [6,9,10,11,12,13,15]. In addition, researchers’ assessments of the role and place of AR/MR at various stages of surgical treatment vary [14,16,17,18]. Moreover, the range of subgroups within the general direction of augmented reality in surgery is growing: transmission of optical images from the video cameras of endovideosurgical or robotic devices to users’ virtual reality devices instead of traditional monitors [19]; intraoperative transmission of introscopic examination images to user devices [6]; and, finally, broadcasting to augmented reality devices an image of a three-dimensional patient model, including a dynamic one, created from introscopic data and spatially combined with the optical image of the surgical field [20,21]. With this last option, the surgeon has visual control of the internal anatomical structures located in the depths of the tissues. Returning to the “narrow” areas of surgery, it should be noted that publications on the application of AR/MR in cardiac surgery, urology, neurosurgery, and maxillofacial surgery are quite widely reported [10,12,16,17,22]. Publications on the use of AR/MR in abdominal surgery, especially in clinical cases with oncology, are rare [19,23,24,25].
However, the need for digital support for surgery is confirmed by the active discussion of stereotaxic surgery methods, including those in hybrid operating rooms that allow an intraoperative CT scan to be performed [26,27,28]. Thus, the opportunities that AR/MR provides, both at the stage of preoperative planning and especially in intraoperative navigation, are, in our opinion, in great demand in the surgery of primary and recurrent cancer patients, and the study of the results of applying these techniques is especially relevant.
The primary subject of our work is increasing the effectiveness of the surgical treatment of patients suffering from recurrent malignancies of the abdomen and pelvis by means of AR/MR.
The secondary subjects are determining the problems of the perioperative application of AR/MR in previously operated patients, creating an algorithm for the application of AR, determining the role of a multidisciplinary team in surgical AR/MR applications, using invasive or non-invasive 3D model positioning, using AR/MR during surgical procedures, measuring the deviation of a 3D model in the operative field, estimating the results of the practical use of AR/MR in difficult surgical procedures, and sharing our experience for subsequent discussion.
The subsequent parts of this paper are organized as follows: Section 2 includes ethics-related statements, descriptions of the research methods, the clinical and technical aspects of the research material, and our algorithm for the practical application of AR/MR. Section 3 describes the actions of our team regarding surgical AR/MR application. Section 3.1 and Section 3.2 contain descriptions of clinical cases, highlighting the difficulties and obstacles we encountered in the practical use of AR/MR. Section 3.3 contains data concerning the accuracy of AR/MR in this series. Section 4 summarizes the paper as a whole and details its novel contributions.

2. Materials and Methods

In this publication, we use surgical and descriptive methods. We used only medical materials, equipment, procedures, and drugs approved in the Russian Federation in the examination and treatment of all our patients. The data for the clinical cases included in the description are: diagnosis, main clinical findings, details of AR implementation in each case, the volume and duration of surgical procedures, our impressions, and other details. We used no statistical tests, because our patients differed widely, and statistical analysis was not our goal at this stage.

Materials

The clinical part of this study’s material consists of a group of 8 patients. All patients had previous surgery. Seven were suffering from recurrent malignancies with various tumors in the abdominal and pelvic regions.
The technical part includes a hardware and software complex for augmented reality, comprising a personal computer running Windows 10, Microsoft HoloLens 2 mixed reality glasses, a positioning system with markers, and additional software. We used the open source 3D Slicer 4.13 software as the primary tool for DICOM data analysis and segmentation of the patients’ anatomy. In addition, we developed two custom software applications. The first was designed for the PC platform, with the purpose of defining how 3D holograms were aligned with the patient using radiopaque markers and uploading the segmented 3D models to the glasses over Wi-Fi. The second program was developed for the HoloLens glasses, in order to visualize these data and superimpose the 3D model using the built-in camera, which tracks the marker position and orientation. We used Unity 2009.10 as a cross-platform development solution, which allowed us to design both applications, for the PC and HoloLens platforms.
The algorithm that we developed (Scheme 1) begins with the selection of the patient and providing a preoperative CT scan of the zone of interest.
The main concept is the alignment of the three-dimensional model of the patient’s anatomy with the patient’s position in the operating room. The 3D model is built by segmenting the CT data in 3D Slicer. Augmented reality glasses are used to visualize the 3D model of the patient’s anatomy. The image in the glasses is formed by special software that loads the patient’s data into the glasses and links and aligns it with the three-dimensional model using an optical label (marker) [21]. A distinctive feature of the software is an interface for the doctor, with the ability to select and disable images of anatomical structures at different stages of the operation. Control is provided by hand gestures, which allows the user to configure the visualization of the model while maintaining the sterility of the surgeon’s hands.
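For illustration only, the following minimal Python sketch shows the general idea behind this step: thresholding a CT volume and extracting a surface mesh. It is not the pipeline used in this study (segmentation was performed interactively in 3D Slicer), and the directory name, the Hounsfield-unit threshold, and the helper names are hypothetical.

```python
# Illustrative sketch only: segmentation in this study was performed interactively
# in 3D Slicer. This shows the same general idea with generic tools: read a CT
# series, convert to Hounsfield units, threshold, and extract a surface mesh.
# Paths, the threshold value, and function names are hypothetical.
from pathlib import Path

import numpy as np
import pydicom
from skimage import measure


def load_ct_volume(dicom_dir: str):
    """Read a DICOM series into a 3D array of Hounsfield units plus voxel spacing (mm)."""
    slices = [pydicom.dcmread(p) for p in sorted(Path(dicom_dir).glob("*.dcm"))]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array for s in slices]).astype(np.float32)
    volume = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)
    dz = abs(float(slices[1].ImagePositionPatient[2]) - float(slices[0].ImagePositionPatient[2]))
    dy, dx = (float(v) for v in slices[0].PixelSpacing)
    return volume, (dz, dy, dx)


def extract_surface(volume: np.ndarray, spacing, hu_threshold: float = 150.0):
    """Threshold the volume at a given HU level and run marching cubes -> mesh in mm."""
    verts, faces, _normals, _values = measure.marching_cubes(volume, level=hu_threshold, spacing=spacing)
    return verts, faces


# Example usage (directory name is a placeholder):
# volume, spacing = load_ct_volume("ct_series/")
# verts, faces = extract_surface(volume, spacing, hu_threshold=150.0)
```

In practice, interactive organ-by-organ segmentation in 3D Slicer replaces the simple global threshold shown here.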
Two approaches were used to link the model to the patient using an optical marker: the first is invasive, using an orthopedic pin; the second is non-invasive, using adhesive magnets that act as a fixing support for the label (marker).

3. Results

Using the method of augmented and mixed reality, we performed surgical interventions (8 clinical cases) in patients with cancer of the abdominal and pelvic regions (7) and with postoperative complications. Each clinical case was analyzed by a multidisciplinary team; the result was the construction and application of a situation-specific three-dimensional model. The team included a surgeon, a radiology specialist, an engineer, and related specialists (vascular surgeon, neurosurgeon, etc.). For preoperative planning and construction of the 3D model of each patient, we designed the algorithm presented in Scheme 2. This is one of the key steps, where segmentation of anatomical structures is performed using stage-by-stage analysis of the CT data.
We present only one of our clinical cases, the most demonstrative, in detail. The other cases were characterized by similar clinical features and are presented in brief, emphasizing the development process and the individual difficulties and solutions.

3.1. Clinical Experience in Cases with Invasive Markers

Patient T., 61 years old. Complaints at admission: constant pain in the perineal region, frequent uncontrolled urination in small quantities, and a feeling of an overflowing bladder. From 2018, the patient had undergone surgery, irradiation, and chemotherapy for rectal cancer.
Diagnostic results:
According to the CT of the chest, abdomen, and pelvis, a recurrence of a tumor (50 × 71 × 71 mm) was detected with signs of spread to the lower parts of the ureters, to the wall of the bladder, to the coccygeal vertebrae, as well as signs of invasion into small branches of the internal iliac arteries and the internal anterior iliac veins. There were no signs of visceral metastases.
According to an MRI (magnetic resonance imaging) of the small pelvis, a tumor with an approximate size of 90 × 62 × 95 mm was discovered, with signs of invasion into the prostate gland, seminal vesicles, bladder, distal ureters, presacral tissue, and coccyx, with partial destruction of the vertebrae; a bilateral hydroureter was also found (Figure 1).
Taking into account the clinical symptoms and research data, indications were formulated for performing infralevator pelvic evisceration with distal resection of the coccyx and sacrum at the S5 level, using augmented reality technology.
Our previously proposed method [21] of attaching a marker to a frame fixed on the skull turned out to be unsuitable for this case, due to the remoteness of the anatomical zones. We designed and used a threaded pin with a seat for installing an augmented reality marker, which was implanted into the right anterior superior iliac spine. Titanium was chosen as the material for the pin. The pin was implanted on the day before the operation, after which a follow-up CT scan was performed. At the stage of the preoperative CT, from which the 3D model was built, the pin was used as a radiopaque marker for the CT study.
The pin itself has a hexagonal base, with a threaded hole for fixing a removable marker, which allows one to install a sterilizable marker during the operation, as well as to use markers of various configurations, depending on the patient position on the operating table and the surgical access (Figure 2).
Based on the CT data of the abdomen and small pelvis, a virtual model of the small pelvic organs was built for additional visualization of the location and invasion of the tumor into the surrounding tissues and organs (Figure 3).
To visualize the 3D model of the patient’s anatomy in augmented reality glasses during the operation, we developed special software that allows loading the patient’s data into the glasses and linking it to the three-dimensional model with the help of an optical marker.
The developed software takes as input data a set of DICOM slices obtained with a computed tomography scanner and preinstalled radiopaque markers. The software allows segmenting the area of interest manually and building a 3D model based on it. In addition, it is possible to load ready-made models built with other software products (e.g., 3D Slicer).
For the correct visualization of the display in the glasses, it is necessary to determine the position of the areas of interest (organs, vessels, tumor) relative to the tracking marker. The program calibrates the position and orientation of the marker using three X-ray contrast marks, which should be clearly visible. Calibration can be done automatically, according to the known configuration of the marker; if the automatic mode cannot be used, a manual version is provided.
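Conceptually, this calibration amounts to estimating the rigid transform between the marker frame and the CT/model frame from a small set of corresponding points. The sketch below is not the implementation running in the PC application; it only illustrates the standard least-squares (Kabsch/SVD) solution to this kind of problem, with all function names and coordinate values invented for the example.

```python
# Minimal sketch of rigid point-set registration (Kabsch/SVD), shown here only as
# one standard way such a marker-to-model calibration could be computed; this is
# not the code running in the PC application, and all coordinates are invented.
import numpy as np


def rigid_register(points_model: np.ndarray, points_marker: np.ndarray):
    """Return R (3x3) and t (3,) such that R @ p_model + t approximates p_marker."""
    c_model = points_model.mean(axis=0)
    c_marker = points_marker.mean(axis=0)
    H = (points_model - c_model).T @ (points_marker - c_marker)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against an accidental reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_marker - R @ c_model
    return R, t


# Three radiopaque marks in model (CT) coordinates and in marker coordinates, in mm
# (values invented; here the marker frame is simply translated by [5, 2, 1]):
model_marks = np.array([[0.0, 0.0, 0.0], [40.0, 0.0, 0.0], [0.0, 30.0, 0.0]])
marker_marks = model_marks + np.array([5.0, 2.0, 1.0])
R, t = rigid_register(model_marks, marker_marks)
residual = np.linalg.norm(model_marks @ R.T + t - marker_marks, axis=1).max()  # ~0 mm
```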
The described application running on a computer works in tandem with an application running in the glasses. After the calibration is completed, the data are prepared and transferred to the glasses for the visualization. In this case, the glasses and the computer should be connected to the same network. The application for the glasses determines the position of the marker in real space and aligns the position of the hologram according to the position obtained during the calibration.
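The actual PC and HoloLens applications were built in Unity; purely as an illustration of the kind of data exchange involved, the sketch below sends a serialized mesh together with the calibration transform to a headset over the shared network. The host address, port, and payload layout are hypothetical.

```python
# Illustration only: the PC and HoloLens applications in this study were built in
# Unity. This sketch just shows the general kind of transfer involved: a serialized
# mesh plus the calibration transform sent to the headset over the shared network.
# The host, port, and payload layout are hypothetical.
import json
import socket
import struct


def send_model(host: str, port: int, mesh_bytes: bytes, R, t) -> None:
    """Send a length-prefixed JSON header (calibration data) followed by raw mesh bytes."""
    header = json.dumps({
        "rotation": [[float(v) for v in row] for row in R],  # 3x3 rotation matrix
        "translation": [float(v) for v in t],                # 3-vector, millimetres
        "mesh_size": len(mesh_bytes),
    }).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("!I", len(header)))  # 4-byte big-endian header length
        sock.sendall(header)
        sock.sendall(mesh_bytes)


# Example usage (address and file name are placeholders):
# send_model("192.168.1.50", 9050, open("pelvis_model.glb", "rb").read(), R, t)
```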
In this clinical case, taking into account the two stages of the operation (the patient supine, then turned over to the prone “jackknife” position), two separate markers were made by 3D printing, one for each of the two positions (Figure 4). Before turning the patient, the “abdominal” marker was removed from the pin and a “perineosacral” marker was put in its place.
The augmented reality technique made it possible to clarify the border zone of the sacral resection, determine the points of tumor fixation, perform a full washout of the pelvic cavity, and perform final hemostasis (Figure 5).
After completing the abdominal stage of the surgical intervention, the “abdominal” marker was exchanged for the “perineosacral” one, and the patient was turned to the prone “jackknife” position. A staged refinement of the topographic and anatomical features of the pelvic tumor was carried out using the 3D augmented reality model, and the level of sacrum transection was corrected, followed by CT control (Figure 6).
As a result of the stage-by-stage mobilization of the tumor and transection of the sacrum, the tumor was removed as a single block (Figure 7). The postoperative period proceeded without complications.
In total, three patients were operated on with the help of this technique of marker fixation (Table 1); an example of a preoperative projection of the patient’s anatomy and its intraoperative usage overlain on the surgical field is shown in Figure 8.

3.2. Clinical Experience in Cases with Noninvasive (Skin-Based) Markers

Taking into account the disadvantages and features of the invasive attachment method, a noninvasive marking method based on magnets was developed; the magnets are attached to the skin and act as a fixation point for the marker and the radiopaque marks (Figure 9).
This solution is based on neodymium magnets. Marker installation has several stages. First, the magnets are preinstalled on the guide frame; the frame is then pressed to the skin together with the magnets. Owing to their adhesive surface, after the frame is removed the magnets remain fixed on the skin at the predetermined positions. However, the strength of the adhesive layer of the magnets is not sufficient for long-term fixation, so additional fixation with an adhesive film should be used. We used a 10 × 10 cm Suprasorb film, which covered the area of the magnets with a margin and allowed them to remain fixed for a long time (up to 1 week). Once the magnets are fixed, the patient undergoes a CT scan, in which the magnets are used as radiopaque marks for the subsequent co-registration of the 3D model with the marker. The final step is to install a marker on these magnets for the operation. In this case, the marker is presterilized; the marker installation site, covered with a moisture-resistant film, can also be sterilized.
This approach has a number of significant advantages. First of all, it is noninvasive and allows a marker to be installed in any anatomical zone. Owing to the sufficiently strong fixation of the magnets, the installation site can be covered with sterile material and the marker placed on top (Figure 10).
In addition, during the operation, the label can be removed and installed only when it becomes necessary to visualize the anatomical model in the augmented reality glasses. On the other hand, the disadvantage of this solution may be lower accuracy, due to fixation to the movable surface of the skin. Magnetic (on-skin) fixation of marks was used in the patients listed in Table 2.
In total, five patients were operated on with magnetic (on-skin) fixation of marks (Table 2); an example of a preoperative projection of the patient’s anatomy and its intraoperative usage overlain on the surgical field is shown in Figure 11.

3.3. Hologram Positioning Accuracy

To assess the positioning accuracy of holograms in augmented reality mode, a stand was designed consisting of three mutually perpendicular planes, each with millimeter markings. The size of the working area of the stand is 400 × 400 mm. The base itself was assembled on a metal frame with the ability to adjust the offset and inclination of each plane. Calibration and correction of the position of the planes were done using measuring tools: a square and a ruler of a high accuracy class.
The algorithm for using the stand is based on setting a marker at the base of the stand coordinate system and visualizing 1-mm spheres on the stand planes in augmented reality. After the mixed reality glasses have recognized the marker, the observer marks the actual location of the sphere on the stand layout. The evaluation takes place in each plane of the stand in different positions of the observer relative to the marker, after which the displacement of the marked points is compared relative to the given coordinates. As a result, the average deviation within a radius of 250 mm from the marker was 2–3 mm (Figure 12).
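As an illustration of how such a figure can be obtained (this is not the evaluation code used with the stand), the deviation can be computed as the mean Euclidean distance between the nominal sphere coordinates and the positions marked by the observer, restricted to points within 250 mm of the marker. All values in the sketch below are invented.

```python
# A minimal sketch (not the evaluation code used with the stand) of how the reported
# figure can be obtained: mean Euclidean deviation between nominal sphere positions
# and the positions marked by the observer, within 250 mm of the marker. Values invented.
import numpy as np


def mean_deviation(nominal_mm: np.ndarray, observed_mm: np.ndarray, max_radius_mm: float = 250.0) -> float:
    """Mean deviation of observed points whose nominal position lies within a radius of the marker (origin)."""
    within = np.linalg.norm(nominal_mm, axis=1) <= max_radius_mm
    errors = np.linalg.norm(observed_mm[within] - nominal_mm[within], axis=1)
    return float(errors.mean())


# Hypothetical measurements (mm), with the marker at the origin:
nominal = np.array([[100.0, 0.0, 0.0], [0.0, 200.0, 0.0], [150.0, 150.0, 0.0]])
observed = nominal + np.array([[2.0, 1.0, 0.0], [-1.5, 2.0, 0.5], [0.0, 2.5, -1.0]])
print(round(mean_deviation(nominal, observed), 1))  # ~2.5 mm in this made-up example
```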
It is also worth considering the error of matching the marker to the patient. In the first case, using an implanted pin, this error was 1–4 mm; thus, the average overall accuracy was 3–5 mm.
In the case of using a marker that is attached to magnets, the value of the matching error was approximately the same; however, in the case of skin displacement as a result of the use of wound dilators, the total error could reach up to 5 mm. If the marker was placed on the chest, the overall accuracy was up to 10 mm, the main reason being the displacement of the chest during breathing.

4. Discussion

The surgical interventions performed in the treatment of recurrent cancer of the abdomen and pelvis, and after other previous surgeries, are technically quite complex from a surgical perspective. They are associated with a high risk of unwanted damage to anatomical structures and with frequent massive intraoperative hemorrhage and blood loss. In addition to the high skill of the operating surgeon, they require thorough, comprehensive preoperative planning and intraoperative visual support for verification of the topographic and anatomical relationships in the area of the surgical intervention. Moreover, many other questions can emerge depending on the individual situation.
Difficulties that are associated with the individual characteristics of the patient can be successfully overcome with the help of forecasting and taking appropriate measures. Obviously, the labor costs of highly qualified personnel in the preparation and provision of an operation with this method are very significant. However, it seems to us that they are absolutely justified, since the main goal is achieved: increasing the radicality, while reducing the surgical trauma, which directly increases the effectiveness of the treatment.
In comparison to other solutions [10,12,16,17,22], the developed methods with the help of invasive and non-invasive markers provide flexibility for positioning 3D models of the anatomy on almost any body part. However, this approach has some limitations regarding the overall accuracy of hologram positioning, specifically using non-invasive magnetic markers. On top of that, the current solution does not allow the tracking of surgical tools, as the inbuilt camera struggles to track multiple markers at once and cannot maintain the same performance. These issues limit the technology mostly to use in preoperative and intraoperative planning, and do not allow it to be used as a navigation tool.
A number of recent publications have concentrated on selected features of AR/MR, such as geometric accuracy [29,30]. In our opinion, geometric accuracy is a very important feature of any diagnostic and curative method, but not the only one. Noise, occlusions, and other obstacles can also reduce the surgical benefit of AR/MR [31]. We have tried to share our experience of detecting and overcoming the difficulties of the practical use of AR/MR in surgery.
A short list of the proposed innovations:
  • An algorithm for the application of augmented reality in surgery;
  • An algorithm for creating a three-dimensional model of a surgical procedure by a multidisciplinary team;
  • Bone fixation of a marker;
  • A two- (or more) marker design for position changes;
  • Magnetic fixation of a marker;
  • A gesture-controlled menu.

5. Conclusions

The presented clinical cases and the algorithm for using augmented and mixed reality technology confirm that this approach improves the accuracy of preoperative planning and helps in determining the extent of tumor spread and invasion into the surrounding anatomical structures, which increases the radicality and the safety of the performed surgical procedure.
In future studies, in order to increase the total positioning accuracy, we are planning to use an external optical tracking system with multiple infrared cameras. This would help us to track multiple markers at high frequency and to use a pointer or to track specific surgical tools during the procedure. In addition, we are continuing to develop the non-invasive approach to marker attachment and to combine it with the new optical tracking system, in which the patient’s registration and calibration will happen in the operating room using anatomical landmarks. Recalibration in the middle of the procedure can help to alleviate errors arising from the use of wound dilators and other factors.

Author Contributions

Conceptualization, V.M.I., S.V.S., D.A.S. and V.S.K.; methodology, V.M.I., S.V.S., A.Y.S., R.Y.S., V.G.G. and I.S.G.; formal analysis, S.V.S., A.Y.S., R.Y.S., V.N.R. and A.M.K.; resources, V.M.I., D.A.S., I.S.G., A.Y.S., R.Y.S. and V.S.K.; data curation, S.V.S., A.M.K., V.N.R. and M.S.K.; writing—original draft preparation, S.V.S., V.N.R., V.G.G., M.S.K. and V.S.K.; writing—review and editing, S.V.S., I.S.G., D.A.S. and V.G.G.; supervision, V.M.I.; project administration, V.M.I., S.V.S., D.A.S. and V.S.K. All authors have read and agreed to the published version of the manuscript.

Funding

The research was funded by the Ministry of Science and Higher Education of the Russian Federation under the strategic academic leadership program ‘Priority 2030’ (Agreement 075-15-2021-1333 dated 30 September 2021).

Institutional Review Board Statement

All experimental methods were performed in accordance with the relevant international and national guidelines and regulations. All medical practices followed the Declaration of Helsinki on medical research involving human subjects. The study was approved by the Local Ethics Committee at the Military Medical Academy named after S.M. Kirov, protocol No. 247 from 26 January 2021. All patients gave informed consent to participate in the study and to publish the data and images in an online open-access publication.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data reported in this study are available upon request from the corresponding author.

Acknowledgments

We would like to thank the company Medgital for providing technical equipment for the clinical stage of the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kontovounisios, C.; Tekkis, P. Locally Advanced Disease and Pelvic Exenterations. Clin. Colon Rectal Surg. 2017, 30, 404–414. [Google Scholar] [CrossRef] [PubMed]
  2. Caprino, P.; Sacchetti, F.; Tagliaferri, L.; Gambacorta, M.A.; Potenza, A.E.; Pastena, D.; Sofo, L. Use of electrochemotherapy in a combined surgical treatment of local recurrence of rectal cancer. J. Surg. Case Rep. 2021, 2021, rjab403. [Google Scholar] [CrossRef] [PubMed]
  3. Jimenez-Rodriguez, R.M.; Yuval, J.B.; Sauve, C.-E.G.; Wasserman, I.; Aggarwal, P.; Romesser, P.B.; Crane, C.H.; Yaeger, R.; Cercek, A.; Guillem, J.G.; et al. Type of recurrence is associated with disease-free survival after salvage surgery for locally recurrent rectal cancer. Int. J. Color. Dis. 2021, 36, 2603–2611. [Google Scholar] [CrossRef]
  4. Rokan, Z.; Simillis, C.; Kontovounisios, C.; Moran, B.J.; Tekkis, P.; Brown, G. Systematic review of classification systems for locally recurrent rectal cancer. BJS Open 2021, 5, zrab024. [Google Scholar] [CrossRef] [PubMed]
  5. Alzouebi, I.A.; Saad, S.; Farmer, T.; Green, S. Is the use of augmented reality-assisted surgery beneficial in urological education? A systematic review. Curr. Urol. 2021, 15, 148–152. [Google Scholar] [CrossRef] [PubMed]
  6. Cartucho, J.; Shapira, D.; Ashrafian, H.; Giannarou, S. Multimodal mixed reality visualisation for intraoperative surgical guidance. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 819–826. [Google Scholar] [CrossRef]
  7. Hakky, T.S.; Dickey, R.M.; Srikishen, N.; Lipshultz, L.I.; Spiess, P.E.; Carrion, R.E. Augmented reality assisted surgery: A urologic training tool. Asian J. Androl. 2016, 18, 732–734. [Google Scholar] [CrossRef]
  8. Tang, K.S.; Cheng, D.L.; Mi, E.; Greenberg, P.B. Augmented reality in medical education: A systematic review. Can. Med. Educ. J. 2019, 11, e81–e96. [Google Scholar] [CrossRef]
  9. Wake, N.; Rosenkrantz, A.B.; Huang, W.C.; Wysock, J.S.; Taneja, S.S.; Sodickson, D.K.; Chandarana, H. A workflow to generate patient-specific three-dimensional augmented reality models from medical imaging data and example applications in urologic oncology. 3D Print. Med. 2021, 7, 34. [Google Scholar] [CrossRef]
  10. Coelho, G.; Rabelo, N.N.; Vieira, E.; Mendes, K.; Zagatto, G.; de Oliveira, R.S.; Raposo-Amaral, C.E.; Yoshida, M.; de Souza, M.R.; Fagundes, C.F.; et al. Augmented reality and physical hybrid model simulation for preoperative planning of metopic craniosynostosis surgery. Neurosurg. Focus 2020, 48, E19. [Google Scholar] [CrossRef] [Green Version]
  11. Sparwasser, P.M.; Schoeb, D.; Miernik, A.; Borgmann, H. Augmented Reality und Virtual Reality im Operationssaal—Status Quo und Quo vadis. Aktuelle Urol. 2018, 49, 500–508. [Google Scholar] [CrossRef] [PubMed]
  12. Alsofy, S.Z.; Nakamura, M.; Suleiman, A.; Sakellaropoulou, I.; Saravia, H.W.; Shalamberidze, D.; Salma, A.; Stroop, R. Cerebral Anatomy Detection and Surgical Planning in Patients with Anterior Skull Base Meningiomas Using a Virtual Reality Technique. J. Clin. Med. 2021, 10, 681. [Google Scholar] [CrossRef] [PubMed]
  13. Thomas, D.J. Augmented reality in surgery: The Computer-Aided Medicine revolution. Int. J. Surg. 2016, 36, 25. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Leuze, C.; Zoellner, A.; Schmidt, A.R.; Cushing, R.E.; Fischer, M.J.; Joltes, K.; Zientara, G.P. Augmented reality visualization tool for the future of tactical combat casualty care. J. Trauma Acute Care Surg. 2021, 91, S40–S45. [Google Scholar] [CrossRef]
  15. Eckert, M.; Volmerg, J.S.; Friedrich, C.M. Augmented Reality in Medicine: Systematic and Bibliographic Review. JMIR mHealth uHealth 2019, 7, e10967. [Google Scholar] [CrossRef]
  16. Mikhail, M.; Mithani, K.; Ibrahim, G.M. Presurgical and Intraoperative Augmented Reality in Neuro-Oncologic Surgery: Clinical Experiences and Limitations. World Neurosurg. 2019, 128, 268–276. [Google Scholar] [CrossRef]
  17. Bartella, A.; Kamal, M.; Scholl, I.; Schiffer, S.; Steegmann, J.; Ketelsen, D.; Hölzle, F.; Lethaus, B. Virtual reality in preoperative imaging in maxillofacial surgery: Implementation of “the next level”? Br. J. Oral Maxillofac. Surg. 2019, 57, 644–648. [Google Scholar] [CrossRef]
  18. Chu, M.W.; Moore, J.; Peters, T.; Bainbridge, D.; Mccarty, D.; Guiraudon, G.M.; Wedlake, C.; Lang, P.; Rajchl, M.; Currie, M.E.; et al. Augmented Reality Image Guidance Improves Navigation for Beating Heart Mitral Valve Repair. Innov. Technol. Tech. Cardiothorac. Vasc. Surg. 2012, 7, 274–281. [Google Scholar] [CrossRef]
  19. Huber, T.; Hadzijusufovic, E.; Hansen, C.; Paschold, M.; Lang, H.; Kneist, W. Head-Mounted Mixed-Reality Technology During Robotic-Assisted Transanal Total Mesorectal Excision. Dis. Colon Rectum 2019, 62, 258–261. [Google Scholar] [CrossRef]
  20. Devernay, F.; Mourgues, F.; Coste-Maniere, E. Towards endoscopic augmented reality for robotically assisted minimally invasive cardiac surgery. In Proceedings of the International Workshop on Medical Imaging and Augmented Reality, Hong Kong, China, 10–12 June 2001; pp. 16–20. [Google Scholar]
  21. Ivanov, V.M.; Krivtsov, A.M.; Strelkov, S.V.; Kalakutskiy, N.V.; Yaremenko, A.I.; Petropavlovskaya, M.Y.; Portnova, M.N.; Lukina, O.V.; Litvinov, A.P. Intraoperative Use of Mixed Reality Technology in Median Neck and Branchial Cyst Excision. Future Internet 2021, 13, 214. [Google Scholar] [CrossRef]
  22. Sadeghi, A.H.; el Mathari, S.; Abjigitova, D.; Maat, A.P.M.; Taverne, Y.J.J.; Bogers, A.J.C.; Mahtab, E.A. Current and Future Applications of Virtual, Augmented, and Mixed Reality in Cardiothoracic Surgery. Ann. Thorac. Surg. 2020, 113, 681–691. [Google Scholar] [CrossRef] [PubMed]
  23. Guerriero, L.; Quero, G.; Diana, M.; Soler, L.; Agnus, V.; Marescaux, J.; Corcione, F. Virtual Reality Exploration and Planning for Precision Colorectal Surgery. Dis. Colon Rectum 2018, 61, 719–723. [Google Scholar] [CrossRef] [PubMed]
  24. Pérez-Serrano, N.; Fernando Trebolle, J.; Sánchez Margallo, F.M.; Blanco Ramos, J.R.; García Tejero, A.; Aguas Blasco, S. Digital 3-Dimensional Virtual Models in Colorectal Cancer and Its Application in Surgical Practice. Surg. Innov. 2019, 27, 246–247. [Google Scholar] [CrossRef] [PubMed]
  25. Kontovounisios, C.; Tekkis, P.P.; Bello, F. 3D imaging and printing in pelvic colorectal cancer: ‘The New Kid on the Block’. Tech. Coloproctol. 2018, 23, 171–173. [Google Scholar] [CrossRef] [Green Version]
  26. Atallah, S.; Nassif, G.; Larach, S. Stereotactic navigation for TAMIS-TME: Opening the gateway to frameless, image-guided abdominal and pelvic surgery. Surg. Endosc. 2014, 29, 207–211. [Google Scholar] [CrossRef]
  27. Kwak, J.-M.; Romagnolo, L.; Wijsmuller, A.; Gonzalez, C.; Agnus, V.; Lucchesi, F.R.; Melani, A.G.F.; Marescaux, J.; Dallemagne, B. Stereotactic Pelvic Navigation With Augmented Reality for Transanal Total Mesorectal Excision. Dis. Colon Rectum 2019, 62, 123–129. [Google Scholar] [CrossRef] [PubMed]
  28. Kawada, K.; Hasegawa, S.; Okada, T.; Hida, K.; Okamoto, T.; Sakai, Y. Stereotactic navigation during laparoscopic surgery for locally recurrent rectal cancer. Tech. Coloproctol. 2017, 21, 977–978. [Google Scholar] [CrossRef]
  29. Pokhrel, S.; Alsadoon, A.; Prasad, P.W.C.; Paul, M. A novel augmented reality (AR) scheme for knee replacement surgery by considering cutting error accuracy. Int. J. Med. Robot. Comput. Assist. Surg. 2018, 15, e1958. [Google Scholar] [CrossRef] [Green Version]
  30. Mitra, R.L.; Greenstein, S.A.; Epstein, L.M. An algorithm for managing QT prolongation in coronavirus disease 2019 (COVID-19) patients treated with either chloroquine or hydroxychloroquine in conjunction with azithromycin: Possible benefits of intravenous lidocaine. Heart Rhythm Case Rep. 2020, 6, 244–248. [Google Scholar] [CrossRef]
  31. Basnet, B.R.; Alsadoon, A.; Withana, C.; Deva, A.; Paul, M. A novel noise filtered and occlusion removal: Navigational accuracy in augmented reality-based constructive jaw surgery. Oral Maxillofac. Surg. 2018, 22, 385–401. [Google Scholar] [CrossRef]
Scheme 1. Algorithm for the application of augmented reality in surgical treatment.
Scheme 2. Algorithm for creating a three-dimensional model of a surgical procedure with a multidisciplinary team.
Figure 1. (a) MRI of the small pelvis with intravenous enhancement—frontal projection; (b) MRI of the pelvis with intravenous enhancement—sagittal view (the tumor zone is circled).
Figure 2. (a) Titanium pin with the zone for marker attachment; (b) View of patient with implanted pin; (c) Bone reference point—anterior superior iliac spine on the right.
Figure 3. (a) CT scan of the pelvis with intravenous enhancement. Tumor size 50 × 71 × 71 mm, with signs of spread to the lower parts of the ureters, the wall of the bladder, and the coccygeal vertebrae (highlighted in a circle); (b) 3D model of pelvic organs and vessels. Implanted pin at the right side.
Figure 4. (a) Two configurations of the marker for augmented reality; (b) Sterilized and installed marker of the first (“abdominal”) configuration.
Figure 5. (a) Intraoperative view of the pelvis. Performed control of bleeding. 1a—ureters; 2a—common iliac arteries; (b) Resection of the ureters was performed, the bladder was mobilized, and the border of the tumor was determined using augmented reality technology. 1b—mobilized bladder; 2b—common iliac arteries. The boundary of the tumor invasion into the sacrum is indicated by a ligature and a dotted line.
Figure 6. (a) The borders of the sacrum resection are outlined in the patient in “jackknife” position; (b) Postoperative CT scan.
Figure 7. (a) Surgeon at work with augmented reality glasses; (b,c) Visualization of the 3D anatomy model and physician interface in the augmented reality glasses.
Figure 8. 3D model of the patient anatomy reconstruction and its overlay during the operation. According to Table 1, patient 2 image is above, patient 3 image is below.
Figure 9. First stage—magnets are placed on the guide frame’s sockets. The protective cover for the adhesive layer on each magnet is removed. Second stage—magnets are installed on the skin using the guide frame. Third stage—magnets are covered with protective film and the patient is ready for the CT scan. Fourth stage—the sterile marker is attached on these magnets for augmented reality during surgical procedure.
Figure 10. Moment of intraoperative marker placement in sterile conditions.
Figure 11. 3D model of the reconstruction of the anatomy of the patient and its overlay during the operation. According to Table 2, patient 3 is above, patient 1 is below.
Figure 12. The stand for measuring positioning accuracy (left). The marker is placed in a predefined spot (center). Three different 1 mm spheres are visualized on each side of the stand. The deviation of around 2–3 mm can be seen in the right image by comparing the pointer position (where the sphere should be) with the actual sphere position (red dot). The step of the grid is 1 mm.
Table 1. Patients with invasive (pin) fixation of marks.

ID | Diagnosis | Operation | Duration, min
1 | Cancer of the lower rectum cT3N1M0/pT3N1aM0 Lvi(+) IIIB st. 1 | Infralevator evisceration of the small pelvis, resection of the small intestine, cholecystectomy, resection of the coccyx, bilateral ureterocutaneostomy. | 380
2 | Cancer of the lower ampulla of the rectum pT3cN1aM0 IIIB st. 1 | Infralevator evisceration of the small pelvis with distal resection of the coccyx and sacrum at the S5 level (removal of the tumor as a single block with the bladder, prostate gland, and the distal part of the sacrum). | 335
3 | Ovarian cancer T3cNxM1 IVa st. 1 | Combined cytoreductive (initially optimal) operation: posterior supralevator evisceration of the small pelvis. Resection of the right dome of the diaphragm. Resection of the greater and lesser omentum. Obstructive resection of the sigmoid colon. Total peritonectomy. Cholecystectomy, splenectomy, appendectomy. Resection of the right ureter (Figure 8). | 390

1 All patients were suffering from recurrent malignancies and had had surgery before. The TNM Classification of Malignant Tumors (TNM stage) corresponds to the previous surgery.
Table 2. Patients with noninvasive (on-skin) fixation of marks.

ID | Diagnosis | Operation | Duration, min
1 | Bladder cancer T4bN3M1 (Lim) IV 1 | Radical cystectomy. Lymph node dissection along the aorta, in the aortocaval space, and along the course of the iliac vessels. | 330
2 | Cervical cancer pT1b2N0M0 Ib st. Recurrence with the formation of a vesico-vaginal and recto-vaginal fistula 1 | Pelvic evisceration (radical cystectomy with formation of a Bricker ileum conduit, resection of the vaginal stump, anterior resection of the rectum). | 90
3 | Cervical cancer cT2bN0M0 IIb st. 1 | Diagnostic laparotomy (revision of the abdominal cavity revealed multiple areas of carcinomatosis throughout the abdominal cavity; performing a radical operation was technically impossible). | 110
4 | Hepaticocholedochal stricture | Laparotomy. Rehepaticojejunostomy with a long Roux loop. | 270
5 | Recurrent chondrosarcoma of the 11th rib on the left with spread to the dome of the diaphragm and left kidney 1 | Not completed due to clinical exacerbation. | N/A

1 All patients were suffering from recurrent malignancies and had had surgery before. The TNM stage corresponds to the previous surgery.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

