
Design and Testing of Augmented Reality-Based Fluorescence Imaging Goggle for Intraoperative Imaging-Guided Surgery

1 Institute of Global Health Technology, College of Health Science, Korea University, Seoul 02841, Korea
2 Department of Biomedical Sciences, College of Medicine, Korea University, Seoul 02841, Korea
3 Department of Thoracic and Cardiovascular Surgery, Korea University Guro Hospital, College of Medicine, Korea University, Seoul 08308, Korea
4 Department of Bio-convergence Engineering, College of Health Science, Korea University, Seoul 02841, Korea
5 Department of Interdisciplinary Bio/Micro System Technology, College of Engineering, Korea University, Seoul 02841, Korea
6 Department of Bioengineering, College of Health Science, Korea University, Seoul 02841, Korea
7 Interdisciplinary Program in Precision Public Health, Korea University, Seoul 02841, Korea
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Diagnostics 2021, 11(6), 927; https://doi.org/10.3390/diagnostics11060927
Submission received: 30 March 2021 / Revised: 13 May 2021 / Accepted: 19 May 2021 / Published: 21 May 2021
(This article belongs to the Special Issue Fluorescence Optical Imaging)

Abstract
The separate optical pathways of a near-infrared camera and the user's eye limit the use of existing near-infrared fluorescence imaging systems for tumor margin assessments. By utilizing an optical system that precisely matches the near-infrared fluorescence image to the optical path of visible light, we developed an augmented reality (AR)-based fluorescence imaging system that provides users with a fluorescence image that matches the real field, without requiring any additional algorithms. Commercial smart glasses, dichroic beam splitters, mirrors, and custom near-infrared cameras were employed to develop the proposed system, and a custom mount was designed for each component. After its performance was assessed in the laboratory, preclinical experiments involving tumor detection and lung lobectomy in mice and rabbits using indocyanine green (ICG) were conducted. The results showed that the proposed system provided a stable fluorescence image that matched the actual site. In addition, the preclinical experiments confirmed that the proposed system could be used to detect tumors using ICG and to evaluate lung lobectomies. The AR-based intraoperative smart goggle system could detect fluorescence images for tumor margin assessments in animal models without disrupting the surgical workflow in an operating room. Additionally, it was confirmed that the fluorescence image consistently matched the actual site even when the worn system was tilted or shifted.

1. Introduction

During cancer surgery, although preoperative imaging techniques such as computed tomography (CT) and positron emission tomography (PET) have a meaningful impact on preoperative planning, the surgeon’s eyes and hands remain the decisive factors [1,2,3]. It can be difficult to discriminate between malignant and normal tissues during actual clinical practice [4,5]. This may result in incomplete resection or unnecessary resection of normal tissue [6]. Recently, indocyanine green (ICG)-based near-infrared (NIR) fluorescence imaging technology has been actively used to observe various types of cancer [7,8,9,10,11] and lymph nodes [12,13], as well as important structures, blood vessel formation [14,15], and blood perfusion [16], in real time during surgery. Furthermore, in previous studies, our group has employed ICG for applications such as sentinel lymph node (SLN) detection [17,18,19], assessment of lung segments [14], gastric conduit perfusion [20], and lung cancer detection [10]. However, as these systems display information on a remote monitor, surgeons are required to look at the monitor in order to identify the NIR fluorescence image [21,22,23]. This interrupts the surgeons’ attention, thereby increasing the probability of errors and the overall surgery time [24,25].
To overcome the abovementioned limitations, intraoperative systems with a head mount display (HMD) were developed (Table 1). Most existing HMDs are binocular and provide an immersive viewing experience [26]. Video see-through HMDs use a small display to show images from a computer or a camera and thus create an immersive, virtual reality environment. However, the lack of direct visual access to the scene interferes with the surgeons’ ability to assess and resect tissue spontaneously during surgery. Optical see-through HMDs are based on augmented reality (AR); in these devices, see-through displays are used to project images directly in the user’s field-of-view (FOV) [27,28,29], creating an AR environment [26,30]. Using such a system, the user can view both the projected image and the object itself, which supports visual tissue assessment and resection [31]. However, previous optical see-through systems are composed of optoelectrical components, making them bulky and difficult to wear. These features reduce the surgeon’s manipulability and block the surgeon’s vision. Furthermore, the camera and the user’s eye follow different optical pathways. Therefore, additional components or algorithms were necessary to match the user’s field of view with the fluorescence image from the camera.
In this study, we developed an augmented reality-based fluorescence imaging (ARFI) goggle system that synchronizes the user’s view and the NIR camera’s view, in order to decrease related costs and improve the efficiency of the system. To assess its clinical applicability, the developed system was tested through preclinical experiments using a cancer model for guiding tumor resection in mice and rabbits.

2. Materials and Methods

2.1. Hardware Design for the ARFI System

The prototype of the ARFI system is presented in Figure 1. The proposed ARFI system consists of a customized NIR camera with the OS05A20 sensor (CMOS, Omnivision, Santa Clara, CA, USA) for fluorescence imaging, a commercial smart glass (Moverio BT-300, Epson, Suwa-shi, Nagano, Japan) for real-time display of the fusion image, and an optical system for utilizing the smart glass’s curved mirror. The CMOS camera is used to acquire a fluorescence image with a resolution of 2688 × 1944 pixels and a frame rate of 60 frames per second. The ARFI system was constructed by attaching the NIR camera onto the smart glass hardware, as shown in Figure 1. To acquire the fluorescence images, an 814–870 nm bandpass filter (FF01-842/56-25, Semrock, Rochester, NY, USA) was attached in front of the NIR camera to block all incoming light, except for the fluorescence emission from ICG. The smart glass system was equipped with two display panels (0.43˝ wide panel), each with a resolution of 1280 × 720 pixels. After processing the acquired images, the surgical and fluorescence images could be merged on the display. The optical system comprised a 20 mm square dichroic beam splitter (DBS, #62-628, Edmund Optics, Barrington, NJ, USA), a 20 mm protected gold coated 90° specialty mirror (#65-849, Edmund Optics, Barrington, NJ, USA), and mounts (DBS, mirror, and NIR camera mounts). Each mount was modeled using a 3D modeling program (Fusion 360, Autodesk, San Rafael, CA, USA) and printed with a 3D printer. These mounts could shift easily because they were connected via rods (SR6, SR3-P4, Thorlabs, Newton, NJ, USA). To co-register a fluorescence image with a surgical scene image, the DBS mount employed a small spring, screws, and a ball to rotate along the three axes; it could be adjusted horizontally to match the position of the user’s eye. The NIR camera mount was grooved to allow for heat release, and it could be adjusted vertically. 
The central mirror mount secured the triangular mirror and the smart glass; it also linked the other mounts via the rod, as shown in Figure 1. Owing to these mounts, the system could be customized depending on the user; this increased the accuracy of the fluorescence image registration.

2.2. Operational Principle of the ARFI System

The schematic of the ARFI system is shown in Figure 2a. The excitation light source for ICG was prepared by attaching a 790 nm short-pass filter (FF01-790/SP-25; Semrock, Rochester, NY, USA) in front of an LED source (M780L3, Thorlabs, Newton, NJ, USA) with a 780 nm peak, and was focused on the surgical field. This filter blocks light at wavelengths similar to that of the fluorescence reflected from the sample, so that only the fluorescence emitted by ICG is detected. White light from an array of four Luxeon III white light diodes (Lumileds, San Jose, CA, USA) was used; all NIR wavelengths were filtered out from this light (E680SP, Chroma). The generated fluence rates for NIR excitation and white light were 0–5 mW/cm2 and 0–1 mW/cm2, respectively. The distance between the source and the surgical field was approximately 250 mm. The DBS transmits visible light, enabling the surgeon to view the actual surgical scene, while reflecting the near-infrared rays that enter along the same optical path as the surgeon’s line of sight. These near-infrared rays are reflected again by the mirror and continue toward the NIR camera, which thus acquires the fluorescence images. In this way, NIR images were acquired along the same optical path as the actual surgical scene through an optical separation system and were then presented to the surgeon. The major benefit of the two images sharing the same optical path is that the NIR and actual images can be blended easily.
As demonstrated in Figure 2b, the entire system operates as follows. The typical working distance for acquiring a focused fluorescence image is approximately 500 mm. Fluorescence images acquired from the NIR camera were delivered to a computer through a USB 3.0 port and post-processed. Subsequently, they were transmitted to the smart glass via Miracast [34], a wireless image transmission technique, and were then projected to the user’s eyes in real time. As the real-time projected image is overlaid on the actual field, users can obtain useful information, including the margin of a cancer, in real time. The acquisition of fluorescence signals from the NIR camera and the post-processing were implemented in Visual Studio 2017 (Microsoft, Redmond, WA, USA) using C++ and the OpenCV library. Camera grabbing, which refers to image acquisition via the NIR camera, was performed using OpenCV library functions. To remove background noise from the raw fluorescence images, only signals above a specific threshold value were retained. Then, to improve the contrast with the surrounding area, the NIR fluorescence image was pseudo-colored in green and overlaid with 100% transparency over the color video image of the same surgical field. During surgery, the surgeon cannot realign the hardware immediately. Therefore, the software can be used to manually magnify, reduce, and rotate the fluorescence image to match the actual field.
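The post-processing steps above (thresholding out background noise, pseudo-coloring the retained signal green, and overlaying it on the color image of the same field) were implemented by the authors in C++ with OpenCV. As an illustrative sketch only, a NumPy equivalent might look as follows; the function name, array shapes, and threshold value are our assumptions, not the paper's code.

```python
import numpy as np

def overlay_fluorescence(color_img, nir_gray, threshold=30):
    """Keep NIR signal above a noise threshold, pseudo-color it green,
    and overlay it on the color image of the same surgical field."""
    mask = nir_gray > threshold          # discard background noise
    fused = color_img.copy()
    fused[mask] = (0, 255, 0)            # pseudo-color fluorescence in green (RGB)
    return fused

# Toy example: a simulated fluorescent spot on a uniform gray scene
nir = np.zeros((4, 4), dtype=np.uint8)
nir[1:3, 1:3] = 200                      # simulated ICG fluorescence
scene = np.full((4, 4, 3), 120, dtype=np.uint8)
fused = overlay_fluorescence(scene, nir)
```

In practice the real system also applies the manual scale/rotation adjustment described above before display; that step is omitted here for brevity.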

2.3. In Vivo Animal Studies

This study was approved by the Institutional Animal Care and Use Committee of Korea University College of Medicine (IACUC approval number: KOREA-2016-0228). Six-week-old C57BL/6 mice (20–25 g; Orient Biotech, Seongnam-si, Gyeonggi-do, Korea) and female New Zealand white rabbits (2.5–3 kg; DooYeol Biotech Co Ltd., Seoul, Korea) were used. To assist the animals in adapting to their environment, all rabbits were housed in individual cages with freely available food and water for 1–2 weeks, according to existing humane animal care protocols. All animals were anesthetized intramuscularly before experiments with 5 mg/kg of xylazine (Rompun, Bayer Korea Inc., Seoul, Korea) and 5 mg/kg of alfaxalone (Alfaxan, Jurox Pty Ltd., Hunter Valley, NSW, Australia).

2.3.1. Mouse Subcutaneous Tumor Model

Lewis lung carcinoma (LLC) cells were used to establish the subcutaneous tumor in the mice. For in vivo experiments, LLC cells (20 μL of 2 × 106 cells/mL) were injected subcutaneously into the hind legs. The tumor model was established after 2–3 weeks. ICG (5 mg/kg; Daiichi-Sankyo Co., Tokyo, Japan) was injected into the tail vein. The fluorescence signal was observed using the ARFI system 12 h after injection.

2.3.2. Rabbit Lung Cancer Model

A rabbit lung cancer model was established as per a previous study [35]. Briefly, VX2 single-cell suspensions with Matrigel solutions were directly injected into the rabbit’s lung using a 28-gauge needle. The in vivo experiment was performed after two weeks. The VX2 model of the lung tumor was established 2–3 weeks after administration. ICG (5 mg/kg) was injected into the ear vein, and the fluorescence signal was observed using the ARFI system 12 h after injection. Normal rabbits were used for the detection of the intersegmental line. After ligating the right middle lobar pulmonary arteries and vein in each rabbit, 0.6 mg/kg of ICG (n = 3 for each group) was injected into the ear vein. The fluorescence signal was observed using the ARFI system after injection.

3. Results

3.1. System Evaluation

FOV tests and image-matching ratio tests were conducted to evaluate the performance of the proposed system. First, as shown in Figure 3a, a 128 μM ICG solution was prepared and applied to a cotton swab. The light source for exciting the ICG was prepared by attaching a 790 nm short-pass filter (FF01-790/SP-25; Semrock, Rochester, NY, USA) at the front of an LED source (M780L3, Thorlabs, Newton, NJ, USA) with a 780 nm peak. Figure 3b shows the results of the FOV test. The blue rectangles denote the FOV of the system, and the green light represents the fluorescence signal. The results showed that the system provides consistently matched images; that is, the fluorescence image remains matched to the ICG sample even when the sample is moved. This indicates that how the user wears the system or moves their head does not influence its performance.
Additionally, as Figure 4 shows, we calculated the matching ratio, which indicates how closely the real-field and fluorescence images are matched. First, the fluorescence signal was changed to orange to increase the contrast with the green ICG sample. Subsequently, to calculate the matching percentage, the two images were cropped to the same size in MATLAB (MathWorks, Natick, MA, USA) and converted to grayscale. Each pixel intensity above a threshold was set to 1 so that the difference between the real-field and fluorescence images could be easily interpreted. Thereafter, the matching percentage was calculated using the corr2 function. We calculated the percentage five times within the system’s FOV and averaged the obtained values, which yielded an approximately 95.3% match.
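The matching-ratio calculation (grayscale crops of equal size, binarization above a threshold, then MATLAB's corr2) can be reproduced outside MATLAB. The sketch below is our NumPy re-implementation of the corr2 formula; the threshold value and function names are illustrative, not taken from the paper's code.

```python
import numpy as np

def corr2(a, b):
    """2-D correlation coefficient, equivalent to MATLAB's corr2:
    covariance of the two arrays divided by the product of their norms."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def matching_ratio(real_gray, fluo_gray, threshold=128):
    """Binarize each equally-sized grayscale crop (pixel above threshold -> 1),
    then correlate, as in the image-matching evaluation."""
    return corr2(real_gray > threshold, fluo_gray > threshold)
```

Per the paper, this value was computed at five positions within the system's FOV and averaged; perfectly overlapping binary masks give a ratio of 1.0.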

3.2. Tumor Detection Using the ARFI System in a Mouse Tumor Model

The mice had a subcutaneous tumor in the left thigh, with a mean tumor diameter of 0.5 ± 0.2 cm (range 0.4–0.6 cm). The tumor was successfully detected through the NIR fluorescence signal in all mouse models, as presented in Figure 5. In addition, only the tumor and injection sites were highlighted by the ARFI system, while the surrounding tissue showed no signal, indicating that the ARFI system can identify tumor margins during a surgical procedure.

3.3. Tumor Detection Using the ARFI System in a Rabbit Lung Tumor Model

The rabbit lung tumor model was successfully established in all four rabbits. The mean tumor diameter was 0.8 ± 0.3 cm. In all the models, we successfully detected the lung tumor through the NIR fluorescence signal. As shown in Figure 6, in the in vivo and ex vivo images captured by the ARFI system, the NIR fluorescence matched the lung tumor sites.
Furthermore, we clarified whether ARFI could evaluate the blood flow distribution in the lung lobe in real time, as shown in Figure 7. When ICG was injected intravenously, the resection area of the lung lobe where the blood vessel was ligated was easily distinguishable using ARFI. Consequently, a surgeon can easily resect the exact area.

4. Discussion

This study was conducted to develop an intraoperative fluorescence imaging system using the AR technique and validate its performance through laboratory-level and preclinical studies. The proposed ARFI system was developed to address the FOV difference and inconvenience associated with existing fluorescence navigation systems. We devised a new optical system to solve these problems; this system exactly matches the fluorescence pathway with that of visible light. Consequently, the user can directly view fluorescence images over the surgical field, without having to look at a computer display. Furthermore, the proposed ARFI system can guide complete tumor resection in mice and rabbits.
One of the main advantages of the proposed system is that it avoids complicated image-matching algorithms. Previous studies on see-through HMD systems have used two cameras to separate visible light and NIR fluorescence light [29,36,37]. Therefore, the camera angles of the two separately acquired images could not be directly aligned. If this issue is not addressed, the projected fluorescence image can cause severe eye fatigue, as the user would need to switch focus between the real image and the fluorescence image. Therefore, an additional image-processing algorithm was needed [38]. In contrast, our system keeps the fluorescence image consistent with the actual area by using mounts that allow precise alignment, together with simple software that adjusts the size and rotation of the supplied fluorescence image. Furthermore, the image remains aligned even when the wearer moves or the glass is tilted. This advantage was confirmed not only at the laboratory level but also in various preclinical studies, such as the mouse tumor and rabbit lung tumor models (Supplementary Video S1). The pre-use calibration feature for the eye positions of different individuals provides consistent images during use, which ensures the stability required for use in an actual operating room. This feature potentially allows for a quicker and safer procedure and could consequently enhance patient safety.
Despite these achievements, however, this study has some limitations. Although the effectiveness of tumor detection in animal models was confirmed, accurate detection of deep lesions has not yet been achieved. In addition, ICG can only be detected up to 1 cm below the skin; therefore, certain recording positions may not be usable [39]. In this study, we detected steady-state fluorescence using a CMOS camera. However, steady-state fluorescence intensity has some limitations, such as photobleaching of the NIR fluorophore, phototoxicity, and changes in fluorescence collection [40]. Therefore, a high fluence rate is expected to be required for sensitive tumor detection. Additionally, the system evaluation confirmed that the ARFI system has a high image-matching ratio. However, since the matching ratio between the visible and fluorescence images can vary depending on the ICG concentration and fluence rate, the optimal ICG concentration and fluence rate must be determined to obtain an accurate fluorescence image during surgery (Supplementary Note S1). Moreover, since our system does not have a visible–fluorescence image alignment algorithm, the visible and fluorescence images were matched manually. Therefore, a slight error inevitably occurs, and an algorithm that can register the two images is additionally required for a perfect correlation. Although the tumors were confirmed through biopsy, tumor margin estimation should be considered for minimal incisions and minimally invasive surgery [41].
Furthermore, the current system uses a ready-made see-through HMD primarily designed for entertainment purposes. Although this type of display is useful for demonstrating feasibility, we plan to develop a customized and ergonomic display unit optimized for medical imaging.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/diagnostics11060927/s1, Video S1: Lymphatic Distribution Analysis of ICG in Rabbit Model.

Author Contributions

Conceptualization, H.K.K., and B.-M.K.; Data curation, M.S.K., K.H.K., B.H.C., S.H.L. and Y.H.Q.; Formal analysis, S.H.L. and Y.H.Q.; Funding acquisition, B.-M.K.; Methodology, M.S.K. and Y.H.Q.; Supervision, H.K.K., and B.-M.K.; Visualization, S.H.L. and Y.H.Q.; Writing—original draft, S.H.L. and Y.H.Q.; Writing—review and editing, S.H.L., Y.H.Q., H.K.K., and B.-M.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Technology Innovation Program (No. 20000676) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea).

Institutional Review Board Statement

The study was approved by the Institutional Animal Care and Use Committee of Korea University College of Medicine (IACUC approval number: KOREA-2016-0228).

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing not applicable.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this paper.

References

  1. Cuevas, C.; Shibata, D. Medical imaging in the diagnosis and management of cancer pain. Curr. Pain Headache Rep. 2009, 13, 261–270. [Google Scholar] [CrossRef] [PubMed]
  2. Wagner, R.F.; Metz, C.E.; Campbell, G. Assessment of medical imaging systems and computer aids: A tutorial review. Acad. Radiol. 2007, 14, 723–748. [Google Scholar] [CrossRef]
  3. Gorpas, D.; Koch, M.; Anastasopoulou, M.; Bozhko, D.; Klemm, U.; Nieberler, M.; Ntziachristos, V. Multi-parametric standardization of fluorescence imaging systems based on a composite phantom. IEEE Trans. Biomed. Eng. 2019, 67, 185–192. [Google Scholar] [CrossRef] [Green Version]
  4. Collins, L.; Schnitt, S.; Achacoso, N.; Haque, R.; Nekhlyudov, L.; Fletcher, S.; Quesenberry, C.; Habel, L. Outcome of women with ductal carcinoma in situ (DCIS) treated with breast-conserving surgery alone: A case-control study of 225 patients from the Cancer Research Network. In Proceedings of the Laboratory Investigation, New York, NY, USA, 1 January 2009; pp. 34A–35A. [Google Scholar]
  5. Vicini, F.A.; Kestin, L.L.; Goldstein, N.S.; Chen, P.Y.; Pettinga, J.; Frazier, R.C.; Martinez, A.A. Impact of young age on outcome in patients with ductal carcinoma-in-situ treated with breast-conserving therapy. J. Clin. Oncol. 2000, 18, 296. [Google Scholar] [CrossRef] [PubMed]
  6. Vahrmeijer, A.L.; Hutteman, M.; Van Der Vorst, J.R.; Van De Velde, C.J.; Frangioni, J.V. Image-guided cancer surgery using near-infrared fluorescence. Nat. Rev. Clin. Oncol. 2013, 10, 507. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Predina, J.D.; Keating, J.; Newton, A.; Corbett, C.; Xia, L.; Shin, M.; Frenzel Sulyok, L.; Deshpande, C.; Litzky, L.; Nie, S. A clinical trial of intraoperative near-infrared imaging to assess tumor extent and identify residual disease during anterior mediastinal tumor resection. Cancer 2019, 125, 807–817. [Google Scholar] [CrossRef] [PubMed]
  8. Keating, J.J.; Nims, S.; Venegas, O.; Jiang, J.; Holt, D.; Kucharczuk, J.C.; Deshpande, C.; Singhal, S. Intraoperative imaging identifies thymoma margins following neoadjuvant chemotherapy. Oncotarget 2016, 7, 3059. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  9. Keating, J.; Tchou, J.; Okusanya, O.; Fisher, C.; Batiste, R.; Jiang, J.; Kennedy, G.; Nie, S.; Singhal, S. Identification of breast cancer margins using intraoperative near-infrared imaging. J. Surg. Oncol. 2016, 113, 508–514. [Google Scholar] [CrossRef] [PubMed]
  10. Kim, H.K.; Quan, Y.H.; Choi, B.H.; Park, J.-H.; Han, K.N.; Choi, Y.; Kim, B.-M.; Choi, Y.H. Intraoperative pulmonary neoplasm identification using near-infrared fluorescence imaging. Eur. J. Cardio-Thorac. Surg. 2015, 49, 1497–1502. [Google Scholar] [CrossRef] [Green Version]
  11. DSouza, A.V.; Lin, H.; Henderson, E.R.; Samkoe, K.S.; Pogue, B.W. Review of fluorescence guided surgery systems: Identification of key performance capabilities beyond indocyanine green imaging. J. Biomed. Opt. 2016, 21, 080901. [Google Scholar] [CrossRef]
  12. Gilmore, D.M.; Khullar, O.V.; Jaklitsch, M.T.; Chirieac, L.R.; Frangioni, J.V.; Colson, Y.L. Identification of metastatic nodal disease in a phase 1 dose-escalation trial of intraoperative sentinel lymph node mapping in non–small cell lung cancer using near-infrared imaging. J. Thorac. Cardiovasc. Surg. 2013, 146, 562–570. [Google Scholar] [CrossRef] [Green Version]
  13. Imai, K.; Minamiya, Y.; Saito, H.; Nakagawa, T.; Ito, M.; Ono, T.; Motoyama, S.; Sato, Y.; Konno, H.; Ogawa, J.-I. Detection of pleural lymph flow using indocyanine green fluorescence imaging in non-small cell lung cancer surgery: A preliminary study. Surg. Today 2013, 43, 249–254. [Google Scholar] [CrossRef] [PubMed]
  14. Oh, Y.; Quan, Y.H.; Kim, M.; Kim, B.-M.; Kim, H.K. Intraoperative fluorescence image-guided pulmonary segmentectomy. J. Surg. Res. 2015, 199, 287–293. [Google Scholar] [CrossRef] [PubMed]
  15. Hsieh, C.-P.; Liu, Y.-H.; Wu, Y.-C.; Hsieh, M.-J.; Chao, Y.-K. Indocyanine green fluorescence-navigated robotic segmentectomy. Surg. Endosc. 2017, 31, 3347–3348. [Google Scholar] [CrossRef] [PubMed]
  16. Quan, Y.H.; Han, K.N.; Kim, H.K. Fluorescence Image-Based Evaluation of Gastric Tube Perfusion during Esophagogastrostomy. Korean J. Thorac. Cardiovasc. Surg. 2020, 53, 178. [Google Scholar] [CrossRef]
  17. Oh, Y.; Quan, Y.H.; Choi, Y.; Kim, C.K.; Kim, H.; Kim, H.K.; Kim, B.-M. Intraoperative combined color and fluorescent images–based sentinel node mapping in the porcine lung: Comparison of indocyanine green with or without albumin premixing. J. Thorac. Cardiovasc. Surg. 2013, 146, 1509–1515. [Google Scholar] [CrossRef] [Green Version]
  18. Oh, Y.; Lee, Y.-S.; Quan, Y.H.; Choi, Y.; Jeong, J.M.; Kim, B.-M.; Kim, H.K. Thoracoscopic color and fluorescence imaging system for sentinel lymph node mapping in porcine lung using indocyanine green-neomannosyl human serum albumin: Intraoperative image-guided sentinel nodes navigation. Ann. Surg. Oncol. 2014, 21, 1182–1188. [Google Scholar] [CrossRef]
  19. Kim, H.K.; Quan, Y.H.; Oh, Y.; Park, J.Y.; Park, J.-H.; Choi, Y.; Lee, Y.-S.; Jeong, J.M.; Choi, Y.H.; Kim, B.-M. Macrophage-targeted indocyanine green-neomannosyl human serum albumin for intraoperative sentinel lymph node mapping in porcine esophagus. Ann. Thorac. Surg. 2016, 102, 1149–1155. [Google Scholar] [CrossRef] [Green Version]
  20. Quan, Y.H.; Kim, M.; Kim, H.K.; Kim, B.-M. Fluorescent image-based evaluation of gastric conduit perfusion in a preclinical ischemia model. J. Thorac. Dis. 2018, 10, 5359. [Google Scholar] [CrossRef]
  21. Liu, Y.; Bauer, A.Q.; Akers, W.J.; Sudlow, G.; Liang, K.; Shen, D.; Berezin, M.Y.; Culver, J.P.; Achilefu, S. Hands-free, wireless goggles for near-infrared fluorescence and real-time image-guided surgery. Surgery 2011, 149, 689–698. [Google Scholar] [CrossRef] [Green Version]
  22. Troyan, S.L.; Kianzad, V.; Gibbs-Strauss, S.L.; Gioux, S.; Matsui, A.; Oketokoun, R.; Ngo, L.; Khamene, A.; Azar, F.; Frangioni, J.V. The FLARE™ intraoperative near-infrared fluorescence imaging system: A first-in-human clinical trial in breast cancer sentinel lymph node mapping. Ann. Surg. Oncol. 2009, 16, 2943–2952. [Google Scholar] [CrossRef] [Green Version]
  23. Liu, Y.; Akers, W.J.; Bauer, A.Q.; Mondal, S.; Gullicksrud, K.; Sudlow, G.P.; Culver, J.P.; Achilefu, S. Intraoperative detection of liver tumors aided by a fluorescence goggle system and multimodal imaging. Analyst 2013, 138, 2254–2257. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Mondal, S.B.; Gao, S.; Zhu, N.; Sudlow, G.P.; Liang, K.; Som, A.; Akers, W.J.; Fields, R.C.; Margenthaler, J.; Liang, R. Binocular Goggle Augmented Imaging and Navigation System provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping. Sci. Rep. 2015, 5, 12117. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Badiali, G.; Ferrari, V.; Cutolo, F.; Freschi, C.; Caramella, D.; Bianchi, A.; Marchetti, C. Augmented reality as an aid in maxillofacial surgery: Validation of a wearable system allowing maxillary repositioning. J. Cranio-Maxillofac. Surg. 2014, 42, 1970–1976. [Google Scholar] [CrossRef]
  26. Rolland, J.P.; Fuchs, H. Optical versus video see-through head-mounted displays in medical visualization. Presence Teleoperators Virtual Environ. 2000, 9, 287–309. [Google Scholar] [CrossRef]
  27. Shao, P.; Ding, H.; Wang, J.; Liu, P.; Ling, Q.; Chen, J.; Xu, J.; Zhang, S.; Xu, R. Designing a wearable navigation system for image-guided cancer resection surgery. Ann. Biomed. Eng. 2014, 42, 2228–2237. [Google Scholar] [CrossRef] [Green Version]
  28. Zhang, Z.; Pei, J.; Wang, D.; Gan, Q.; Ye, J.; Yue, J.; Wang, B.; Povoski, S.P.; Martin, E.W., Jr.; Hitchcock, C.L. A wearable Goggle navigation system for dual-mode optical and ultrasound localization of suspicious lesions: Validation studies using tissue-simulating phantoms and an ex vivo human breast tissue model. PLoS ONE 2016, 11, e0157854. [Google Scholar] [CrossRef] [PubMed]
  29. Mondal, S.B.; Gao, S.; Zhu, N.; Habimana-Griffin, L.; Akers, W.J.; Liang, R.; Gruev, V.; Margenthaler, J.; Achilefu, S. Optical see-through cancer vision goggles enable direct patient visualization and real-time fluorescence-guided oncologic surgery. Ann. Surg. Oncol. 2017, 24, 1897–1903.
  30. Maruyama, K.; Watanabe, E.; Kin, T.; Saito, K.; Kumakiri, A.; Noguchi, A.; Nagane, M.; Shiokawa, Y. Smart glasses for neurosurgical navigation by augmented reality. Oper. Neurosurg. 2018, 15, 551–556.
  31. Ewers, R.; Schicho, K.; Undt, G.; Wanschitz, F.; Truppe, M.; Seemann, R.; Wagner, A. Basic research and 12 years of clinical experience in computer-assisted navigation technology: A review. Int. J. Oral Maxillofac. Surg. 2005, 34, 1–8.
  32. Liu, Y.; Njuguna, R.; Matthews, T.; Akers, W.J.; Sudlow, G.P.; Mondal, S.B.; Tang, R.; Gruev, V.; Achilefu, S. Near-infrared fluorescence goggle system with complementary metal–oxide–semiconductor imaging sensor and see-through display. J. Biomed. Opt. 2013, 18, 101303.
  33. Mela, C.A.; Patterson, C.; Thompson, W.K.; Papay, F.; Liu, Y. Stereoscopic integrated imaging goggles for multimodal intraoperative image guidance. PLoS ONE 2015, 10, e0141956.
  34. Wi-Fi Alliance. Miracast. Available online: https://www.wi-fi.org/discover-wi-fi/miracast (accessed on 19 October 2012).
  35. Choi, B.H.; Young, H.S.; Quan, Y.H.; Rho, J.; Eo, J.S.; Han, K.N.; Choi, Y.H.; Hyun Koo, K. Real-time computed tomography fluoroscopy-guided solitary lung tumor model in a rabbit. PLoS ONE 2017, 12, e0179220.
  36. Zhu, N.; Mondal, S.; Gao, S.; Achilefu, S.; Gruev, V.; Liang, R. Dual-mode optical imaging system for fluorescence image-guided surgery. Opt. Lett. 2014, 39, 3830–3832.
  37. Noll, M.; Noa-Rudolph, W.; Wesarg, S.; Kraly, M.; Stoffels, I.; Klode, J.; Spass, C.; Spass, G. ICG based augmented-reality-system for sentinel lymph node biopsy. In Proceedings of the Eurographics Workshop on Visual Computing for Biology and Medicine, Granada, Spain, 20–21 September 2018; pp. 11–15.
  38. Mondal, S.B.; Tsen, S.W.D.; Achilefu, S. Head-Mounted Devices for Noninvasive Cancer Imaging and Intraoperative Image-Guided Surgery. Adv. Funct. Mater. 2020, 30, 2000185.
  39. Kraft, J.C.; Ho, R.J. Interactions of indocyanine green and lipid in enhancing near-infrared fluorescence properties: The basis for near-infrared imaging in vivo. Biochemistry 2014, 53, 1275–1283.
  40. Homulle, H.; Powolny, F.; Stegehuis, P.; Dijkstra, J.; Li, D.-U.; Homicsko, K.; Rimoldi, D.; Muehlethaler, K.; Prior, J.; Sinisi, R. Compact solid-state CMOS single-photon detector array for in vivo NIR fluorescence lifetime oncology measurements. Biomed. Opt. Express 2016, 7, 1797–1814.
  41. Stewart, H.L.; Hungerford, G.; Birch, D.J. Characterization of single channel liquid light guide coupling and SPAD array imaging for tumour margin estimation using fluorescence lifetime. Meas. Sci. Technol. 2020, 31, 125701.
Figure 1. Components of the augmented reality-based fluorescence imaging goggle system.
Figure 2. (a) Schematic of the ARFI system; it is characterized by matching the pathways of NIR fluorescence light with the white light that is transmitted to the wearer’s eyes. (b) Operation flow of the entire system.
Figure 3. (a) ICG sample in a cotton swab head, (b) FOV test result; the blue rectangle is the FOV of the system, and the green light is the fluorescence signal. Within the blue rectangle, the fluorescence signal always matches the position of the ICG sample.
Figure 4. Process of calculating the matching percentage of the fluorescence image. (a) Cropped image of the ICG sample, (b) grayscale image of the ICG sample, (c) binarized image of the ICG sample; all pixel intensities above the threshold were set to 1. (d) Cropped fluorescence image, (e) grayscale fluorescence image, and (f) binarized fluorescence image; all pixel intensities above the threshold were set to 1.
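The matching-percentage procedure described in Figure 4 (crop, convert to grayscale, binarize, then compare masks) can be sketched as a minimal NumPy illustration. The function name, the luma weights, and the 0.5 threshold are assumptions chosen here for illustration; the paper does not specify its exact thresholding or overlap metric.

```python
import numpy as np

def matching_percentage(sample_rgb, fluor_rgb, threshold=0.5):
    """Estimate how well a fluorescence overlay matches the ICG sample.

    Both inputs are float RGB arrays in [0, 1] with the same cropped shape.
    Each is converted to grayscale, binarized at `threshold` (pixels above
    the threshold become 1), and the overlap of the two binary masks is
    reported as a percentage of the sample mask.
    """
    weights = np.array([0.299, 0.587, 0.114])   # standard luma weights (assumed)
    sample_mask = (sample_rgb @ weights) > threshold
    fluor_mask = (fluor_rgb @ weights) > threshold
    overlap = np.logical_and(sample_mask, fluor_mask).sum()
    return 100.0 * overlap / max(sample_mask.sum(), 1)

# Toy example: a 4x4 "sample" bright in the top-left 2x2 block,
# and a fluorescence image covering 3 of those 4 bright pixels.
sample = np.zeros((4, 4, 3))
sample[:2, :2] = 1.0
fluor = np.zeros((4, 4, 3))
fluor[0, :2] = 1.0
fluor[1, 0] = 1.0
print(matching_percentage(sample, fluor))  # 75.0
```

Reporting the overlap relative to the sample mask makes the score insensitive to spurious fluorescence outside the sample; an intersection-over-union metric would penalize that case instead.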
Figure 5. Subcutaneous tumors in the mice detected by the ARFI system. (a) Representative real image of the mouse model observed using the ARFI system, and (b) representative fluorescent image of the mouse model observed using the ARFI system.
Figure 6. Lung tumor in rabbit detected by the ARFI system. (a) Representative color image of rabbit lung tumor, (b) representative real image of lung tumor observed using the ARFI system.
Figure 7. Distribution of blood flow in the lung lobes detected by the ARFI system. (a) Color image of rabbit lung before ligation of the lobar pulmonary vein and artery, (b) color image of rabbit lung after ligation of the lobar pulmonary vein and artery, (c) real image of blood flow distribution in the lung lobes observed using the ARFI system; the yellow dashed line indicates the pulmonary lobe of the ligated lobar vein and artery.
Table 1. Overview of wearable fluorescence imaging systems.

| Authors | Display Module | Hardware Design | Image | Application |
|---|---|---|---|---|
| Y. Liu et al. 2011 [21] | Monocular night vision viewer | Combined: night vision viewer, white/NIR light source | Monochrome/fluorescence fusion image | SLN mapping (preclinical); HCC imaging (clinical) |
| Y. Liu et al. 2013 [32] | Binocular HMD (ST1080, Silicon Micro Display) | Combined: HMD, CMOS camera, NIR light source | Monochrome/fluorescence fusion image or natural vision | SLN mapping; liver cancer surgery (preclinical) |
| P. Shao et al. 2014 [27] | Monocular HMD (Google Glass, Google Labs); binocular HMD (Personal Cinema System, Headplay) | Non-combined: HMD, CCD camera, NIR light source; combined: HMD, CMOS camera; non-combined: CCD camera, NIR light source | Fluorescence image superimposed on natural vision; color/fluorescence fusion image | Phantom study |
| C. A. Mela et al. 2015 [33] | Binocular HMD (lab-made) | Combined: HMD, four CMOS sensors; non-combined: hand-held microscopy, NIR light, ultrasound scanner | Color/fluorescence fusion image | Phantom study |
| S. B. Mondal et al. 2015 [24] | Binocular HMD (Carl Zeiss) | Combined: HMD, custom VIS-NIR camera; non-combined: NIR light | Color/fluorescence fusion image | Ovarian cancer surgery (preclinical); SLN mapping (clinical) |
| Z. Zhang et al. 2016 [28] | Monocular HMD (Google Glass, Google Labs) | Non-combined: HMD, CCD camera, NIR light, ultrasound probe | Fluorescence image superimposed on natural vision | SLN mapping (clinical) |
| S. B. Mondal et al. 2017 [29] | Binocular HMD (Carl Zeiss) | Combined: HMD, custom VIS-NIR camera; non-combined: NIR light | Color/fluorescence fusion image | Tumor resection (preclinical); SLN biopsy (clinical) |
| K. Maruyama et al. 2018 [30] | Binocular HMD (Moverio BT-200, Epson) | Combined: HMD, optical markers; non-combined: motion capture cameras | Fluorescence image superimposed on natural vision | Brain tumors |
| Our smart goggle system | Binocular HMD (Moverio BT-300, Epson) | Combined: HMD, CMOS camera, optical system; non-combined: NIR light source | Fluorescence image superimposed on natural vision | Cancer detection, segmental line identification (preclinical) |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

MDPI and ACS Style

Lee, S.H.; Quan, Y.H.; Kim, M.S.; Kwon, K.H.; Choi, B.H.; Kim, H.K.; Kim, B.-M. Design and Testing of Augmented Reality-Based Fluorescence Imaging Goggle for Intraoperative Imaging-Guided Surgery. Diagnostics 2021, 11, 927. https://doi.org/10.3390/diagnostics11060927
