Article

Augmented Reality Guidance with Multimodality Imaging Data and Depth-Perceived Interaction for Robot-Assisted Surgery

by Rong Wen, Chin-Boon Chng and Chee-Kong Chui
1 Singapore Institute of Manufacturing Technology, 138634, Singapore
2 Department of Mechanical Engineering, National University of Singapore, 117575, Singapore
* Author to whom correspondence should be addressed.
Robotics 2017, 6(2), 13; https://doi.org/10.3390/robotics6020013
Received: 30 March 2017 / Revised: 18 May 2017 / Accepted: 22 May 2017 / Published: 24 May 2017
(This article belongs to the Special Issue Robotics and 3D Vision)
Abstract: Image-guided surgical procedures are challenged by reliance on a single image modality, two-dimensional anatomical guidance and non-intuitive human-machine interaction. Introducing tablet-based augmented reality (AR) into surgical robots may help surgeons overcome these problems. In this paper, we propose and develop a robot-assisted surgical system with interactive surgical guidance that uses tablet-based AR together with a Kinect sensor for three-dimensional (3D) localization of patient anatomical structures and intraoperative 3D surgical tool navigation. Depth data acquired from the Kinect sensor were visualized in cone-shaped layers for 3D AR-assisted navigation. Virtual visual cues generated by the tablet were overlaid on images of the surgical field for spatial reference. We evaluated the proposed system, and the experimental results showed that the tablet-based visual guidance system could assist surgeons in locating internal organs, with errors between 1.74 and 2.96 mm. We also demonstrated that the system was able to provide mobile augmented guidance and interaction for surgical tool navigation.
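As a rough illustration of the kind of depth-based overlay the abstract describes (not the authors' actual implementation, which is detailed in the full text), the sketch below back-projects a selected pixel from a Kinect-style depth frame into 3D camera coordinates and draws a visual cue at its reprojected location on the colour image. The pinhole intrinsics, frame sizes and the chosen target pixel are placeholder assumptions.

```python
# Illustrative sketch only: overlay a visual cue at a 3D point recovered from a depth frame.
# Camera intrinsics and frames below are placeholder assumptions, not values from the paper.
import numpy as np
import cv2

fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5   # assumed Kinect-like pinhole intrinsics

def backproject(u, v, depth_mm):
    """Convert pixel (u, v) with depth in millimetres to 3D camera coordinates (metres)."""
    z = depth_mm / 1000.0
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def draw_cue(color_img, point_3d, radius_px=8):
    """Reproject a 3D point onto the colour image and draw a circular cue there."""
    x, y, z = point_3d
    if z <= 0:
        return color_img
    u = int(round(fx * x / z + cx))
    v = int(round(fy * y / z + cy))
    cv2.circle(color_img, (u, v), radius_px, (0, 255, 0), 2)
    return color_img

# Synthetic frames for demonstration; a real system would stream colour/depth from the sensor.
color = np.zeros((480, 640, 3), dtype=np.uint8)
depth = np.full((480, 640), 800, dtype=np.uint16)   # flat scene 0.8 m from the camera
target = backproject(320, 240, depth[240, 320])     # pick the centre pixel as the anatomy of interest
overlay = draw_cue(color, target)
```

In the system described by the paper, such cues would be registered to multimodality imaging data and rendered on the tablet view of the surgical field rather than on a synthetic frame.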
Keywords: image-guided surgery; augmented reality; augmented interaction; tablet computer; image registration
MDPI and ACS Style

Wen, R.; Chng, C.-B.; Chui, C.-K. Augmented Reality Guidance with Multimodality Imaging Data and Depth-Perceived Interaction for Robot-Assisted Surgery. Robotics 2017, 6, 13. https://doi.org/10.3390/robotics6020013

AMA Style

Wen R, Chng C-B, Chui C-K. Augmented Reality Guidance with Multimodality Imaging Data and Depth-Perceived Interaction for Robot-Assisted Surgery. Robotics. 2017; 6(2):13. https://doi.org/10.3390/robotics6020013

Chicago/Turabian Style

Wen, Rong, Chin-Boon Chng, and Chee-Kong Chui. 2017. "Augmented Reality Guidance with Multimodality Imaging Data and Depth-Perceived Interaction for Robot-Assisted Surgery" Robotics 6, no. 2: 13. https://doi.org/10.3390/robotics6020013

