Background
Over the last three decades, surgical practice has shifted markedly towards minimally invasive surgery (MIS) as the standard of care [1]. Although this transition has brought significant benefits, it has also introduced new problems. Perhaps the most substantial limitation of MIS is the loss of haptic feedback; this deficit is at its most extreme in robot-assisted surgery, where such feedback is at present lost entirely [2].
The image-enhanced operating environment seeks to mitigate the loss of haptic feedback by providing the surgeon with visual cues to the subsurface anatomy. Intraoperative image guidance serves two distinct purposes: operative planning (for example, the rapid identification of critical anatomical structures) and task execution (for example, tumour resection) [2]. These two steps have very different requirements: the first needs a large amount of anatomical information to be displayed, without the need to account for tissue deformation or accurate registration, while the second requires less information to be displayed, but with much greater spatial accuracy.
Methods
The solution proposed herein, the image-enhanced operating environment, combines two different imaging modalities, exploiting their respective strengths to meet the differing needs of the two outlined steps of planning and execution. The platform has been built around the index procedure of robot-assisted partial nephrectomy, although its potential application extends well beyond this scope.
The first step, operative planning, utilises 3D reconstructions of preoperative cross-sectional imaging manipulated via a tablet-based interface [3]. This information is displayed to the surgeon both on the tablet and within the da Vinci console using the stereo TilePro™ function (Intuitive Surgical, Sunnyvale, CA).
The second step, execution, utilises optically registered intraoperative ultrasound. Using a live imaging modality mitigates the deformation problems often faced when preoperative imaging is used for high-precision guidance. The ultrasound data are used to create freehand 3D reconstructions, which are overlaid onto the operative view [4].
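The core idea of freehand 3D reconstruction — compounding tracked 2D ultrasound frames into a world-aligned voxel volume — can be sketched as follows. This is a minimal illustrative sketch, not the authors' published pipeline: the function name, the 4 × 4 image-to-world pose convention, and the nearest-neighbour averaging are all assumptions made for illustration.

```python
import numpy as np

def reconstruct_volume(frames, poses, vol_shape, voxel_size):
    """Hypothetical freehand 3D reconstruction sketch.

    frames: list of 2D intensity arrays (H, W), one per ultrasound slice.
    poses: list of 4x4 image-to-world transforms (e.g. from optical tracking).
    vol_shape: (X, Y, Z) voxel grid dimensions.
    voxel_size: edge length of a cubic voxel, in the same units as the poses.
    Returns a volume where each voxel averages the pixels that landed in it.
    """
    vol = np.zeros(vol_shape)
    count = np.zeros(vol_shape)
    for img, pose in zip(frames, poses):
        h, w = img.shape
        # Build homogeneous image-plane coordinates (x along width,
        # y along height, z = 0 in the slice plane).
        ys, xs = np.mgrid[0:h, 0:w]
        pts = np.stack([xs.ravel(), ys.ravel(),
                        np.zeros(h * w), np.ones(h * w)])
        world = pose @ pts  # map pixels into world space
        # Nearest-neighbour voxel index for each pixel.
        idx = np.round(world[:3] / voxel_size).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(vol_shape)[:, None]), axis=0)
        # Accumulate intensities and hit counts (unbuffered scatter-add).
        np.add.at(vol, tuple(idx[:, ok]), img.ravel()[ok])
        np.add.at(count, tuple(idx[:, ok]), 1)
    # Average; voxels never hit by a pixel stay zero.
    return np.divide(vol, count, out=np.zeros_like(vol), where=count > 0)
```

In practice, published freehand reconstruction methods add probe calibration, hole filling, and interpolation between sparse slices; the sketch above shows only the geometric compounding step.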
Results
To date, over 60 cases have been undertaken using the tablet-based planning component of the image-enhanced operating environment. Over the course of this series, a subjective benefit has been demonstrated through the analysis of prospectively collected questionnaire results. In addition, the platform has demonstrated objective safety, with no detrimental effects observed on outcome parameters. The use of registered ultrasound has been demonstrated in vivo [5], with the results of an ex vivo study of its potential efficacy awaited.
Conclusions
Replacing haptic feedback with visual cues to the subsurface anatomy offers a number of potential direct benefits to the patient, including improved resection quality and a reduction in positive surgical margins. Beyond these direct benefits, an image-enhanced operating environment could also influence case selection: the improved understanding provided by the image guidance platform may make surgeons more prepared to take on cases with challenging anatomy via a minimally invasive approach.
References
1. Nicolau S, Soler L, Mutter D, Marescaux J: Augmented reality in laparoscopic surgical oncology. Surg Oncol. 2011, 20: 189-201. 10.1016/j.suronc.2011.07.002.
2. Hughes-Hallett A, Pratt P, Mayer E, Martin S, Darzi A, Vale J, Marcus H, Cundy T: Augmented reality partial nephrectomy: examining the current status and future perspectives. Urology. 2014, 83: 266-273. 10.1016/j.urology.2013.08.049.
3. Hughes-Hallett A, Pratt P, Mayer E, Martin S, Darzi A, Vale J: Image guidance for all - TilePro™ display of three-dimensionally reconstructed images in robotic partial nephrectomy. Urology. 2014, 84: 237-242. 10.1016/j.urology.2014.02.051.
4. Pratt P, Hughes-Hallett A, Di Marco A, Cundy T, Mayer E, Vale J, Darzi A, Yang G-Z: Multimodal reconstruction for image-guided interventions. Proceedings of the Hamlyn Symposium. 2013, 59-60.
5. Hughes-Hallett A, Pratt P, Mayer E, Di Marco A, Yang G-Z, Vale J, Darzi A: Intraoperative ultrasound overlay in robot-assisted partial nephrectomy: first clinical experience. Eur Urol. 2013, 65: 671-672.
Acknowledgements
The authors are grateful for support from the NIHR Biomedical Research Centre funding scheme.
About this article
Cite this article
Hughes-Hallett, A., Pratt, P., Dilley, J. et al. Augmented reality: 3D image-guided surgery. Cancer Imaging 15 (Suppl 1), O8 (2015). https://doi.org/10.1186/1470-7330-15-S1-O8