Publication:
Augmented Reality in Image-Guided Therapy to Improve Surgical Planning and Guidance

Publication date
2023-11
Defense date
2023-11-13
Abstract
The latest technological advancements have propelled image-guided therapy (IGT) to remarkable progress in enhancing surgical outcomes. Traditional commercial navigation systems have long been used to provide surgical guidance by offering accurate pose information of surgical tools in relation to the patient. However, their cost and physical requirements limit their availability in many treatment scenarios. Moreover, they display information on external screens, diverting physicians' attention away from the patient. In contrast, augmented reality (AR) emerges as a promising solution, offering more affordable and space-efficient alternatives that deliver intuitive and immersive experiences within surgical scenarios. As such, this thesis explores the integration of this technology in the medical field, following the IGT methodology to enhance surgical planning and guidance. AR technology can be deployed on cost-effective, hand-held devices, offering a shared view of virtual information. Head-mounted displays (HMDs), commonly called AR glasses, in turn provide a more intuitive and immersive experience to the wearer. Previous works have explored AR solutions to enhance surgical workflows, but their implementations often prove too specific to the targeted task or difficult to transfer to real clinical practice. In this thesis, we present universal solutions for both types of AR devices that aim to enhance surgical workflows. Our approaches have been meticulously developed in collaboration with expert clinicians, always incorporating real clinical experience, including patients' data, to ensure the robustness and direct translation of our advancements into clinical practice.

We began by exploring the implementation of AR technology in microtia correction procedures using flat-screen devices. Combined with 3D printing technology, the AR application facilitates the precise creation and placement of a reconstructed ear in the patient. The approach's usability and accuracy were validated through controlled experiments, showing significant improvements over traditional methods and achieving results comparable to state-of-the-art AR projection systems. Notably, the system was tested during an actual surgery, where the outcome diverged by only 2.7 ± 2.4 mm from the ideal plan. This minimal error accounts for morphometric deviations resulting from inflammation and other issues intended to be addressed in a subsequent stage of surgery. Consequently, we can confidently assert that the overall error induced by the AR system remains negligible, firmly supporting the adoption of hand-held AR for surgical guidance in similar scenarios.

Excited by the promising results of our methodology, we transferred it to an HMD and explored the capabilities these devices offer for interacting with virtual information. As a starting point, we integrated the AR tracking method presented in the previous chapter into the two generations of Microsoft's see-through AR headsets. Employing patient-specific phantoms from orthopedic oncological surgeries, we calculated the AR projection accuracy attained by both models. Furthermore, we evaluated the influence of the technical enhancements in the second model to endorse its use for surgical guidance. The favorable outcomes of our experimental analysis, coupled with the positive feedback received from surgeons, encouraged us to further our research in this domain.
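To give a sense of how the AR projection accuracy reported above can be quantified, the short Python sketch below computes per-landmark errors as Euclidean distances between planned and AR-projected positions expressed in the same reference frame. It is an illustrative sketch only; the function and variable names and the sample coordinates are assumptions, not the thesis implementation.

import numpy as np

def projection_errors(planned_points, projected_points):
    """Per-landmark Euclidean distance (mm) between planned and AR-projected positions."""
    planned = np.asarray(planned_points, dtype=float)
    projected = np.asarray(projected_points, dtype=float)
    return np.linalg.norm(projected - planned, axis=1)

# Hypothetical landmark coordinates (mm) measured on a patient-specific phantom.
planned = [[12.0, 48.5, 3.2], [40.1, 52.3, 7.8], [25.6, 30.0, 5.1]]
projected = [[13.1, 49.0, 3.0], [41.5, 51.2, 8.3], [26.0, 31.8, 4.6]]

errors = projection_errors(planned, projected)
print(f"mean ± std error: {errors.mean():.1f} ± {errors.std():.1f} mm")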
Then, we sought to develop an alternative method for automatic registration between the virtual and real worlds. Specifically, we compared the AR tracking capabilities offered by the Vuforia SDK with those provided by blob detection algorithms and image processing of depth-camera information. We designed and conducted experiments in both simulated and surgical scenarios. Both tracking methods exhibited sufficiently low errors, affirming their suitability for clinical tasks. Consequently, we concluded that the choice between them can be based on the specific application requirements and the available resources.

Despite the significant achievements of our previous work, we recognized that the limited computational capabilities of AR glasses compared to computers constrain the development of highly complex AR applications. In the final chapter of this thesis, we aimed to combine the integration of 3D models into the real world offered by HMDs with the medical image processing capabilities of the 3D Slicer image computing platform. To achieve this, we used OpenIGTLink to establish a seamless communication link between both platforms, enabling the transfer of geometrical transformations and images in real time. The resulting application allows users to intuitively manipulate a virtual plane over a 3D model of a patient with their hands while simultaneously viewing real-time resliced CT images received from 3D Slicer. We tested this application in the clinical context of pedicle screw placement planning, and experimental outcomes indicated that the system is readily adaptable to almost any other clinical application. The key contribution lies in the potential for straightforward communication with other HMD devices for collaborative decision-making, or even with other OpenIGTLink-compatible devices, such as conventional tracking systems. This integration may not only enhance the precision of the AR glasses but also broaden the scope of possible applications. These results will facilitate advanced surgical practices and seamless collaboration among medical professionals.

Overall, this thesis demonstrates the transformative potential of integrating AR technology into surgical workflows, offering enhanced precision and improved outcomes.
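The real-time link described above, in which the HMD streams the pose of the hand-manipulated plane to 3D Slicer and receives resliced CT images back, can be sketched in Python with the pyigtl package. This is a minimal illustration under stated assumptions, not the thesis implementation: the port, device names, update rate, and the use of pyigtl itself (the HMD application would run an OpenIGTLink client in its own runtime) are all assumptions.

import time
import numpy as np
import pyigtl  # pip install pyigtl

# Connect to a 3D Slicer OpenIGTLinkIF connector listening on the default OpenIGTLink port.
client = pyigtl.OpenIGTLinkClient(host="127.0.0.1", port=18944)

# Pose of the virtual reslicing plane manipulated by the user's hands, expressed as a
# 4x4 homogeneous transform in the CT reference frame (identity plus a 25 mm offset here).
plane_to_ct = np.eye(4)
plane_to_ct[:3, 3] = [0.0, 0.0, 25.0]

for _ in range(200):  # bounded loop for the sketch; a real application streams continuously
    # Send the current plane pose to 3D Slicer.
    client.send_message(pyigtl.TransformMessage(plane_to_ct, device_name="ReslicePlaneToCT"))

    # Wait briefly for the CT slice resliced along that plane (device name assumed).
    image_msg = client.wait_for_message("ReslicedCT", timeout=1)
    if image_msg is not None:
        slice_pixels = image_msg.image  # numpy array with the resliced CT pixels
        print("Received slice with shape", slice_pixels.shape)

    time.sleep(0.05)  # roughly 20 Hz update loop

On the 3D Slicer side, an OpenIGTLinkIF connector would receive the transform, and a module such as the SlicerIGT Volume Reslice Driver (or a short Python script) could reslice the CT volume along the incoming plane and send the resulting image back over the same connection.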
Description
International Mention in the doctoral degree (Mención Internacional en el título de doctor)
Keywords
Medical image processing, Image-guided therapy, 3D bioprinting, Augmented reality