A Framework for Interactive Immersion into Imaging Data Using Augmented Reality

Date

2022-04-25

Abstract

Image acquisition scanners produce an ever-growing volume of 3D/4D multimodal data that requires extensive image analytics and visualization of the collected and generated information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has been proposed as a potential enhancement. This dissertation describes a framework (FI3D) for interactive and immersive experiences using an AR interface powered by image processing and analytics. The FI3D was designed to communicate with peripherals, including imaging scanners and HMDs, and to provide computational power for data acquisition and processing. The core of the FI3D is deployed on a dedicated unit that executes the computationally demanding processes in real time, while the HMD serves as the input/output (I/O) interface of the system. The FI3D is customizable, allowing users to integrate different workflows and incorporate third-party libraries. Using the FI3D as a foundation, two applications were developed, in the cardiac and urology medical domains, to exercise, test, and validate the system. First, cine MRI images were segmented with a machine learning model while an HMD simultaneously rendered the reconstructed surfaces. Second, a simulated environment for robot-assisted, MRI-guided transrectal prostate biopsies was developed, and user studies were conducted to evaluate the feasibility of AR visualization and interaction using the HoloLens HMD. Performance results showed that the system can sustain a stream of five 512 × 512 images per second and update the visual properties of the holograms at one update per 16 milliseconds. Interaction studies showed that a gaming joystick allowed more effective manipulation of a robotic structure than holographic menus or a mouse and keyboard. The FI3D can serve as the foundation for medical applications that benefit from AR visualization, removing various technical challenges from the development pipeline.
The versatile, immersive, and interactive experience offered by the AR interface may assist physicians with diagnosis and image-guided interventions, resulting in safer and faster procedures. This, in turn, can increase the accessibility of healthcare to the public and improve patient throughput.
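The quoted performance targets imply a decoupled pipeline: the processing unit ingests scanner frames at about 5 frames per second, while the rendering side refreshes hologram properties no more often than once per 16 ms (roughly 60 Hz). The sketch below illustrates that rate-limiting pattern in miniature; all names (`HologramUpdater`, `simulate`) are hypothetical illustrations, not the actual FI3D API.

```python
FRAME_PERIOD_MS = 200   # scanner delivers five 512x512 slices per second
UPDATE_PERIOD_MS = 16   # hologram visual properties refresh at most every 16 ms


class HologramUpdater:
    """Rate-limits visual-property updates pushed to the HMD.

    Hypothetical stand-in for a rendering interface; it only counts how
    many updates were actually applied.
    """

    def __init__(self, period_ms):
        self.period_ms = period_ms
        self.last_update_ms = -period_ms  # allow an update at t = 0
        self.updates_applied = 0

    def try_update(self, now_ms, properties):
        # Apply the update only if at least one full period has elapsed.
        if now_ms - self.last_update_ms >= self.period_ms:
            self.last_update_ms = now_ms
            self.updates_applied += 1
            return True
        return False


def simulate(duration_ms=1000):
    """Simulate one second of wall time in 1 ms ticks.

    Returns (frames_received, hologram_updates_applied).
    """
    updater = HologramUpdater(UPDATE_PERIOD_MS)
    frames = 0
    for t in range(duration_ms):
        if t % FRAME_PERIOD_MS == 0:   # a new slice arrives at 5 Hz
            frames += 1
        updater.try_update(t, {"opacity": 0.8})
    return frames, updater.updates_applied
```

Over one simulated second this yields 5 frames and 63 hologram updates (at t = 0, 16, ..., 992 ms), matching the abstract's two independent rates: image ingestion and hologram refresh are throttled separately rather than coupled to a single loop.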

Keywords

Augmented reality, Medical imaging, Visualization, Immersive and interactive interfaces, Computational framework

Citation

Portions of this document appear in: Velazco-Garcia, Jose D., Nikhil V. Navkar, Shidin Balakrishnan, Julien Abinahed, Abdulla Al-Ansari, Georges Younes, Adham Darweesh et al. "Preliminary evaluation of robotic transrectal biopsy system on an interventional planning software." In 2019 IEEE 19th International Conference on Bioinformatics and Bioengineering (BIBE), pp. 357-362. IEEE, 2019; and in: Velazco-Garcia, Jose D., Dipan J. Shah, Ernst L. Leiss, and Nikolaos V. Tsekos. "A modular and scalable computational framework for interactive immersion into imaging data with a holographic augmented reality interface." Computer Methods and Programs in Biomedicine 198 (2021): 105779; and in: Velazco-Garcia, Jose D., Nikhil V. Navkar, Shidin Balakrishnan, Julien Abi-Nahed, Khalid Al-Rumaihi, Adham Darweesh, Abdulla Al-Ansari et al. "End-user evaluation of software-generated intervention planning environment for transrectal magnetic resonance-guided prostate biopsies." The International Journal of Medical Robotics and Computer Assisted Surgery 17, no. 1 (2021): 1-12; and in: Velazco-Garcia, Jose D., Nikhil V. Navkar, Shidin Balakrishnan, Georges Younes, Julien Abi-Nahed, Khalid Al-Rumaihi, Adham Darweesh et al. "Evaluation of how users interface with holographic augmented reality surgical scenes: interactive planning MR-guided prostate biopsies." The International Journal of Medical Robotics and Computer Assisted Surgery 17, no. 5 (2021): e2290.