A Framework for Interactive Immersion into Imaging Data Using Augmented Reality

dc.contributor.advisor: Tsekos, Nikolaos V.
dc.contributor.advisor: Leiss, Ernst L.
dc.contributor.committeeMember: Eick, Christoph F.
dc.contributor.committeeMember: Navkar, Nikhil V.
dc.contributor.committeeMember: Webb, Andrew G.
dc.creator: Velazco, Jose D.
dc.creator.orcid: 0000-0002-1938-6770
dc.date.accessioned: 2023-05-26T15:12:28Z
dc.date.available: 2023-05-26T15:12:28Z
dc.date.created: May 2022
dc.date.issued: 2022-04-25
dc.date.updated: 2023-05-26T15:12:29Z
dc.description.abstract: Image acquisition scanners produce an ever-growing amount of 3D/4D multimodal data that requires extensive image analytics and visualization of the collected and generated information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has been commended as a potential enhancement. This dissertation describes a framework (FI3D) for interactive and immersive experiences using an AR interface powered by image processing and analytics. The FI3D was designed to communicate with peripherals, including imaging scanners and HMDs, and to provide computational power for data acquisition and processing. The core of the FI3D is deployed on a dedicated unit that executes the computationally demanding processes in real time; the HMD serves as the I/O interface of the system. The FI3D is customizable and allows users to integrate different workflows while incorporating third-party libraries. Using the FI3D as a foundation, two applications were developed in the cardiac and urology medical domains to experiment with, test, and validate the system. First, cine MRI images were segmented using a machine learning model while an HMD simultaneously rendered the reconstructed surfaces. Second, a simulated environment for robot-assisted, MRI-guided transrectal prostate biopsies was developed, and user studies were conducted to evaluate the feasibility of AR visualization and interaction using the HoloLens HMD. Performance results showed that the system can maintain an image stream of five 512 x 512 images per second and update the visual properties of the holograms once every 16 milliseconds. The interaction studies showed that a gaming joystick allowed more effective manipulation of a robotic structure than holographic menus or a mouse and keyboard. The FI3D can serve as the foundation for medical applications that benefit from AR visualization, removing various technical challenges from the development pipeline. The versatile, immersive, and interactive experience offered by the AR interface may assist physicians with diagnosis and image-guided interventions, resulting in safer and faster procedures. This can further increase the accessibility of healthcare to the public, yielding an increase in patient throughput.
dc.description.department: Computer Science, Department of
dc.format.digitalOrigin: born digital
dc.format.mimetype: application/pdf
dc.identifier.citation: Portions of this document appear in: Velazco-Garcia, Jose D., Nikhil V. Navkar, Shidin Balakrishnan, Julien Abinahed, Abdulla Al-Ansari, Georges Younes, Adham Darweesh et al. "Preliminary evaluation of robotic transrectal biopsy system on an interventional planning software." In 2019 IEEE 19th International Conference on Bioinformatics and Bioengineering (BIBE), pp. 357-362. IEEE, 2019; and in: Velazco-Garcia, Jose D., Dipan J. Shah, Ernst L. Leiss, and Nikolaos V. Tsekos. "A modular and scalable computational framework for interactive immersion into imaging data with a holographic augmented reality interface." Computer Methods and Programs in Biomedicine 198 (2021): 105779; and in: Velazco-Garcia, Jose D., Nikhil V. Navkar, Shidin Balakrishnan, Julien Abi-Nahed, Khalid Al-Rumaihi, Adham Darweesh, Abdulla Al-Ansari et al. "End-user evaluation of software-generated intervention planning environment for transrectal magnetic resonance-guided prostate biopsies." The International Journal of Medical Robotics and Computer Assisted Surgery 17, no. 1 (2021): 1-12; and in: Velazco-Garcia, Jose D., Nikhil V. Navkar, Shidin Balakrishnan, Georges Younes, Julien Abi-Nahed, Khalid Al-Rumaihi, Adham Darweesh et al. "Evaluation of how users interface with holographic augmented reality surgical scenes: interactive planning MR-guided prostate biopsies." The International Journal of Medical Robotics and Computer Assisted Surgery 17, no. 5 (2021): e2290.
dc.identifier.uri: https://hdl.handle.net/10657/14269
dc.language.iso: eng
dc.rights: The author of this work is the copyright owner. UH Libraries and the Texas Digital Library have their permission to store and provide access to this work. UH Libraries has secured permission to reproduce any and all previously published materials contained in the work. Further transmission, reproduction, or presentation of this work is prohibited except with permission of the author(s).
dc.subject: Augmented reality
dc.subject: Medical imaging
dc.subject: Visualization
dc.subject: Immersive and interactive interfaces
dc.subject: Computational framework
dc.title: A Framework for Interactive Immersion into Imaging Data Using Augmented Reality
dc.type.dcmi: Text
dc.type.genre: Thesis
thesis.degree.college: College of Natural Sciences and Mathematics
thesis.degree.department: Computer Science, Department of
thesis.degree.discipline: Computer Science
thesis.degree.grantor: University of Houston
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy
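
The abstract above reports two timing budgets for the FI3D framework: an image stream of five 512 x 512 images per second to the HMD and one hologram property update per 16 milliseconds. The short Python sketch below only illustrates what those two budgets look like as concurrent loops; it is not the FI3D implementation or its API, and every name in it (HologramState, the frame and update periods, the "send" stand-in) is an illustrative assumption.

# Hypothetical sketch of the two timing budgets reported in the abstract.
# Not the FI3D code base; all names and structures are illustrative.
import math
import threading
import time

FRAME_PERIOD_S = 1.0 / 5    # five 512 x 512 images per second (abstract's figure)
UPDATE_PERIOD_S = 0.016     # one hologram property update per 16 ms (abstract's figure)
RUN_TIME_S = 2.0            # short demo duration


class HologramState:
    """Shared visual properties of a rendered hologram (here, just opacity)."""

    def __init__(self):
        self._lock = threading.Lock()
        self.opacity = 1.0
        self.updates = 0

    def update(self, opacity):
        with self._lock:
            self.opacity = opacity
            self.updates += 1


def stream_frames(stop, sent_bytes):
    """Simulate the image stream: 'send' one 512 x 512 float32 slice every 200 ms."""
    while not stop.is_set():
        t0 = time.perf_counter()
        frame = bytes(512 * 512 * 4)      # stand-in for one image slice
        sent_bytes.append(len(frame))     # stand-in for pushing it to the HMD client
        time.sleep(max(0.0, FRAME_PERIOD_S - (time.perf_counter() - t0)))


def update_hologram(stop, state):
    """Simulate the 16 ms loop that updates visual properties of the AR scene."""
    t = 0.0
    while not stop.is_set():
        t0 = time.perf_counter()
        state.update(opacity=0.5 + 0.5 * math.sin(t))  # some time-varying property
        t += UPDATE_PERIOD_S
        time.sleep(max(0.0, UPDATE_PERIOD_S - (time.perf_counter() - t0)))


if __name__ == "__main__":
    stop = threading.Event()
    sent_bytes, state = [], HologramState()
    workers = [
        threading.Thread(target=stream_frames, args=(stop, sent_bytes)),
        threading.Thread(target=update_hologram, args=(stop, state)),
    ]
    for w in workers:
        w.start()
    time.sleep(RUN_TIME_S)
    stop.set()
    for w in workers:
        w.join()
    print(f"frames sent: {len(sent_bytes)} in {RUN_TIME_S:.1f} s "
          f"(~{len(sent_bytes) / RUN_TIME_S:.1f} per second)")
    print(f"hologram updates: {state.updates} "
          f"(~{state.updates / RUN_TIME_S:.0f} per second)")

Running the two loops concurrently reflects the design implied by the abstract, in which the streaming of image data and the updating of hologram properties proceed on independent budgets, so a slow image transfer would not stall the visual updates; how FI3D actually schedules these tasks is described in the dissertation itself.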

Files

Original bundle

Name: VELAZCO-DISSERTATION-2022.pdf
Size: 4.24 MB
Format: Adobe Portable Document Format

License bundle

Name: PROQUEST_LICENSE.txt
Size: 4.43 KB
Format: Plain Text

Name: LICENSE.txt
Size: 1.81 KB
Format: Plain Text