Augmented Reality HMD Research

HMD for the First Assistant in Robotic Surgery

In robotic-assisted laparoscopic surgery, the first assistant (FA) stands at the bedside to assist with the intervention, while the surgeon sits at the console teleoperating the robot. We are developing ARssist, an augmented reality application based on an optical see-through head-mounted display, to aid the FA. ARssist provides visualization of the robotic instruments and endoscope “inside” the patient’s body, as well as several ways to render the stereo endoscopy on the head-mounted display.
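
To make the overlay geometry concrete, the sketch below shows how an instrument pose reported in the robot base frame could be composed into the HMD frame for rendering. This is a minimal illustration only; the frame names and identity placeholder transforms are assumptions, not the actual ARssist interfaces (in practice the transforms come from head tracking, marker registration, and the da Vinci kinematics).

    # Hypothetical transform chain for overlaying a robotic instrument on an
    # optical see-through HMD (frame names are illustrative assumptions).
    import numpy as np

    def compose(*transforms):
        """Compose a chain of 4x4 homogeneous transforms, left to right."""
        result = np.eye(4)
        for T in transforms:
            result = result @ T
        return result

    # Identity placeholders; real values come from tracking and kinematics.
    T_hmd_world = np.eye(4)        # world -> HMD, from head tracking
    T_world_robotbase = np.eye(4)  # robot base -> world, from registration
    T_robotbase_tip = np.eye(4)    # instrument tip -> robot base, from kinematics

    # Instrument tip expressed in the HMD frame; this pose would drive the
    # rendered overlay of the instrument "inside" the patient.
    T_hmd_tip = compose(T_hmd_world, T_world_robotbase, T_robotbase_tip)
    tip_position_in_hmd = T_hmd_tip[:3, 3]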

Publications

Greene, Nicholas; Luo, Wenkai; Kazanzides, Peter: dVPose: Automated Data Collection and Dataset for 6D Pose Estimation of Robotic Surgical Instruments. In: IEEE Intl. Symp. on Medical Robotics (ISMR), 2023.

Qian, Long; Zhang, Xiran; Deguet, Anton; Kazanzides, Peter: ARAMIS: Augmented Reality Assistance for Minimally Invasive Surgery Using a Head-Mounted Display. In: Medical Image Computing and Computer-Assisted Intervention (MICCAI), pp. 74-82, 2019.

Qian, Long; Deguet, Anton; Wang, Zerui; Liu, Yun-Hui; Kazanzides, Peter: Augmented Reality Assisted Instrument Insertion and Tool Manipulation for the First Assistant in Robotic Surgery. In: IEEE Intl. Conf. on Robotics and Automation (ICRA), pp. 5173-5179, 2019.

Qian, Long; Deguet, Anton; Kazanzides, Peter: ARssist: augmented reality on a head-mounted display for the first assistant in robotic surgery. In: IET Healthcare Technology Letters, vol. 5, no. 5, pp. 194-200, 2018.

HMD for Training of Medical Procedures

We have been researching the application of augmented reality overlays on an HMD to support training of medical procedures. Our initial project supported the training of combat medics; for example, performing a needle decompression procedure in response to a diagnosed tension pneumothorax. We developed application software, using the Unity3D framework, that reads an augmented workflow from a JSON file. More recently, we have been applying this approach to the training of neurosurgical procedures, such as ventriculostomy.
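
As a rough illustration of the workflow-driven design, the sketch below shows a possible JSON workflow and a loader for it. The schema and field names are assumptions for illustration; the actual format read by the Unity3D application is not shown here.

    # Illustrative sketch only: the JSON schema below is an assumption, not the
    # actual workflow format used by the training application.
    import json

    workflow_json = """
    {
      "procedure": "needle_decompression",
      "steps": [
        {"id": 1, "instruction": "Identify the second intercostal space",
         "overlay": "rib_landmarks"},
        {"id": 2, "instruction": "Insert needle at the midclavicular line",
         "overlay": "insertion_site"}
      ]
    }
    """

    def load_workflow(text):
        """Parse a workflow description and return the procedure name and steps."""
        data = json.loads(text)
        return data["procedure"], data["steps"]

    procedure, steps = load_workflow(workflow_json)
    for step in steps:
        # In the training application each step would drive an AR overlay and
        # instruction text on the HMD; here we simply print it.
        print(f"Step {step['id']}: {step['instruction']} (overlay: {step['overlay']})")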

Publications

Azimi, Ehsan; Winkler, Alexander; Tucker, Emerson; Qian, Long; Doswell, Jayfus; Navab, Nassir; Kazanzides, Peter: Can Mixed-Reality Improve the Training of Medical Procedures? In: IEEE Engin. in Medicine and Biology Conf. (EMBC), pp. 4065-4068, 2018.

Azimi, Ehsan; Winkler, Alexander; Tucker, Emerson; Qian, Long; Sharma, Manyu; Doswell, Jayfus; Navab, Nassir; Kazanzides, Peter: Evaluation of Optical See-Through Head-Mounted Displays in Training for Critical Care and Trauma. In: IEEE Virtual Reality (VR), pp. 96-97, 2018.

Calibration of Optical See-Through HMDs

For an optical see-through HMD, it is necessary to calibrate the display with respect to the user’s eyes, so that augmented overlays appear at the correct locations in the physical world. We have developed new methods for calibrating both monocular and stereo display systems.
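
The publications below build on the Single Point Active Alignment Method (SPAAM), in which the user repeatedly aligns an on-screen point with a tracked 3D point and the resulting 2D-3D correspondences determine a 3x4 projection matrix. The sketch below shows only the basic linear (direct linear transform) solution under that assumption; it omits the robustness and stereo constraints contributed in the papers.

    # Minimal SPAAM-style calibration sketch: estimate a 3x4 projection P from
    # paired 2D screen points and 3D tracked points (at least 6 alignments).
    import numpy as np

    def estimate_projection(points_3d, points_2d):
        """Direct linear transform: solve for P (3x4) such that x ~ P X."""
        A = []
        for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
            Xh = [X, Y, Z, 1.0]
            A.append([*Xh, 0, 0, 0, 0, *[-u * c for c in Xh]])
            A.append([0, 0, 0, 0, *Xh, *[-v * c for c in Xh]])
        # The solution (up to scale) is the right singular vector with the
        # smallest singular value.
        _, _, Vt = np.linalg.svd(np.asarray(A))
        return Vt[-1].reshape(3, 4)

    # Dummy data for illustration: synthesize correspondences from a known P.
    pts_3d = np.random.rand(8, 3) + [0.0, 0.0, 1.0]
    P_true = np.hstack([np.eye(3), np.zeros((3, 1))])
    proj = np.hstack([pts_3d, np.ones((8, 1))]) @ P_true.T
    pts_2d = proj[:, :2] / proj[:, 2:]
    P = estimate_projection(pts_3d, pts_2d)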

Publications

Azimi, Ehsan; Qian, Long; Kazanzides, Peter; Navab, Nassir: Robust Optical See-Through Head-Mounted Display Calibration: Taking Anisotropic Nature of User Interaction Errors into Account. In: IEEE Virtual Reality, pp. 219-220, Los Angeles, CA, 2017.

Qian, Long; Winkler, Alexander; Fuerst, Bernhard; Kazanzides, Peter; Navab, Nassir: Modeling Physical Structure as Additional Constraints for Stereoscopic Optical See-Through Head-Mounted Display Calibration. In: IEEE Intl. Symp. on Mixed and Augmented Reality (ISMAR), pp. 154-155, Merida, Mexico, 2016.

Qian, Long; Winkler, Alexander; Fuerst, Bernhard; Kazanzides, Peter; Navab, Nassir: Reduction of Interaction Space in Single Point Active Alignment Method for Optical See-Through Head-Mounted Display Calibration. In: IEEE Intl. Symp. on Mixed and Augmented Reality (ISMAR), pp. 156-157, Merida, Mexico, 2016.

Sensor Fusion to Overcome Occlusions in HMD Applications

Augmented reality overlays require a method to track the position of the user’s head (HMD) with respect to objects in the environment. For example, when overlaying tumor margins for image-guided neurosurgery, it is necessary to know the location of the HMD with respect to the patient. This tracking is generally provided by an optical tracking system (i.e., one or more cameras), which can be mounted on the HMD (“inside-out” tracking) or externally (“outside-in” tracking). In either case, the camera view can become fully or partially occluded.

We developed a method to handle these occlusions by fusing measurements from the optical tracker and an inertial measurement unit (IMU). The method estimates the biases of the inertial sensors while the optical tracker provides full 6 degree-of-freedom (DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, this 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose. When all markers are occluded, position tracking relies on the inertial sensors, using the biases that were estimated while optical tracking was available. Our experiments demonstrate that this approach can effectively handle long periods (at least several minutes) of partial occlusion and relatively short periods (up to a few seconds) of total occlusion.
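
The sketch below illustrates only the mode-switching logic described above (full optical pose, position-only optical measurement, or total occlusion). It is not the published filter: the fixed gains, first-order integration, simple bias update, and the assumption of a gravity-compensated world-frame accelerometer reading are all illustrative simplifications.

    # Simplified sketch of optical/IMU fusion with three tracking modes.
    import numpy as np

    def skew(w):
        """Skew-symmetric matrix so that skew(w) @ v == np.cross(w, v)."""
        return np.array([[0.0, -w[2], w[1]],
                         [w[2], 0.0, -w[0]],
                         [-w[1], w[0], 0.0]])

    class OpticalImuFusion:
        def __init__(self, bias_gain=0.1):
            self.R = np.eye(3)        # orientation estimate (body -> world)
            self.p = np.zeros(3)      # position estimate
            self.v = np.zeros(3)      # velocity estimate (for dead reckoning)
            self.gyro_bias = np.zeros(3)
            self.bias_gain = bias_gain

        def step(self, dt, gyro, accel_world, optical_R=None, optical_p=None):
            # Always propagate orientation with the bias-corrected gyro
            # (first-order small-angle integration).
            w = gyro - self.gyro_bias
            self.R = self.R @ (np.eye(3) + skew(w) * dt)

            if optical_R is not None and optical_p is not None:
                # All markers visible: adopt the full 6-DOF optical pose and
                # nudge the gyro bias toward the observed orientation discrepancy.
                dR = self.R.T @ optical_R
                theta = 0.5 * np.array([dR[2, 1] - dR[1, 2],
                                        dR[0, 2] - dR[2, 0],
                                        dR[1, 0] - dR[0, 1]])
                self.gyro_bias -= self.bias_gain * theta
                self.R, self.p = optical_R, optical_p
                self.v[:] = 0.0
            elif optical_p is not None:
                # Partial occlusion (at least one marker): combine the 3-DOF
                # optical position with the IMU-derived orientation.
                self.p = optical_p
                self.v[:] = 0.0
            else:
                # Total occlusion: dead-reckon position from the (assumed
                # gravity-compensated, world-frame) accelerometer reading.
                self.v = self.v + accel_world * dt
                self.p = self.p + self.v * dt
            return self.R, self.p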

Publications

He, Changyu; Kazanzides, Peter; Sen, Hasan Tutkun; Kim, Sungmin; Liu, Yue: An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation. In: Sensors, vol. 15, no. 7, pp. 16448-16465, 2015.

He, Changyu; Sen, H. Tutkun; Kim, Sungmin; Sadda, Praneeth; Kazanzides, Peter: Fusion of Inertial Sensing to Compensate for Partial Occlusions in Optical Tracking Systems. In: MICCAI Workshop on Augmented Environments for Computer-Assisted Interventions (AE-CAI), Springer LNCS 8678, pp. 60-69, Boston, MA, 2014.

HMD for Image-Guided Neurosurgery

The goal is to enable a surgeon wearing the HMD to perform image-guided surgery while keeping hands and eyes focused on the patient. The developed system does not display high-resolution preoperative images on the HMD, but rather shows simple graphics derived from the navigation information. For example, the navigation information can include models derived from preoperative images of the patient anatomy, such as biopsy target points and tumor outlines. The system presents one or more “picture-in-picture” virtual views of the preoperative data and shows the positions of tracked instruments with respect to these views.
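
The sketch below shows the basic navigation geometry behind such views: a tracked instrument tip is mapped into the preoperative image frame through a patient registration transform, where it can be displayed relative to a target. The frame names, placeholder transforms, and target are illustrative assumptions, not the system’s actual interfaces.

    # Minimal sketch: map a tracked instrument tip into the preoperative image
    # frame via a patient registration transform (identity placeholder here).
    import numpy as np

    def to_homogeneous(p):
        return np.append(p, 1.0)

    # Registration of the patient (tracker) frame to the preoperative image
    # frame, e.g. from paired-point or surface registration.
    T_image_patient = np.eye(4)

    # Instrument tip position reported by the tracking system (dummy values, mm).
    tip_in_patient = np.array([10.0, -5.0, 30.0])

    # Tip expressed in the image frame; this is the point shown relative to the
    # tumor outline or biopsy target in the picture-in-picture view.
    tip_in_image = (T_image_patient @ to_homogeneous(tip_in_patient))[:3]

    # Distance to a hypothetical biopsy target defined in the image frame.
    target_in_image = np.array([12.0, -4.0, 28.0])
    distance_to_target = np.linalg.norm(tip_in_image - target_in_image)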

Publications

Azimi, Ehsan; Niu, Zhiyuan; Stiber, Maia; Greene, Nicholas; Liu, Ruby; Molina, Camilo; Huang, Judy; Huang, Chien-Ming; Kazanzides, Peter: An Interactive Mixed Reality Platform for Bedside Surgical Procedures. In: Medical Image Computing and Computer-Assisted Intervention (MICCAI), pp. 65-75, 2020.

Azimi, Ehsan; Liu, Ruby; Molina, Camilo; Huang, Judy; Kazanzides, Peter: Interactive Navigation System in Mixed-Reality for Neurosurgery. In: IEEE Virtual Reality (VR), pp. 783-784, 2020.

Azimi, Ehsan; Molina, Camilo; Chang, Alexander; Huang, Judy; Huang, Chien-Ming; Kazanzides, Peter: Interactive Training and Operation Ecosystem for Surgical Tasks in Mixed Reality. In: MICCAI Workshop on OR 2.0 Context-Aware Operating Theaters, pp. 20-29, 2018.

Sadda, Praneeth; Azimi, Ehsan; Jallo, George; Doswell, Jayfus; Kazanzides, Peter: Surgical navigation with a head-mounted tracking system and display. In: Medicine Meets Virtual Reality (MMVR), San Diego, CA, 2013.

Azimi, Ehsan; Doswell, Jayfus; Kazanzides, Peter: Augmented reality goggles with an integrated tracking system for navigation in neurosurgery. In: IEEE Virtual Reality, pp. 123-124, Orange County, CA, 2012.

Press Coverage

JHU Hub Article, ‘Mixed Reality’ Makes for Better Surgeons, May 17, 2019

WUSA, Channel 9 (Washington DC), TV News Story, July 31, 2019