Research by Johns Hopkins University computer scientists makes up one-sixth of the total accepted publications and abstracts at the 2024 International Conference on Information Processing in Computer-Assisted Interventions (IPCAI), to be held June 18 and 19 in Barcelona, Spain. IPCAI is an important venue for disseminating innovative peer-reviewed research in computer-assisted surgery and minimally invasive interventions, attracting clinicians, engineers, and computer science researchers from various backgrounds, including machine learning, robotics, computer vision, medical imaging, data science, and sensing technologies.
Two papers from the Advanced Robotics and Computationally AugmenteD Environments (ARCADE) Lab—led by Mathias Unberath, the John C. Malone Assistant Professor of Computer Science as of July 1—are finalists for Best Paper Awards: “Neural Digital Twins: Reconstructing Complex Medical Environments for Spatial Planning in Virtual Reality” and “Stand in Surgeon’s Shoes: Swapping Roles in Virtual Reality to Enhance Teamwork in Surgery.”
The first paper leverages Neuralangelo, a TIME Best Invention of 2023, to construct digital twins of operating rooms for immersive space-planning applications. Advance planning and positioning are essential for streamlining surgical processes and enhancing patient safety, operating efficiency, and communication, the researchers say, especially in complex procedures and increasingly cluttered environments such as the operating room. To this end, members of the ARCADE Lab reconstructed highly realistic “digital twins” of two operating rooms using Neuralangelo and conducted user studies on planning tasks in the resulting virtual reality environments. Their findings indicate that future surgical planning tools may benefit from letting users supply their own video material for a more immersive and comprehensive planning process: the highly realistic simulations increased user engagement and sense of presence compared with simpler, conventional virtual reality environments.
The lab’s second paper investigates whether cross-training in teammates’ roles can enhance their sense of teamwork. The researchers evaluated a new training curriculum in which a surgeon and a radiologist traded roles in a risk-free virtual reality operating room. They demonstrated that, for novice non-surgeon participants, prior exposure to the surgeon’s task increased engagement during a simulated surgery and reduced frustration when acting in a supporting role.
The ARCADE Lab is also presenting “Take a Shot! Natural Language Control of Intelligent Robotic X-ray Systems in Surgery,” which introduces an English-language voice interface for controlling an advanced robotic X-ray imaging system so that surgeons can operate it hands-free, without pausing a procedure. With the researchers’ interface, a surgeon simply describes the image they want; a large language model transcribes the spoken command and asks for any needed clarification, and the system then automatically acquires the desired image.
The ARCADE Lab has three other full papers at IPCAI:
- “OneSLAM to Map Them All: A Universal Approach to SLAM for Endoscopic Imaging,” which presents a method for surgical camera tracking and 3D reconstruction that works for various endoscopic procedures;
- “Cognitive effort detection for telerobotic surgery via personalized pupil response modeling,” which introduces a method to measure the cognitive load of a user based on eye gaze tracking and pupillometry, or the measurement of pupil size and reactivity; and
- “An Endoscopic Chisel: Intraoperative Imaging Carves 3D Anatomical Models,” which demonstrates a vision-based method of updating preoperative anatomical models during destructive sinus surgeries.
In addition to these six full papers, the lab will present two late-breaking results in the form of long abstract presentations:
- “Cognitive load in telerobotic surgery: A comparison of eye tracker designs” by Roger D. Soberanis-Mukul, Paola Ruiz Puentes, Ayberk Acar, Iris Gupta, Joyraj Bhowmick, Yizhou Li, Ahmed Ghazi, Jie Ying Wu, and Mathias Unberath; and
- “Near-Infrared Beacons: Tracking Anatomy with Bio-Compatible Fluorescent Dots for Mixed Reality Surgical Navigation” by Wenhao Gu, Justin Opfermann, Jonathan Knopf, Axel Krieger, and Mathias Unberath.
Other Johns Hopkins-affiliated research to be presented at IPCAI includes:
- “Beyond the Manual Touch: Situational-Aware Force Control for Increased Safety in Robot-Assisted Skull Base Surgery” by Hisashi Ishida, Deepa Galaiya, Nimesh Nagururu, Francis Creighton, Peter Kazanzides, Russell Taylor, and Manish Sahu; and
- “Enhancing robotic telesurgery with sensorless haptic feedback” by Nural Yilmaz, Brendan Burkhart, Anton Deguet, Peter Kazanzides, and Ugur Tumerdem.