All Publications


  • Automated Radiomic Analysis of Vestibular Schwannomas and Inner Ears Using Contrast-Enhanced T1-Weighted and T2-Weighted Magnetic Resonance Imaging Sequences and Artificial Intelligence. Otology & neurotology: official publication of the American Otological Society, American Neurotology Society [and] European Academy of Otology and Neurotology Neves, C. A., Liu, G. S., El Chemaly, T., Bernstein, I. A., Fu, F., Blevins, N. H. 2023

    Abstract

    To objectively evaluate vestibular schwannomas (VSs) and their spatial relationships with the ipsilateral inner ear (IE) in magnetic resonance imaging (MRI) using deep learning. Cross-sectional study. A total of 490 adults with VS, high-resolution MRI scans, and no previous neurotologic surgery. MRI studies of VS patients were split into training (390 patients) and test (100 patients) sets. A three-dimensional convolutional neural network model was trained to segment VS and IE structures using contrast-enhanced T1-weighted and T2-weighted sequences, respectively. Manual segmentations were used as ground truths. Model performance was evaluated on the test set and on an external set of 100 VS patients from a public data set (Vestibular-Schwannoma-SEG). Dice score, relative volume error, average symmetric surface distance, 95th-percentile Hausdorff distance, and centroid locations. Dice scores for VS and IE volume segmentations were 0.91 and 0.90, respectively. On the public data set, the model segmented VS tumors with a Dice score of 0.89 ± 0.06 (mean ± standard deviation), relative volume error of 9.8 ± 9.6%, average symmetric surface distance of 0.31 ± 0.22 mm, and 95th-percentile Hausdorff distance of 1.26 ± 0.76 mm. Predicted VS segmentations overlapped with ground truth segmentations in all test subjects. Mean errors of predicted VS volume, VS centroid location, and IE centroid location were 0.05 cm³, 0.52 mm, and 0.85 mm, respectively. A deep learning system can segment VS and IE structures in high-resolution MRI scans with excellent accuracy. This technology offers promise to improve the clinical workflow for assessing VS radiomics and enhance the management of VS patients.

    View details for DOI 10.1097/MAO.0000000000003959

    View details for PubMedID 37464458
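
    To make the outcome measures reported above concrete, the following is a minimal sketch (not the authors' code) of how the Dice score and relative volume error can be computed from a predicted and a ground-truth segmentation, assuming both are available as same-shaped boolean NumPy arrays; the toy masks and function names are illustrative only.

        import numpy as np

        def dice_score(pred, truth):
            """Dice = 2|A ∩ B| / (|A| + |B|) for two binary masks."""
            intersection = np.logical_and(pred, truth).sum()
            denom = pred.sum() + truth.sum()
            return 2.0 * intersection / denom if denom > 0 else 1.0

        def relative_volume_error(pred, truth):
            """Absolute volume difference as a fraction of the ground-truth volume."""
            return abs(int(pred.sum()) - int(truth.sum())) / truth.sum()

        # Toy example: two overlapping cubic masks standing in for VS segmentations.
        truth = np.zeros((32, 32, 32), dtype=bool)
        truth[8:20, 8:20, 8:20] = True
        pred = np.zeros_like(truth)
        pred[9:21, 9:21, 9:21] = True

        print(f"Dice: {dice_score(pred, truth):.3f}")
        print(f"Relative volume error: {100 * relative_volume_error(pred, truth):.1f}%")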

  • Stereoscopic calibration for augmented reality visualization in microscopic surgery. International journal of computer assisted radiology and surgery El Chemaly, T., Athayde Neves, C., Leuze, C., Hargreaves, B., Blevins, N. H. 2023

    Abstract

    Middle and inner ear procedures target hearing loss, infections, and tumors of the temporal bone and lateral skull base. Despite the advances in surgical techniques, these procedures remain challenging due to limited haptic and visual feedback. Augmented reality (AR) may improve operative safety by allowing the 3D visualization of anatomical structures from preoperative computed tomography (CT) scans on the real intraoperative microscope video feed. The purpose of this work was to develop a real-time CT-augmented stereo microscope system using camera calibration and electromagnetic (EM) tracking. A 3D printed and electromagnetically tracked calibration board was used to compute the intrinsic and extrinsic parameters of the surgical stereo microscope. These parameters were used to establish a transformation between the EM tracker coordinate system and the stereo microscope image space such that any tracked 3D point can be projected onto the left and right images of the microscope video stream. This allowed the augmentation of the microscope feed of a 3D printed temporal bone with its corresponding CT-derived virtual model. Finally, the calibration board was also used for evaluating the accuracy of the calibration. We evaluated the accuracy of the system by calculating the registration error (RE) in 2D and 3D in a microsurgical laboratory setting. Our calibration workflow achieved an RE of 0.11 ± 0.06 mm in 2D and 0.98 ± 0.13 mm in 3D. In addition, we overlaid a 3D CT model on the microscope feed of a 3D resin-printed model of a segmented temporal bone. The system exhibited low latency and good registration accuracy. We present the calibration of an electromagnetically tracked surgical stereo microscope for augmented reality visualization. The calibration method achieved accuracy within a range suitable for otologic procedures. The AR process introduces enhanced visualization of the surgical field while allowing depth perception.

    View details for DOI 10.1007/s11548-023-02980-5

    View details for PubMedID 37450175

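
    As a rough illustration of the projection step described above, the sketch below maps an EM-tracked 3D point into pixel coordinates with a simple pinhole model, assuming an intrinsic matrix K and a tracker-to-camera rigid transform obtained from a prior calibration step; the numeric values and names are invented for illustration and are not taken from the paper.

        import numpy as np

        # Assumed (made-up) intrinsics for one camera of the stereo microscope.
        K = np.array([[2400.0,    0.0, 960.0],   # fx,  0, cx  (pixels)
                      [   0.0, 2400.0, 540.0],   #  0, fy, cy
                      [   0.0,    0.0,   1.0]])

        # Assumed rigid transform from EM tracker coordinates to the camera frame.
        T_cam_from_tracker = np.eye(4)
        T_cam_from_tracker[:3, 3] = [0.0, 0.0, 150.0]   # e.g. 150 mm along the optical axis

        def project(point_tracker_mm):
            """Map a 3D point in tracker space (mm) to pixel coordinates (u, v)."""
            p_h = np.append(point_tracker_mm, 1.0)        # homogeneous coordinates
            p_cam = (T_cam_from_tracker @ p_h)[:3]        # into the camera frame
            uvw = K @ p_cam                               # pinhole projection
            return uvw[:2] / uvw[2]                       # perspective divide

        print(project(np.array([5.0, -2.0, 10.0])))       # -> approximate pixel location

    Repeating the same projection with the second camera's intrinsics and extrinsics would give the corresponding point in the other image of the stereo pair.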

  • The user experience design of a novel microscope within SurgiSim, a virtual reality surgical simulator. International journal of computer assisted radiology and surgery de Lotbiniere-Bassett, M., Volpato Batista, A., Lai, C., El Chemaly, T., Dort, J., Blevins, N., Lui, J. 2022

    Abstract

    PURPOSE: Virtual reality (VR) simulation has the potential to advance surgical education, procedural planning, and intraoperative guidance. "SurgiSim" is a VR platform developed for the rehearsal of complex procedures using patient-specific anatomy, high-fidelity stereoscopic graphics, and haptic feedback. SurgiSim is the first VR simulator to include a virtual operating room microscope. We describe the process of designing and refining the VR microscope user experience (UX) and user interaction (UI) to optimize surgical rehearsal and education. METHODS: Human-centered VR design principles were applied in the design of the SurgiSim microscope to optimize the user's sense of presence. Throughout the UX's development, the team of developers met regularly with surgeons to gather end-user feedback. Supplemental testing was performed on four participants. RESULTS: Through observation and participant feedback, we made iterative design upgrades to the SurgiSim platform. We identified the following key characteristics of the VR microscope UI: overall appearance, hand controller interface, and microscope movement. CONCLUSION: Our design process identified challenges arising from the disparity between VR and physical environments that pertain to microscope education and deployment. These roadblocks were addressed using creative solutions. Future studies will investigate the efficacy of VR surgical microscope training on real-world microscope skills as assessed by validated performance metrics.

    View details for DOI 10.1007/s11548-022-02727-8

    View details for PubMedID 35933491