
Brooke Krajancich
Ph.D. Student in Electrical Engineering, admitted Autumn 2018
Honors & Awards
-
Knight-Hennessy Fellowship, Stanford University (2018)
Education & Certifications
-
BPhil (Hons), University of Western Australia, Electrical Engineering & Mathematics (2017)
All Publications
-
Towards Retina-Quality VR Video Streaming: 15 ms Could Save You 80% of Your Bandwidth
ACM SIGCOMM Computer Communication Review
2022; 52 (1): 11-19
Web of Science ID: 000763859300002
-
A Perceptual Model for Eccentricity-dependent Spatio-temporal Flicker Fusion and its Applications to Foveated Graphics
ACM Transactions on Graphics
2021; 40 (4)
DOI: 10.1145/3450626.3459784
Web of Science ID: 000674930900014
-
Optimizing Depth Perception in Virtual and Augmented Reality through Gaze-contingent Stereo Rendering
ACM Transactions on Graphics
2020; 39 (6)
DOI: 10.1145/3414685.3417820
Web of Science ID: 000595589100109
-
Factored Occlusion: Single Spatial Light Modulator Occlusion-capable Optical See-through Augmented Reality Display
IEEE Transactions on Visualization and Computer Graphics
2020: 1871–79
Abstract
Occlusion is a powerful visual cue that is crucial for depth perception and realism in optical see-through augmented reality (OST-AR). However, existing OST-AR systems additively overlay physical and digital content with beam combiners - an approach that does not easily support mutual occlusion, resulting in virtual objects that appear semi-transparent and unrealistic. In this work, we propose a new type of occlusion-capable OST-AR system. Rather than additively combining the real and virtual worlds, we employ a single digital micromirror device (DMD) to merge the respective light paths in a multiplicative manner. This unique approach allows us to simultaneously block light incident from the physical scene on a pixel-by-pixel basis while also modulating the light emitted by a light-emitting diode (LED) to display digital content. Our technique builds on mixed binary/continuous factorization algorithms to optimize time-multiplexed binary DMD patterns and their corresponding LED colors to approximate a target augmented reality (AR) scene. In simulations and with a prototype benchtop display, we demonstrate hard-edge occlusions, plausible shadows, and also gaze-contingent optimization of this novel display mode, which only requires a single spatial light modulator.
DOI: 10.1109/TVCG.2020.2973443
Web of Science ID: 000523746000006
PubMedID: 32070978
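The factorization described in the abstract above can be pictured as a small alternating optimization: binary DMD masks are refined by coordinate descent while the LED colors for each time-multiplexed frame are re-fit by least squares. The sketch below is a minimal illustration of that general idea, not the paper's actual algorithm; the function name, frame count, initialization, and update rules are all assumptions.

```python
import numpy as np

def factorize_frames(target, num_frames=8, iters=20, seed=0):
    """Illustrative alternating binary/continuous factorization.

    target: (H, W, 3) float array in [0, 1], the desired AR overlay.
    Returns binary masks (num_frames, H, W), LED colors (num_frames, 3),
    and the reconstruction whose time average approximates the target.
    """
    h, w, _ = target.shape
    t = target.reshape(-1, 3)                      # (N, 3) pixels
    rng = np.random.default_rng(seed)
    masks = rng.random((num_frames, h * w)) > 0.5  # random binary init
    colors = rng.random((num_frames, 3))

    for _ in range(iters):
        # Continuous step: least-squares LED colors for fixed binary masks.
        colors, *_ = np.linalg.lstsq(masks.T.astype(float), t, rcond=None)
        colors = np.clip(colors, 0.0, 1.0)

        # Binary step: coordinate-descent update of each mask pixel.
        for k in range(num_frames):
            others = masks.astype(float).T @ colors - np.outer(masks[k], colors[k])
            err_off = np.sum((others - t) ** 2, axis=1)             # pixel off
            err_on = np.sum((others + colors[k] - t) ** 2, axis=1)  # pixel on
            masks[k] = err_on < err_off

    recon = masks.astype(float).T @ colors
    return masks.reshape(num_frames, h, w), colors, recon.reshape(h, w, 3)
```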
-
Handheld volumetric manual compression-based quantitative micro-elastography.
Journal of Biophotonics
2020
Abstract
Compression optical coherence elastography typically requires a mechanical actuator to impart a controlled uniform strain to the sample. However, for handheld scanning, this adds complexity to the design of the probe and the actuator stroke limits the amount of strain that can be applied. In this work, we present a new volumetric imaging approach that utilises bidirectional manual compression via the natural motion of the user's hand to induce strain to the sample, realising compact, actuator-free, handheld compression optical coherence elastography. In this way, we are able to demonstrate rapid acquisition of three-dimensional quantitative micro-elastography (QME) datasets of a tissue volume (6 × 6 × 1 mm) in 3.4 seconds. We characterise the elasticity sensitivity of this freehand manual compression approach using a homogeneous silicone phantom and demonstrate comparable performance to a bench-top mounted, actuator-based approach. In addition, we demonstrate handheld volumetric manual compression-based QME on a tissue-mimicking phantom with an embedded stiff inclusion and on freshly excised human breast specimens from both mastectomy and wide local excision surgeries. Tissue results are co-registered with post-operative histology, verifying the capability of our approach to measure the elasticity of tissue and to distinguish stiff tumor from surrounding soft benign tissue.
DOI: 10.1002/jbio.201960196
PubMedID: 32057188
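As background for the elastography entries, one common way to turn OCT-derived displacement maps into quantitative elasticity is to estimate local axial strain as the least-squares slope of displacement versus depth and divide the stress inferred from a pre-characterised compliant layer by that strain. The sketch below shows that generic compression-OCE computation; it is not the authors' exact pipeline, and the function names, fitting window, and layer-stress input are assumptions.

```python
import numpy as np

def local_axial_strain(displacement, dz, window=20):
    """Local axial strain as the depth-gradient of axial displacement.

    displacement: (depth, lateral) axial displacement map in metres, e.g.
                  derived from OCT phase differences between paired B-scans.
    dz: axial pixel spacing in metres.
    window: depth window (in pixels) for the least-squares slope fit.
    """
    depth, lateral = displacement.shape
    z = np.arange(window) * dz
    z_centred = z - z.mean()
    denom = np.sum(z_centred ** 2)
    strain = np.zeros((depth - window, lateral))
    for i in range(depth - window):
        seg = displacement[i:i + window, :]
        strain[i] = z_centred @ (seg - seg.mean(axis=0)) / denom  # LSQ slope
    return strain

def elasticity(layer_stress, sample_strain):
    """Tangent-modulus estimate: stress from a pre-characterised compliant
    layer divided by the local sample strain."""
    return layer_stress / np.clip(np.abs(sample_strain), 1e-9, None)
```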
-
A Patient-Specific Mixed-Reality Visualization Tool for Thoracic Surgical Planning.
The Annals of Thoracic Surgery
2020
Abstract
Identifying small lung lesions during minimally invasive thoracic surgery can be challenging. We describe 3D mixed-reality visualization technology that may facilitate non-invasive nodule localization. A software application and medical image processing pipeline were developed for the Microsoft HoloLens to incorporate patient-specific data and provide a mixed-reality tool to explore and manipulate chest anatomy with a custom-designed user interface featuring gesture and voice recognition. A needs assessment between engineering and clinical disciplines identified the potential utility of mixed-reality technology in facilitating safe and effective resection of small lung nodules. Through an iterative process, we developed a prototype employing a wearable headset that allows the user to: (1) view a patient's original preoperative imaging, (2) manipulate a 3D rendering of that patient's chest anatomy including the bronchial, osseous, and vascular structures, and (3) simulate lung deflation and surgical instrument placement. Mixed-reality visualization during surgical planning may facilitate accurate and rapid identification of small lung lesions during minimally invasive surgeries and reduce the need for additional invasive pre-operative localization procedures.
DOI: 10.1016/j.athoracsur.2020.01.060
PubMedID: 32145195
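The abstract above describes a medical image processing pipeline that feeds patient-specific anatomy into a HoloLens application but does not specify how it is implemented. One plausible step in such a pipeline is converting a binary CT segmentation into a triangle mesh that a Unity/HoloLens app can load; the sketch below shows that step with marching cubes, using hypothetical file paths and a hypothetical function name rather than the authors' actual code.

```python
import nibabel as nib
from skimage import measure

def segmentation_to_obj(nifti_path, obj_path, level=0.5):
    """Convert a binary anatomical segmentation (e.g. an airway or vessel
    mask exported from CT) into an OBJ mesh for a mixed-reality viewer."""
    img = nib.load(nifti_path)
    volume = img.get_fdata()
    spacing = img.header.get_zooms()[:3]  # voxel size in mm

    # Marching cubes returns vertices in voxel space; spacing scales to mm.
    verts, faces, normals, _ = measure.marching_cubes(
        volume, level=level, spacing=spacing)

    with open(obj_path, "w") as f:
        for v in verts:
            f.write(f"v {v[0]:.3f} {v[1]:.3f} {v[2]:.3f}\n")
        for tri in faces:
            # OBJ face indices are 1-based.
            f.write(f"f {tri[0] + 1} {tri[1] + 1} {tri[2] + 1}\n")
```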
-
Handheld probe for quantitative micro-elastography.
Biomedical Optics Express
2019; 10 (8): 4034-4049
Abstract
Optical coherence elastography (OCE) has been proposed for a range of clinical applications. However, the majority of these studies have been performed using bulky, lab-based imaging systems. A compact, handheld imaging probe would accelerate clinical translation; however, to date, this has been inhibited by the slow scan rates of compact devices and the motion artifact induced by the user's hand. In this paper, we present a proof-of-concept, handheld quantitative micro-elastography (QME) probe capable of scanning a 6 × 6 × 1 mm volume of tissue in 3.4 seconds. This handheld probe is enabled by a novel QME acquisition protocol that incorporates a custom bidirectional scan pattern driving a microelectromechanical system (MEMS) scanner, synchronized with the sample deformation induced by an annular PZT actuator. The custom scan pattern reduces the total acquisition time and the time difference between B-scans used to generate displacement maps, minimizing the impact of motion artifact. We test the feasibility of the handheld QME probe on a tissue-mimicking silicone phantom, demonstrating comparable image quality to a bench-mounted setup. In addition, we present the first handheld QME scans performed on human breast tissue specimens. For each specimen, quantitative micro-elastograms are co-registered with, and validated by, histology, demonstrating the ability to distinguish stiff cancerous tissue from surrounding soft benign tissue.
DOI: 10.1364/BOE.10.004034
PubMedID: 31452993
PubMedCentralID: PMC6701559
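One way to picture the bidirectional acquisition described in the abstract above is as a pair of synchronized drive signals: a triangular fast-axis sweep so B-scans are collected in both scan directions, and an actuator signal that toggles the load state once per B-scan so each lateral position is imaged loaded and unloaded in quick succession. The sketch below is a rough illustration of that idea under those assumptions, not the probe's actual drive scheme; the function name, sample counts, and amplitudes are placeholders.

```python
import numpy as np

def bidirectional_scan_waveforms(n_bscans=200, samples_per_bscan=1000,
                                 scan_amplitude=1.0, actuator_amplitude=1.0):
    """Fast-axis (MEMS) and actuator (PZT) drive waveforms for a
    bidirectional compression-OCE acquisition (illustrative only)."""
    t = np.linspace(0.0, 1.0, samples_per_bscan, endpoint=False)
    fast_axis, actuator = [], []
    for b in range(n_bscans):
        # Alternate forward/reverse sweeps on the fast axis.
        sweep = scan_amplitude * (t if b % 2 == 0 else 1.0 - t)
        # Toggle the compression state once per B-scan.
        load = actuator_amplitude * (b % 2)
        fast_axis.append(sweep)
        actuator.append(np.full_like(t, load))
    return np.concatenate(fast_axis), np.concatenate(actuator)
```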
-
Handheld optical palpation of turbid tissue with motion-artifact correction
Biomedical Optics Express
2019; 10 (1): 226–41
DOI: 10.1364/BOE.10.000226
Web of Science ID: 000454173400018