Bio
Caroline Trippel is an Assistant Professor in the Departments of Computer Science and Electrical Engineering at Stanford University, working in the area of computer architecture. Prior to starting at Stanford, Trippel spent nine months as a Research Scientist at Facebook in the FAIR SysML group. Her work focuses on promoting correctness and security as first-order computer systems design metrics, akin to performance and power. A central theme of her work is leveraging formal methods techniques to design and verify hardware systems so that they provide correctness and security guarantees for the applications they are intended to support. Additionally, Trippel has recently been exploring the role of architecture in enabling privacy-preserving machine learning, the role of machine learning in hardware systems optimization (particularly in the context of neural recommendation), and opportunities for improving the reliability of datacenter and at-scale machine learning.

Trippel's research has influenced the design of the RISC-V ISA memory consistency model, both through her formal analysis of its draft specification and through her subsequent participation in the RISC-V Memory Model Task Group. Additionally, her work produced a novel methodology and tool that synthesized two new variants of the now-famous Meltdown and Spectre attacks.

Trippel's research has been recognized with IEEE Top Picks distinctions, the 2020 ACM SIGARCH/IEEE CS TCCA Outstanding Dissertation Award, and the 2020 CGS/ProQuest® Distinguished Dissertation Award in Mathematics, Physical Sciences, & Engineering. She was also awarded an NVIDIA Graduate Fellowship (2017-2018) and selected to attend the 2018 MIT Rising Stars in EECS Workshop. Trippel completed her PhD in Computer Science at Princeton University and her BS in Computer Engineering at Purdue University.
