Bio


I am an MD candidate with a background in physics, math, computer science, and engineering, which I apply to the quantitative study of complex biological systems.

My research interests are in the development of novel mathematical and computational methods for interpreting and understanding biomedical data. Few things make my ears perk up as much as variants of the phrase, "we have all this data, but aren't quite sure what to do with it."

Previously I've worked in such areas as machine learning for neuroimaging (with Olivier Gevaert at Stanford), theoretical biophysics (with Shamit Kachru at Stanford), algorithms for computational neuroscience (with Ralf Wessel and Benjamin Moseley at WUStL), inverse optimization for quantum computing (with Alejandro Rodriguez at Princeton), and computational molecular dynamics for drug design (with Matt Jacobson at UCSF).

From a clinical perspective I am particularly interested in neurological disorders, an interest reflected in a significant fraction of my research. I am also interested in medical ethics and in mathematics education within medicine and the biosciences.

Honors & Awards


  • Graduate Scholar in Residence, El Centro Chicano y Latino, Stanford University (2020)
  • ADVANCE Fellow, Stanford University (2018)
  • Tau Beta Pi Membership, Tau Beta Pi Honor Society (2017)
  • Boeing Scholarship, The Boeing Company (2017)
  • Harold P. Brown Engineering Fellowship, Washington University in St. Louis McKelvey School of Engineering (2016)
  • Goldwater Scholarship Honorable Mention, Barry Goldwater Scholarship Foundation (2016)

Education & Certifications


  • M.S., Stanford University, Biophysics (2021)
  • B.S., Washington University in St. Louis, Systems Engineering (2018)
  • B.S., Washington University in St. Louis, Computer Science (2018)
  • B.S., WUStL/Whitworth dual-degree program, Biophysics (2018)

Current Research and Scholarly Interests


My research interests are in the development of novel quantitative approaches for tackling medical problems, including algorithms, machine learning techniques, methods of interpreting complex data, and mathematical frameworks for improving our understanding of biological processes relevant to disease. These approaches are linked by a common mathematical toolkit, and I aim to develop both foundational and applied solutions to quantitative problems in medicine. I am especially interested in problems relevant to neurological disease.

As I wrap up some projects in machine learning for neuroimaging, my current work is at the intersections of complex systems science, neuroscience, and transcriptomics. Publications from the neuroimaging work should appear later in 2022.

In the past I've mentored a number of undergraduates in areas such as computational neuroscience, the application of theoretical physics methods to ecology and evolution, and data analysis in other fields. I have an ever-growing list of project ideas that I find very interesting but cannot get to myself, and I am happy to discuss potential advising on them. Good candidates would be any undergraduates here at Stanford who have an interest in biomedical research and a background in EE, CS, math, physics, or similarly quantitative subjects.

Beyond my research, I am writing my first textbook, intended to make applied mathematics intuitive and approachable for those in medicine and bioscience whose math background is limited. McGraw-Hill will publish it, likely in mid to late 2023. I will additionally teach a two-course sequence on this material for a broad audience in the Fall and Winter quarters of the 2022-23 academic year.

All Publications


  • Pre-Synaptic Pool Modification (PSPM): A supervised learning procedure for recurrent spiking neural networks. PLoS ONE Bagley, B. A., Bordelon, B., Moseley, B., Wessel, R. 2020; 15 (2): e0229083

    Abstract

    Learning synaptic weights of spiking neural network (SNN) models that can reproduce target spike trains from provided neural firing data is a central problem in computational neuroscience and spike-based computing. The discovery of the optimal weight values can be posed as a supervised learning task wherein the weights of the model network are chosen to maximize the similarity between the target spike trains and the model outputs. It is still largely unknown whether optimizing spike train similarity of highly recurrent SNNs produces weight matrices similar to those of the ground truth model. To this end, we propose flexible heuristic supervised learning rules, termed Pre-Synaptic Pool Modification (PSPM), that rely on stochastic weight updates in order to produce spikes within a short window of the desired times and eliminate spikes outside of this window. PSPM improves spike train similarity for all-to-all SNNs and makes no assumption about the post-synaptic potential of the neurons or the structure of the network since no gradients are required. We test whether optimizing for spike train similarity entails the discovery of accurate weights and explore the relative contributions of local and homeostatic weight updates. Although PSPM improves similarity between spike trains, the learned weights often differ from the weights of the ground truth model, implying that connectome inference from spike data may require additional constraints on connectivity statistics. We also find that spike train similarity is sensitive to local updates, but other measures of network activity, such as avalanche distributions, can be learned through synaptic homeostasis.

    View details for DOI 10.1371/journal.pone.0229083

    View details for PubMedID 32092107

  • A Heuristic Approach to Spiking Neural Networks Washington University Office of Undergraduate Research Digest Bagley, B. A., Bordelon, B. 2018; 13 (11)
  • Inverse Design of Optimal Nonlinear Photonic Structures MIRTHE Research Conference Bagley, B. A., Lin, Z., Rodriguez, A. 2016
  • A Theoretical Study of Binding Dynamics in TLX NR2E1 Ligand-Binding Domain UCSF SRTP Research Symposium Bagley, B. A., Jacobson, M. 2015
  • Increasing the Pepsin Resistance of a Prolyl Endopeptidase: Lessons Learned in the Prediction of Mutation Sites Spokane Intercollegiate Research Conference Klick, M., Arnold, M., Dodge, A., Miles, J., Cooper, S., Bagley, B. A., Jones, K. 2014