Bio


Grant Rotskoff studies the nonequilibrium dynamics of living matter, with a particular focus on self-organization from the molecular to the cellular scale. His work involves developing theoretical and computational tools that can probe and predict the properties of physical systems driven away from equilibrium. Recently, he has focused on characterizing and designing physically accurate machine learning techniques for biophysical modeling. Prior to his current position, Grant was a James S. McDonnell Fellow at the Courant Institute of Mathematical Sciences at New York University. He completed his Ph.D. at the University of California, Berkeley in the Biophysics graduate group, supported by an NSF Graduate Research Fellowship. His thesis, advised by Phillip Geissler and Gavin Crooks, developed theoretical tools for understanding nonequilibrium control of small, fluctuating systems, such as those encountered in molecular biophysics. He also worked on coarse-grained models of the hydrophobic effect and self-assembly. Grant received an S.B. in Mathematics from the University of Chicago, where he became interested in biophysics as an undergraduate while working on free energy methods for large-scale molecular dynamics simulations.

Research Summary

My research focuses on theoretical and computational approaches to "mesoscale" biophysics. Many of the cellular phenomena that we consider the hallmarks of living systems occur at the scale of hundreds or thousands of proteins. Processes like the self-assembly of organelle-sized structures, the dynamics of cell division, and the transduction of signals from the environment to the machinery of the cell are not macroscopic phenomena; they are the result of fluctuating, nonequilibrium dynamics. Experimentally probing mesoscale systems remains extremely difficult, though the field continues to benefit from advances in cryo-electron microscopy and super-resolution imaging, among many other techniques. Predictive and explanatory models that resolve the essential physics at these intermediate scales have the power to both sharpen and enrich the understanding emerging from these experimental advances.

Major parts of my research include:

1. Dynamics of mesoscale biophysical assembly and response.— Biophysical processes involve chemical gradients and time-dependent external signals. These inherently nonequilibrium stimuli drive supramolecular organization within the cell. We develop models of active assembly processes and protein-membrane interactions as a foundation for the broad goal of characterizing the properties of nonequilibrium biomaterials.

2. Machine learning and dimensionality reduction for physical models.— Machine learning techniques are rapidly becoming a central statistical tool in all domains of scientific research. We apply machine learning techniques to sampling problems that arise in computational chemistry and develop approaches for systematically coarse-graining physical models. Recently, we have also been exploring reinforcement learning in the context of nonequilibrium control problems.

3. Methods for nonequilibrium simulation, optimization, and control.— We lack well-established theoretical frameworks for describing nonequilibrium states, even in seemingly simple situations in which there are chemical or thermal gradients. Additionally, there are limited tools for predicting the response of nonequilibrium systems to external perturbations, even when the perturbations are small. Both of these problems pose key technical challenges for a theory of active biomaterials. We work on optimal control, nonequilibrium statistical mechanics, and simulation methodology, with a particular interest in developing techniques for importance sampling configurations from nonequilibrium ensembles.

Honors & Awards


  • Early Career Research Program Award, Department of Energy (2022-2027)
  • Research Scholar Award, Google (2022)
  • Terman Faculty Fellow, Stanford University (2020-2022)

All Publications


  • Unified, Geometric Framework for Nonequilibrium Protocol Optimization. Physical Review Letters. Chennakesavalu, S., Rotskoff, G. M. 2023; 130 (10): 107101

    Abstract

    Controlling thermodynamic cycles to minimize the dissipated heat is a long-standing goal in thermodynamics, and more recently, a central challenge in stochastic thermodynamics for nanoscale systems. Here, we introduce a theoretical and computational framework for optimizing nonequilibrium control protocols that can transform a system between two distributions in a minimally dissipative fashion. These protocols optimally transport a system along paths through the space of probability distributions that minimize the dissipative cost of a transformation. Furthermore, we show that the thermodynamic metric, determined via a linear response approach, can be directly derived from the same objective function that is optimized in the optimal transport problem, thus providing a unified perspective on thermodynamic geometries. We investigate this unified geometric framework in two model systems and observe that our procedure for optimizing control protocols is robust beyond linear response.
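
    For orientation, in the near-equilibrium limit the framework reduces to the familiar thermodynamic-length picture: the excess (dissipated) work of a protocol \lambda(t) of duration \tau is governed by a metric g(\lambda) on control space. Schematically (standard linear-response notation, not equations quoted from the paper):

        W_{\mathrm{ex}}[\lambda(t)] \approx \int_0^{\tau} \dot{\lambda}^{\top} g(\lambda)\, \dot{\lambda}\, \mathrm{d}t
        \;\geq\; \frac{\mathcal{L}^2}{\tau},
        \qquad
        \mathcal{L} = \int_0^{\tau} \sqrt{\dot{\lambda}^{\top} g(\lambda)\, \dot{\lambda}}\, \mathrm{d}t,

    with equality for a constant-speed traversal, so minimally dissipative protocols follow geodesics of g at constant speed. The result highlighted in the abstract is that this metric, and the optimal paths, emerge from the same objective as the underlying optimal transport problem rather than being assumed from the outset.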

    DOI: 10.1103/PhysRevLett.130.107101

    PubMed ID: 36962015

  • Trainability and Accuracy of Artificial Neural Networks: An Interacting Particle System Approach. Communications on Pure and Applied Mathematics. Rotskoff, G. M., Vanden-Eijnden, E. 2022; 75 (9): 1889-1935

    DOI: 10.1002/cpa.22074

    Web of Science ID: 000828449900002

  • Physics-informed graph neural networks enhance scalability of variational nonequilibrium optimal control. The Journal of Chemical Physics. Yan, J., Rotskoff, G. M. 2022; 157 (7): 074101

    Abstract

    When a physical system is driven away from equilibrium, the statistical distribution of its dynamical trajectories informs many of its physical properties. Characterizing the nature of the distribution of dynamical observables, such as a current or entropy production rate, has become a central problem in nonequilibrium statistical mechanics. Asymptotically, for a broad class of observables, the distribution of a given observable satisfies a large deviation principle when the dynamics is Markovian, meaning that fluctuations can be characterized in the long-time limit by computing a scaled cumulant generating function. Calculating this function is not tractable analytically (nor often numerically) for complex, interacting systems, so the development of robust numerical techniques to carry out this computation is needed to probe the properties of nonequilibrium materials. Here, we describe an algorithm that recasts this task as an optimal control problem that can be solved variationally. We solve for optimal control forces using neural network ansätze that are tailored to the physical systems to which the forces are applied. We demonstrate that this approach leads to transferable and accurate solutions in two systems featuring large numbers of interacting particles.
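
    To fix notation for the quantities named above, the scaled cumulant generating function (SCGF) of a time-averaged observable A_T, and its standard variational, control-based representation, can be written schematically as (generic large-deviation notation; this is the textbook structure, not the paper's specific ansatz):

        \psi(s) = \lim_{T \to \infty} \frac{1}{T} \ln \mathbb{E}\left[ e^{\, s T A_T} \right],
        \qquad
        \psi(s) = \sup_{u} \left\{ s\, \langle a \rangle_{u} - \lim_{T \to \infty} \frac{1}{T}\, D_{\mathrm{KL}}\!\left( P_{u}^{[0,T]} \,\|\, P^{[0,T]} \right) \right\},

    where the supremum runs over controlled dynamics with an added drift (control force) u, \langle a \rangle_u is the stationary average of the observable under the controlled process, and the second term is the long-time rate of relative entropy between the controlled and original path measures. Parametrizing u with a neural network and maximizing the bracketed expression is the variational optimal control strategy the abstract describes.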

    DOI: 10.1063/5.0095593

    Web of Science ID: 000840971900002

    PubMed ID: 35987599

  • Adaptive Monte Carlo augmented with normalizing flows. Proceedings of the National Academy of Sciences of the United States of America. Gabrié, M., Rotskoff, G. M., Vanden-Eijnden, E. 2022; 119 (10): e2109420119

    Abstract

    Significance: Monte Carlo methods, tools for sampling data from probability distributions, are widely used in the physical sciences, applied mathematics, and Bayesian statistics. Nevertheless, there are many situations in which it is computationally prohibitive to use Monte Carlo due to slow "mixing" between modes of a distribution unless hand-tuned algorithms are used to accelerate the scheme. Machine learning techniques based on generative models offer a compelling alternative to the challenge of designing efficient schemes for a specific system. Here, we formalize Monte Carlo augmented with normalizing flows and show that, with limited prior data and a physically inspired algorithm, we can substantially accelerate sampling with generative models.
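
    As a minimal sketch of the kind of move such a scheme relies on, the Python snippet below runs Metropolis-Hastings with an independence proposal drawn from a generative model. The proposal object is a stand-in (a fixed Gaussian with analytic sample() and log_prob()); in a flow-augmented sampler it would be a trained normalizing flow exposing the same two operations. The target, names, and parameters are illustrative only, not the paper's implementation.

        import numpy as np

        class GaussianProposal:
            # Stand-in for a trained normalizing flow: any object with
            # sample() and log_prob() can serve as the independence proposal.
            def __init__(self, mean, std):
                self.mean, self.std = mean, std

            def sample(self, rng):
                return rng.normal(self.mean, self.std)

            def log_prob(self, x):
                return -0.5 * ((x - self.mean) / self.std) ** 2 - np.log(self.std * np.sqrt(2 * np.pi))

        def log_target(x):
            # Toy bimodal target: a mixture of two Gaussians, slow-mixing for local MCMC.
            return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

        def flow_mcmc(n_steps, proposal, rng):
            x = proposal.sample(rng)
            samples = []
            for _ in range(n_steps):
                y = proposal.sample(rng)  # proposal is independent of the current state
                # Metropolis-Hastings acceptance ratio for an independence proposal
                log_alpha = (log_target(y) - log_target(x)) + (proposal.log_prob(x) - proposal.log_prob(y))
                if np.log(rng.uniform()) < log_alpha:
                    x = y
                samples.append(x)
            return np.array(samples)

        rng = np.random.default_rng(0)
        samples = flow_mcmc(10_000, GaussianProposal(0.0, 4.0), rng)
        print(samples.mean(), samples.std())

    The nonlocal proposals let the chain hop between modes that a local sampler would cross only rarely; the acceptance step keeps the target distribution exact even when the generative model is imperfect.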

    DOI: 10.1073/pnas.2109420119

    PubMed ID: 35235453

  • Learning nonequilibrium control forces to characterize dynamical phase transitions. Physical Review E. Yan, J., Touchette, H., Rotskoff, G. M. 2022; 105 (2): 024115

    Abstract

    Sampling the collective, dynamical fluctuations that lead to nonequilibrium pattern formation requires probing rare regions of trajectory space. Recent approaches to this problem, based on importance sampling, cloning, and spectral approximations, have yielded significant insight into nonequilibrium systems but tend to scale poorly with the size of the system, especially near dynamical phase transitions. Here we propose a machine learning algorithm that samples rare trajectories and estimates the associated large deviation functions using a many-body control force by leveraging the flexible function representation provided by deep neural networks, importance sampling in trajectory space, and stochastic optimal control theory. We show that this approach scales to hundreds of interacting particles and remains robust at dynamical phase transitions.

    DOI: 10.1103/PhysRevE.105.024115

    PubMed ID: 35291069

  • Remembering the Work of Phillip L. Geissler: A Coda to His Scientific Trajectory. Annual Review of Physical Chemistry. Bowman, G. R., Cox, S. J., Dellago, C., DuBay, K. H., Eaves, J. D., Fletcher, D. A., Frechette, L. B., Grünwald, M., Klymko, K., Ku, J., Omar, A., Rabani, E., Reichman, D. R., Rogers, J. R., Rosnik, A. M., Rotskoff, G. M., Schneider, A. R., Schwierz, N., Sivak, D. A., Vaikuntanathan, S., Whitelam, S., Widmer-Cooper, A. 2022

    Abstract

    Phillip L. Geissler made important contributions to the statistical mechanics of biological polymers, heterogeneous materials, and chemical dynamics in aqueous environments. He devised analytical and computational methods that revealed the underlying organization of complex systems at the frontiers of biology, chemistry, and materials science. In this retrospective we celebrate his work at these frontiers.

    DOI: 10.1146/annurev-physchem-101422-030127

    PubMed ID: 36719975

  • Probing the theoretical and computational limits of dissipative design. The Journal of Chemical Physics. Chennakesavalu, S., Rotskoff, G. M. 2021; 155 (19): 194114

    Abstract

    Self-assembly, the process by which interacting components form well-defined and often intricate structures, is typically thought of as a spontaneous process arising from equilibrium dynamics. When a system is driven by external nonequilibrium forces, states statistically inaccessible to the equilibrium dynamics can arise, a process sometimes termed direct self-assembly. However, if we fix a given target state and a set of external control variables, it is not well understood (i) how to design a protocol to drive the system toward the desired state or (ii) what the cost is of persistently perturbing the stationary distribution. In this work, we derive a bound that relates the proximity to the chosen target with the dissipation associated with the external drive, showing that high-dimensional external control can guide systems toward a target distribution, but with an inevitable cost. Remarkably, the bound holds arbitrarily far from equilibrium. Second, we investigate the performance of deep reinforcement learning algorithms and provide evidence for the realizability of complex protocols that stabilize otherwise inaccessible states of matter.
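
    As a toy illustration of the reinforcement learning component mentioned above, the sketch below runs a generic policy-gradient (REINFORCE) update on a one-step control problem: choose a drive u so that a noisy observable lands near a target while penalizing the magnitude of the drive, a crude stand-in for the dissipative cost. The task, reward, and hyperparameters are hypothetical; this is a textbook policy-gradient loop, not the deep reinforcement learning setup used in the paper.

        import numpy as np

        # Generic REINFORCE sketch: learn the mean of a Gaussian policy over a scalar
        # drive u so that the noisy response x = u + noise concentrates near `target`,
        # with a penalty on |u| standing in for the cost of maintaining the drive.
        rng = np.random.default_rng(1)
        target, drive_cost = 2.0, 0.1
        theta, sigma = 0.0, 0.5      # policy: u ~ Normal(theta, sigma); theta is learned
        lr, baseline = 0.01, 0.0

        for step in range(5000):
            u = rng.normal(theta, sigma)        # sample a drive from the policy
            x = u + rng.normal(0.0, 1.0)        # noisy response of the controlled system
            reward = -(x - target) ** 2 - drive_cost * abs(u)
            advantage = reward - baseline       # running baseline reduces gradient variance
            baseline += 0.05 * (reward - baseline)
            # REINFORCE: d/dtheta of log Normal(u; theta, sigma) is (u - theta) / sigma**2
            theta += lr * advantage * (u - theta) / sigma ** 2

        print(f"learned drive ~= {theta:.2f}, target observable = {target}")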

    DOI: 10.1063/5.0067695

    PubMed ID: 34800948

  • A Dynamical Central Limit Theorem for Shallow Neural Networks. Chen, Z., Rotskoff, G. M., Bruna, J., Vanden-Eijnden, E. In: Larochelle, H., Ranzato, M., Hadsell, R., Balcan, M. F., Lin, H. (Eds.), Advances in Neural Information Processing Systems (NeurIPS). 2020