Grant M. Rotskoff
Assistant Professor of Chemistry
Bio
Grant Rotskoff studies the nonequilibrium dynamics of living matter with a particular focus on self-organization from the molecular to the cellular scale. His work involves developing theoretical and computational tools that can probe and predict the properties of physical systems driven away from equilibrium. Recently, he has focused on characterizing and designing physically accurate machine learning techniques for biophysical modeling. Prior to his current position, Grant was a James S. McDonnell Fellow working at the Courant Institute of Mathematical Sciences at New York University. He completed his Ph.D. at the University of California, Berkeley in the Biophysics graduate group supported by an NSF Graduate Research Fellowship. His thesis, advised by Phillip Geissler and Gavin Crooks, developed theoretical tools for understanding nonequilibrium control of small, fluctuating systems, such as those encountered in molecular biophysics. He also worked on coarse-grained models of the hydrophobic effect and self-assembly. Grant received an S.B. in Mathematics from the University of Chicago, where he became interested in biophysics as an undergraduate while working on free energy methods for large-scale molecular dynamics simulations.
Research Summary
My research focuses on theoretical and computational approaches to "mesoscale" biophysics. Many of the cellular phenomena that we consider the hallmarks of living systems occur at the scale of hundreds or thousands of proteins. Processes like the self-assembly of organelle-sized structures, the dynamics of cell division, and the transduction of signals from the environment to the machinery of the cell are not macroscopic phenomena—they are the result of a fluctuating, nonequilibrium dynamics. Experimentally probing mesoscale systems remains extremely difficult, though it is continuing to benefit from advances in cryo-electron microscopy and super-resolution imaging, among many other techniques. Predictive and explanatory models that resolve the essential physics at these intermediate scales have the power to both aid and enrich the understanding we are presently deriving from these experimental developments.
Major parts of my research include:
1. Dynamics of mesoscale biophysical assembly and response.— Biophysical processes involve chemical gradients and time-dependent external signals. These inherently nonequilibrium stimuli drive supramolecular organization within the cell. We develop models of active assembly processes and protein-membrane interactions as a foundation for the broad goal of characterizing the properties of nonequilibrium biomaterials.
2. Machine learning and dimensionality reduction for physical models.— Machine learning techniques are rapidly becoming a central statistical tool in all domains of scientific research. We apply machine learning techniques to sampling problems that arise in computational chemistry and develop approaches for systematically coarse-graining physical models. Recently, we have also been exploring reinforcement learning in the context of nonequilibrium control problems.
3. Methods for nonequilibrium simulation, optimization, and control.— We lack well-established theoretical frameworks for describing nonequilibrium states, even seemingly simple situations in which there are chemical or thermal gradients. Additionally, there are limited tools for predicting the response of nonequilibrium systems to external perturbations, even when the perturbations are small. Both of these problems pose key technical challenges for a theory of active biomaterials. We work on optimal control, nonequilibrium statistical mechanics, and simulation methodology, with a particular interest in developing techniques for importance sampling configurations from nonequilibrium ensembles.
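As background for the importance-sampling theme in item 3 (a general identity, not a summary of any specific result from the group), an average over a distribution pi that is hard to sample directly can be rewritten as a weighted average over a tractable reference distribution rho:

\langle A \rangle_{\pi} = \int A(x)\, \pi(x)\, \mathrm{d}x = \left\langle A(x)\, \frac{\pi(x)}{\rho(x)} \right\rangle_{\rho}

The practical difficulty, especially for nonequilibrium ensembles where pi is not known in closed form, is to construct a reference rho (for example, a controlled or learned dynamics) whose weights pi/rho have low variance.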
Academic Appointments
Honors & Awards
- Early Career Research Program Award, Department of Energy (2022-2027)
- Research Scholar Award, Google (2022)
- Terman Faculty Fellow, Stanford University (2020-2022)
2024-25 Courses
- Advanced Physical Chemistry, CHEM 273 (Win)
- Machine Learning for Chemical and Dynamical Data, CHEM 263 (Aut)
- Physical Chemistry III, CHEM 175 (Win)
Independent Studies (8)
- Advanced Undergraduate Research, CHEM 190 (Aut, Win, Spr, Sum)
- Directed Instruction/Reading, CHEM 90 (Aut, Win, Spr, Sum)
- Graduate Research, BIOPHYS 300 (Aut, Win, Spr, Sum)
- Ph.D. Research, CME 400 (Aut, Win, Spr, Sum)
- Research, PHYSICS 490 (Aut, Win, Spr, Sum)
- Research and Special Advanced Work, CHEM 200 (Aut, Win, Spr, Sum)
- Research in Chemistry, CHEM 301 (Aut, Win, Spr, Sum)
- Senior Honors Thesis, MATH 197 (Spr)
Prior Year Courses
2023-24 Courses
- Advanced Physical Chemistry, CHEM 273 (Win)
- Machine Learning for Chemical and Dynamical Data, CHEM 263 (Aut)
- Physical Chemistry III, CHEM 175 (Win)
2022-23 Courses
- Advanced Physical Chemistry, CHEM 273 (Win)
- Exploring Chemical Research at Stanford, CHEM 91 (Win)
- Physical Chemistry III, CHEM 175 (Win)
2021-22 Courses
- Advanced Physical Chemistry, CHEM 273 (Win)
- Machine Learning for Chemical and Dynamical Data, CHEM 263 (Aut)
- Physical Chemistry III, CHEM 175 (Win)
Stanford Advisees
- Doctoral Dissertation Reader (AC): Xiao Cui, Ethan Curtis, Joseph Kelly
- Postdoctoral Faculty Sponsor: Clay Batton, Sreekanth Kizhakkumpurath Manikandan, Jeremie Klinger
- Doctoral Dissertation Advisor (AC): Shriram Chennakesavalu, Steven Dunne, Sebastian Ibarraran, Sherry Li, Andy Mitchell, Abigail Park, Emmit Pert
- Doctoral Dissertation Co-Advisor (AC): Yinuo Ren
All Publications
- Committor Guided Estimates of Molecular Transition Rates.
Journal of Chemical Theory and Computation
2024
Abstract
The probability that a configuration of a physical system reacts, or transitions from one metastable state to another, is quantified by the committor function. This function contains richly detailed mechanistic information about transition pathways, but a full parametrization of the committor requires the construction of a high-dimensional function, a generically challenging task. Recent efforts to leverage neural networks as a means to solve high-dimensional partial differential equations, often called "physics-informed" machine learning, have brought the committor into computational reach. Here, we build on the semigroup approach to learning the committor and assess its utility for predicting dynamical quantities such as transition rates. We show that a careful reframing of the objective function and improved adaptive sampling strategies provide highly accurate representations of the committor. Furthermore, by directly applying the Hill relation, we show that these committors provide accurate transition rates for molecular systems.
View details for DOI 10.1021/acs.jctc.4c00997
View details for PubMedID 39420582
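For readers unfamiliar with the committor, the following is a compact statement of the objects named in the abstract above (standard background, not the paper's exact semigroup formulation). For a diffusion with generator \mathcal{L} and metastable states A and B, the committor q(x) solves the boundary-value problem

\mathcal{L} q(x) = 0 \ \text{for } x \notin A \cup B, \qquad q|_{A} = 0, \qquad q|_{B} = 1,

and for overdamped Langevin dynamics at inverse temperature beta it also minimizes the Dirichlet form \int e^{-\beta V(x)} |\nabla q(x)|^2 \, \mathrm{d}x subject to the same boundary conditions, which is what makes neural-network parametrizations of q trainable by stochastic optimization. Transition path theory then expresses reaction rates in terms of q; the paper above instead extracts rates by applying the Hill relation.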
- Power dissipation and entropy production rate of high-dimensional optical matter systems
Physical Review E
2024; 110 (4)
View details for DOI 10.1103/PhysRevE.110.044109
View details for Web of Science ID 001334401400006
- Sampling thermodynamic ensembles of molecular systems with generative neural networks: Will integrating physics-based models close the generalization gap?
Current Opinion in Solid State & Materials Science
2024; 30
View details for DOI 10.1016/j.cossms.2024.101158
View details for Web of Science ID 001224273200001
- Nanocrystal Assemblies: Current Advances and Open Problems.
ACS Nano
2024
Abstract
We explore the potential of nanocrystals (a term used equivalently to nanoparticles) as building blocks for nanomaterials, and the current advances and open challenges for fundamental science developments and applications. Nanocrystal assemblies are inherently multiscale, and the generation of revolutionary material properties requires a precise understanding of the relationship between structure and function, the former being determined by classical effects and the latter often by quantum effects. With an emphasis on theory and computation, we discuss challenges that hamper current assembly strategies and to what extent nanocrystal assemblies represent thermodynamic equilibrium or kinetically trapped metastable states. We also examine dynamic effects and optimization of assembly protocols. Finally, we discuss promising material functions and examples of their realization with nanocrystal assemblies.
View details for DOI 10.1021/acsnano.3c10201
View details for PubMedID 38814908
- Microscopic origin of tunable assembly forces in chiral active environments.
Soft Matter
2024
Abstract
Across a variety of spatial scales, from nanoscale biological systems to micron-scale colloidal systems, equilibrium self-assembly is entirely dictated by, and therefore limited by, the thermodynamic properties of the constituent materials. In contrast, nonequilibrium materials, such as self-propelled active matter, expand the possibilities for driving assemblies that are inaccessible under equilibrium conditions. Recently, a number of works have suggested that active matter drives or accelerates self-organization, but the emergent interactions that arise between solutes immersed in actively driven environments are complex and poorly understood. Here, we analyze and resolve two crucial questions concerning actively driven self-assembly: (i) how, mechanistically, do active environments drive self-assembly of passive solutes? (ii) Under which conditions is this assembly robust? We employ the framework of odd hydrodynamics to theoretically explain numerical and experimental observations that chiral active matter, i.e., particles driven with a directional torque, produces robust and long-ranged assembly forces. Together, these developments constitute an important step towards a comprehensive theoretical framework for controlling self-assembly in nonequilibrium environments.
View details for DOI 10.1039/d4sm00247d
View details for PubMedID 38726733
- Data-Efficient Generation of Protein Conformational Ensembles with Backbone-to-Side-Chain Transformers.
The Journal of Physical Chemistry B
2024
Abstract
Excitement at the prospect of using data-driven generative models to sample configurational ensembles of biomolecular systems stems from the extraordinary success of these models on a diverse set of high-dimensional sampling tasks. Unlike image generation or even the closely related problem of protein structure prediction, there are currently no data sources with sufficient breadth to parametrize generative models for conformational ensembles. To enable discovery, a fundamentally different approach to building generative models is required: models should be able to propose rare, albeit physical, conformations that may not arise in even the largest data sets. Here we introduce a modular strategy to generate conformations based on "backmapping" from a fixed protein backbone that (1) maintains conformational diversity of the side chains and (2) couples the side-chain fluctuations using global information about the protein conformation. Our model combines simple statistical models of side-chain conformations based on rotamer libraries with the now ubiquitous transformer architecture to sample with atomistic accuracy. Together, these ingredients provide a strategy for rapid data acquisition and hence a crucial ingredient for scalable physical simulation with generative neural networks.
View details for DOI 10.1021/acs.jpcb.3c08195
View details for PubMedID 38394363
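The sketch below illustrates the general flavor of a backbone-to-side-chain ("backmapping") sampler as described in the abstract above: a transformer reads per-residue backbone features and emits a distribution over discrete rotamer-library states for every residue, so that side-chain fluctuations are coupled through global context. All names and dimensions here (BackmapSketch, backbone_dim, n_rotamers, and so on) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class BackmapSketch(nn.Module):
    def __init__(self, backbone_dim=9, d_model=64, n_heads=4, n_layers=2, n_rotamers=81):
        super().__init__()
        # per-residue backbone features (e.g., dihedrals and local frame descriptors) -> tokens
        self.embed = nn.Linear(backbone_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=128, batch_first=True)
        # the encoder couples side-chain predictions through global backbone context
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # logits over discrete rotamer-library states for each residue
        self.head = nn.Linear(d_model, n_rotamers)

    def forward(self, backbone_feats):
        # backbone_feats: (batch, n_residues, backbone_dim)
        h = self.encoder(self.embed(backbone_feats))
        return self.head(h)  # (batch, n_residues, n_rotamers)

    @torch.no_grad()
    def sample(self, backbone_feats):
        logits = self.forward(backbone_feats)
        return torch.distributions.Categorical(logits=logits).sample()  # rotamer index per residue

model = BackmapSketch()
fake_backbone = torch.randn(2, 10, 9)        # two toy 10-residue backbones
rotamer_states = model.sample(fake_backbone)  # (2, 10) sampled rotamer indices

In a realistic pipeline the sampled rotamer indices would be converted back to atomic coordinates using the rotamer library, which is where the atomistic accuracy described in the abstract comes from.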
- Adaptive nonequilibrium design of actin-based metamaterials: Fundamental and practical limits of control.
Proceedings of the National Academy of Sciences of the United States of America
2024; 121 (8): e2310238121
Abstract
The adaptive and surprising emergent properties of biological materials self-assembled in far-from-equilibrium environments serve as an inspiration for efforts to design nanomaterials. In particular, controlling the conditions of self-assembly can modulate material properties, but there is no systematic understanding of either how to parameterize external control or how controllable a given material can be. Here, we demonstrate that branched actin networks can be encoded with metamaterial properties by dynamically controlling the applied force under which they grow and that the protocols can be selected using multi-task reinforcement learning. These actin networks have tunable responses over a large dynamic range depending on the chosen external protocol, providing a pathway to encoding "memory" within these structures. Interestingly, we obtain a bound that relates the dissipation rate and the rate of "encoding," which gives insight into the constraints on control, both physical and information theoretic. Taken together, these results emphasize the utility and necessity of nonequilibrium control for designing self-assembled nanostructures.
View details for DOI 10.1073/pnas.2310238121
View details for PubMedID 38359294
- Computing equilibrium free energies through a nonequilibrium quench.
The Journal of Chemical Physics
2024; 160 (3)
Abstract
Many methods to accelerate sampling of molecular configurations are based on the idea that temperature can be used to accelerate rare transitions. These methods typically compute equilibrium properties at a target temperature using reweighting or through Monte Carlo exchanges between replicas at higher temperatures. A recent paper [G. M. Rotskoff and E. Vanden-Eijnden, Phys. Rev. Lett. 122, 150602 (2019)] demonstrated that accurate equilibrium densities of states can also be computed through a nonequilibrium "quench" process, where sampling is performed at a higher temperature to encourage rapid mixing and then quenched to lower energy states with dissipative dynamics. Here, we provide an implementation of the quench dynamics in LAMMPS and evaluate a new formulation of nonequilibrium estimators for the computation of partition functions or free energy surfaces (FESs) of molecular systems. We show that the method is exact for a minimal model of N independent harmonic springs and use these analytical results to develop heuristics for the amount of quenching required to obtain accurate sampling. We then test the quench approach on alanine dipeptide, where we show that it gives an FES that is accurate near the most stable configurations but disagrees with a reference umbrella sampling calculation in high FE regions. We then show that combining quenching with umbrella sampling allows the efficient calculation of the free energy in all regions. Moreover, by using this combined scheme, we obtain the FES across a range of temperatures at no additional cost, making it much more efficient than standard umbrella sampling if this information is required. Finally, we discuss how this approach can be extended to solute tempering and demonstrate that it is highly accurate for the case of solvated alanine dipeptide without any additional modifications.
View details for DOI 10.1063/5.0176700
View details for PubMedID 38240301
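The nonequilibrium estimators discussed above belong to the family of nonequilibrium work relations. The best-known member of that family, quoted here only as orienting background (the quench estimator in the paper differs in its details), is the Jarzynski equality,

\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},

where W is the work performed along a nonequilibrium protocol that starts from equilibrium, Delta F is the equilibrium free energy difference between the protocol's endpoints, and the average runs over repetitions of the protocol. Relations of this type are what allow trajectories generated far from equilibrium, such as a rapid quench, to be reweighted into equilibrium free energy estimates.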
- Statistical Spatially Inhomogeneous Diffusion Inference
Proceedings of the AAAI Conference on Artificial Intelligence. 2024: 14820-14828
View details for Web of Science ID 001239979300085
- Ensuring thermodynamic consistency with invertible coarse-graining.
The Journal of Chemical Physics
2023; 158 (12): 124126
Abstract
Coarse-grained models are a core computational tool in theoretical chemistry and biophysics. A judicious choice of a coarse-grained model can yield physical insights by isolating the essential degrees of freedom that dictate the thermodynamic properties of a complex, condensed-phase system. The reduced complexity of the model typically leads to lower computational costs and more efficient sampling compared with atomistic models. Designing "good" coarse-grained models is an art. Generally, the mapping from fine-grained configurations to coarse-grained configurations itself is not optimized in any way; instead, the energy function associated with the mapped configurations is. In this work, we explore the consequences of optimizing the coarse-grained representation alongside its potential energy function. We use a graph machine learning framework to embed atomic configurations into a low-dimensional space to produce efficient representations of the original molecular system. Because the representation we obtain is no longer directly interpretable as a real-space representation of the atomic coordinates, we also introduce an inversion process and an associated thermodynamic consistency relation that allows us to rigorously sample fine-grained configurations conditioned on the coarse-grained sampling. We show that this technique is robust, recovering the first two moments of the distribution of several observables in proteins such as chignolin and alanine dipeptide.
View details for DOI 10.1063/5.0141888
View details for PubMedID 37003724
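The "thermodynamic consistency relation" referenced in the abstract has a standard form worth recalling (given here as general background; the paper extends it to a learned, invertible mapping). If M maps fine-grained configurations x to coarse-grained configurations R = M(x), the coarse-grained potential that reproduces the correct marginal distribution is the potential of mean force,

e^{-\beta U_{\mathrm{CG}}(R)} \;\propto\; \int \delta\big(M(x) - R\big)\, e^{-\beta U(x)}\, \mathrm{d}x ,

so that sampling R from U_CG and then sampling fine-grained configurations conditioned on M(x) = R recovers the original Boltzmann distribution.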
- Unified, Geometric Framework for Nonequilibrium Protocol Optimization.
Physical Review Letters
2023; 130 (10): 107101
Abstract
Controlling thermodynamic cycles to minimize the dissipated heat is a long-standing goal in thermodynamics, and more recently, a central challenge in stochastic thermodynamics for nanoscale systems. Here, we introduce a theoretical and computational framework for optimizing nonequilibrium control protocols that can transform a system between two distributions in a minimally dissipative fashion. These protocols optimally transport a system along paths through the space of probability distributions that minimize the dissipative cost of a transformation. Furthermore, we show that the thermodynamic metric, determined via a linear response approach, can be directly derived from the same objective function that is optimized in the optimal transport problem, thus providing a unified perspective on thermodynamic geometries. We investigate this unified geometric framework in two model systems and observe that our procedure for optimizing control protocols is robust beyond linear response.
View details for DOI 10.1103/PhysRevLett.130.107101
View details for PubMedID 36962015
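For context, the "thermodynamic metric" invoked above is, in the standard linear-response construction (background only; the paper derives it instead from an optimal-transport objective), a friction tensor zeta(lambda) over control parameters lambda whose line element approximates the excess work of a slow protocol lambda(t):

W_{\mathrm{ex}} \;\approx\; \int_0^{\tau} \dot{\lambda}(t)^{\top}\, \zeta\big(\lambda(t)\big)\, \dot{\lambda}(t)\, \mathrm{d}t ,
\qquad
\zeta_{ij}(\lambda) = \beta \int_0^{\infty} \big\langle \delta X_i(t)\, \delta X_j(0) \big\rangle_{\lambda}\, \mathrm{d}t ,

where X_i are the forces conjugate to the control parameters and delta X_i their equilibrium fluctuations at fixed lambda. Minimally dissipative protocols are then geodesics of this metric.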
- Trainability and Accuracy of Artificial Neural Networks: An Interacting Particle System Approach
Communications on Pure and Applied Mathematics
2022; 75 (9): 1889-1935
View details for DOI 10.1002/cpa.22074
View details for Web of Science ID 000828449900002
- Physics-informed graph neural networks enhance scalability of variational nonequilibrium optimal control
The Journal of Chemical Physics
2022; 157 (7): 074101
Abstract
When a physical system is driven away from equilibrium, the statistical distribution of its dynamical trajectories informs many of its physical properties. Characterizing the nature of the distribution of dynamical observables, such as a current or entropy production rate, has become a central problem in nonequilibrium statistical mechanics. Asymptotically, for a broad class of observables, the distribution of a given observable satisfies a large deviation principle when the dynamics is Markovian, meaning that fluctuations can be characterized in the long-time limit by computing a scaled cumulant generating function. Calculating this function is not tractable analytically (nor often numerically) for complex, interacting systems, so the development of robust numerical techniques to carry out this computation is needed to probe the properties of nonequilibrium materials. Here, we describe an algorithm that recasts this task as an optimal control problem that can be solved variationally. We solve for optimal control forces using neural network ansätze that are tailored to the physical systems to which the forces are applied. We demonstrate that this approach leads to transferable and accurate solutions in two systems featuring large numbers of interacting particles.
View details for DOI 10.1063/5.0095593
View details for Web of Science ID 000840971900002
View details for PubMedID 35987599
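The scaled cumulant generating function (SCGF) mentioned in the abstract is defined, for a time-averaged observable A_T of a Markov process, as

\psi(\lambda) \;=\; \lim_{T \to \infty} \frac{1}{T} \ln \Big\langle e^{\lambda T A_T} \Big\rangle ,

and its Legendre transform gives the large deviation rate function governing the fluctuations of A_T. The variational reformulation exploited in the paper rests on the standard fact that psi(lambda) can be written as an optimization over controlled ("tilted") dynamics, so a parametrized control force, here a graph neural network, can be trained to approach the optimum; these are only the standard definitions, not the paper's specific ansatz.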
- Adaptive Monte Carlo augmented with normalizing flows.
Proceedings of the National Academy of Sciences of the United States of America
2022; 119 (10): e2109420119
Abstract
SignificanceMonte Carlo methods, tools for sampling data from probability distributions, are widely used in the physical sciences, applied mathematics, and Bayesian statistics. Nevertheless, there are many situations in which it is computationally prohibitive to use Monte Carlo due to slow "mixing" between modes of a distribution unless hand-tuned algorithms are used to accelerate the scheme. Machine learning techniques based on generative models offer a compelling alternative to the challenge of designing efficient schemes for a specific system. Here, we formalize Monte Carlo augmented with normalizing flows and show that, with limited prior data and a physically inspired algorithm, we can substantially accelerate sampling with generative models.
View details for DOI 10.1073/pnas.2109420119
View details for PubMedID 35235453
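The basic move formalized in the paper above can be illustrated with a minimal sketch of Metropolis-Hastings using an independence proposal drawn from a generative model. This is my own illustration, not the paper's code: flow_sample and flow_log_prob are hypothetical stand-ins for a trained normalizing flow, and log_target is the unnormalized log-density being sampled.

import numpy as np

def flow_mcmc_step(x, log_target, flow_sample, flow_log_prob, rng):
    x_new = flow_sample(rng)                        # independent proposal from the flow
    log_alpha = (log_target(x_new) - log_target(x)
                 + flow_log_prob(x) - flow_log_prob(x_new))
    if np.log(rng.uniform()) < log_alpha:           # Metropolis-Hastings acceptance test
        return x_new, True
    return x, False

# Toy usage: target is a standard Gaussian, the "flow" is a slightly mismatched Gaussian.
rng = np.random.default_rng(0)
log_target = lambda x: -0.5 * np.sum(x**2)
flow_sample = lambda rng: 1.2 * rng.standard_normal(2)
flow_log_prob = lambda x: -0.5 * np.sum((x / 1.2)**2) - 2 * np.log(1.2)
x = np.zeros(2)
for _ in range(100):
    x, accepted = flow_mcmc_step(x, log_target, flow_sample, flow_log_prob, rng)

Because the proposal is global rather than local, a well-trained flow lets the chain hop between modes directly, which is the source of the acceleration described in the significance statement.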
- Learning nonequilibrium control forces to characterize dynamical phase transitions
Physical Review E
2022; 105 (2): 024115
Abstract
Sampling the collective, dynamical fluctuations that lead to nonequilibrium pattern formation requires probing rare regions of trajectory space. Recent approaches to this problem, based on importance sampling, cloning, and spectral approximations, have yielded significant insight into nonequilibrium systems but tend to scale poorly with the size of the system, especially near dynamical phase transitions. Here we propose a machine learning algorithm that samples rare trajectories and estimates the associated large deviation functions using a many-body control force by leveraging the flexible function representation provided by deep neural networks, importance sampling in trajectory space, and stochastic optimal control theory. We show that this approach scales to hundreds of interacting particles and remains robust at dynamical phase transitions.
View details for DOI 10.1103/PhysRevE.105.024115
View details for Web of Science ID 000754645400008
View details for PubMedID 35291069
- Remembering the Work of Phillip L. Geissler: A Coda to His Scientific Trajectory.
Annual Review of Physical Chemistry
2022
Abstract
Phillip L. Geissler made important contributions to the statistical mechanics of biological polymers, heterogeneous materials, and chemical dynamics in aqueous environments. He devised analytical and computational methods that revealed the underlying organization of complex systems at the frontiers of biology, chemistry, and materials science. In this retrospective we celebrate his work at these frontiers.
View details for DOI 10.1146/annurev-physchem-101422-030127
View details for PubMedID 36719975
- Probing the theoretical and computational limits of dissipative design.
The Journal of Chemical Physics
2021; 155 (19): 194114
Abstract
Self-assembly, the process by which interacting components form well-defined and often intricate structures, is typically thought of as a spontaneous process arising from equilibrium dynamics. When a system is driven by external nonequilibrium forces, states statistically inaccessible to the equilibrium dynamics can arise, a process sometimes termed direct self-assembly. However, if we fix a given target state and a set of external control variables, it is not well-understood (i) how to design a protocol to drive the system toward the desired state nor (ii) the cost of persistently perturbing the stationary distribution. In this work, we derive a bound that relates the proximity to the chosen target with the dissipation associated with the external drive, showing that high-dimensional external control can guide systems toward a target distribution but with an inevitable cost. Remarkably, the bound holds arbitrarily far from equilibrium. Second, we investigate the performance of deep reinforcement learning algorithms and provide evidence for the realizability of complex protocols that stabilize otherwise inaccessible states of matter.
View details for DOI 10.1063/5.0067695
View details for PubMedID 34800948
- A Dynamical Central Limit Theorem for Shallow Neural Networks
Neural Information Processing Systems (NeurIPS). 2020
View details for Web of Science ID 000627697000073