Learning Pair Potentials using Differentiable Simulations
- URL: http://arxiv.org/abs/2209.07679v1
- Date: Fri, 16 Sep 2022 02:36:02 GMT
- Title: Learning Pair Potentials using Differentiable Simulations
- Authors: Wujie Wang, Zhenghao Wu, Rafael Gómez-Bombarelli
- Abstract summary: We propose a general method for learning pair interactions from data using differentiable simulations (DiffSim).
DiffSim defines a loss function based on structural observables, such as the radial distribution function, through molecular dynamics (MD) simulations.
The interaction potentials are then learned directly by gradient descent, using backpropagation to calculate the gradient of the structural loss metric.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning pair interactions from experimental or simulation data is of great
interest for molecular simulations. We propose a general stochastic method for
learning pair interactions from data using differentiable simulations
(DiffSim). DiffSim defines a loss function based on structural observables,
such as the radial distribution function, through molecular dynamics (MD)
simulations. The interaction potentials are then learned directly by stochastic
gradient descent, using backpropagation to calculate the gradient of the
structural loss metric with respect to the interaction potential through the MD
simulation. This gradient-based method is flexible and can be configured to
simulate and optimize multiple systems simultaneously. For example, it is
possible to simultaneously learn potentials for different temperatures or for
different compositions. We demonstrate the approach by recovering simple pair
potentials, such as Lennard-Jones systems, from radial distribution functions.
We find that DiffSim can be used to probe a wider functional space of pair
potentials compared to traditional methods like Iterative Boltzmann Inversion.
We show that our method can be used to simultaneously fit potentials for
simulations at different compositions and temperatures, improving the
transferability of the learned potentials.
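The learning loop the abstract describes can be sketched end to end: run a short MD trajectory under a parameterized pair potential, compute a differentiable structural observable resembling the radial distribution function, and backpropagate the structural loss through the trajectory to the potential parameters. The JAX sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the Lennard-Jones parameterization, the naive Euler integrator (no thermostat), and the Gaussian-smoothed distance histogram standing in for g(r) are all simplifications.

```python
import jax
import jax.numpy as jnp

def pair_energy(r, params):
    # Learnable Lennard-Jones form; params = (epsilon, sigma).
    eps, sigma = params
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 ** 2 - sr6)

def total_energy(positions, params, box):
    # All-pairs minimum-image distances (no neighbor list; small systems only).
    n = positions.shape[0]
    disp = positions[:, None, :] - positions[None, :, :]
    disp -= box * jnp.round(disp / box)
    # Add the identity on the diagonal so r > 0 everywhere and gradients stay finite.
    r = jnp.sqrt(jnp.sum(disp ** 2, axis=-1) + jnp.eye(n))
    mask = 1.0 - jnp.eye(n)
    return 0.5 * jnp.sum(mask * pair_energy(r, params))

def md_step(state, _, params, box, dt=0.005):
    # One naive Euler step; a production run would use velocity Verlet
    # plus a thermostat.
    pos, vel = state
    force = -jax.grad(total_energy)(pos, params, box)
    vel = vel + dt * force
    pos = (pos + dt * vel) % box
    return (pos, vel), None

def soft_rdf(positions, box, r_grid, width=0.1):
    # Gaussian-smoothed pair-distance histogram: a differentiable
    # stand-in for the radial distribution function.
    n = positions.shape[0]
    disp = positions[:, None, :] - positions[None, :, :]
    disp -= box * jnp.round(disp / box)
    r2 = jnp.sum(disp ** 2, axis=-1)
    i, j = jnp.triu_indices(n, k=1)
    r = jnp.sqrt(r2[i, j])
    w = jnp.exp(-((r[:, None] - r_grid[None, :]) ** 2) / (2.0 * width ** 2))
    return w.sum(axis=0) / w.sum()

def structural_loss(params, init_state, g_target, box, r_grid, n_steps=50):
    # Simulate, then score the final structure against the target observable.
    step = lambda state, x: md_step(state, x, params, box)
    (pos, _), _ = jax.lax.scan(step, init_state, None, length=n_steps)
    return jnp.sum((soft_rdf(pos, box, r_grid) - g_target) ** 2)

# Gradient of the structural loss w.r.t. the potential parameters,
# backpropagated through every step of the simulation:
grad_fn = jax.grad(structural_loss)
```

Feeding `grad_fn` into any stochastic optimizer gives the gradient-descent loop the abstract describes; because the loss is just a function of parameters, multiple systems (temperatures, compositions) can be optimized simultaneously by summing their losses.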
Related papers
- A Multi-Grained Symmetric Differential Equation Model for Learning Protein-Ligand Binding Dynamics [73.35846234413611]
In drug discovery, molecular dynamics (MD) simulation provides a powerful tool for predicting binding affinities, estimating transport properties, and exploring pocket sites.
We propose NeuralMD, the first machine learning (ML) surrogate that can facilitate numerical MD and provide accurate simulations in protein-ligand binding dynamics.
We demonstrate the efficiency and effectiveness of NeuralMD, achieving over 1000× speedup compared to standard numerical MD simulations.
arXiv Detail & Related papers (2024-01-26T09:35:17Z)
- Simulation-based inference using surjective sequential neural likelihood estimation [50.24983453990065]
Surjective Sequential Neural Likelihood estimation is a novel method for simulation-based inference.
By embedding the data in a low-dimensional space, SSNL solves several issues previous likelihood-based methods had when applied to high-dimensional data sets.
arXiv Detail & Related papers (2023-08-02T10:02:38Z)
- Machine learning of hidden variables in multiscale fluid simulation [77.34726150561087]
Solving fluid dynamics equations often requires the use of closure relations that account for missing microphysics.
In our study, a partial differential equation simulator that is end-to-end differentiable is used to train judiciously placed neural networks.
We show that this method enables an equation-based approach to reproduce non-linear, large-Knudsen-number plasma physics.
arXiv Detail & Related papers (2023-06-19T06:02:53Z)
- Improving Gradient Computation for Differentiable Physics Simulation with Contacts [10.450509067356148]
We study differentiable rigid-body simulation with contacts.
We propose to improve gradient computation through continuous collision detection, leveraging the time-of-impact (TOI).
We show that with TOI-Ve, we are able to learn an optimal control sequence that matches the analytical solution.
arXiv Detail & Related papers (2023-04-28T21:10:16Z)
- Physics-constrained neural differential equations for learning multi-ionic transport [0.0]
We develop the first physics-informed deep learning model to learn ion transport behaviour across polyamide nanopores.
We use neural differential equations in conjunction with classical closure models as inductive biases directly into the neural framework.
arXiv Detail & Related papers (2023-03-07T17:18:52Z)
- Maximum Likelihood Learning of Unnormalized Models for Simulation-Based Inference [44.281860162298564]
We introduce two synthetic likelihood methods for Simulation-Based Inference.
We learn a conditional energy-based model (EBM) of the likelihood using synthetic data generated by the simulator.
We demonstrate the properties of both methods on a range of synthetic datasets, and apply them to a model of a neural network in the crab.
arXiv Detail & Related papers (2022-10-26T14:38:24Z)
- Differentiable Agent-based Epidemiology [71.81552021144589]
We introduce GradABM: a scalable, differentiable design for agent-based modeling that is amenable to gradient-based learning with automatic differentiation.
GradABM can quickly simulate million-size populations in a few seconds on commodity hardware, integrate with deep neural networks, and ingest heterogeneous data sources.
arXiv Detail & Related papers (2022-07-20T07:32:02Z)
- Neural Posterior Estimation with Differentiable Simulators [58.720142291102135]
We present a new method to perform Neural Posterior Estimation (NPE) with a differentiable simulator.
We demonstrate how gradient information helps constrain the shape of the posterior and improves sample-efficiency.
arXiv Detail & Related papers (2022-07-12T16:08:04Z)
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
- Data-Driven Discovery of Coarse-Grained Equations [0.0]
Multiscale modeling and simulations are two areas where learning on simulated data can lead to such discovery.
We replace the human discovery of such models with a machine-learning strategy based on sparse regression that can be executed in two modes.
A series of examples demonstrates the accuracy, robustness, and limitations of our approach to equation discovery.
arXiv Detail & Related papers (2020-01-30T23:41:37Z)
- Simulation Assisted Likelihood-free Anomaly Detection [3.479254848034425]
This paper introduces a hybrid method that makes the best of both approaches to model-independent searches.
For potential signals that are resonant in one known feature, this new method first learns a parameterized reweighting function to morph a given simulation to match the data in sidebands.
The background estimation from the reweighted simulation allows for non-trivial correlations between features used for classification and the resonant feature.
arXiv Detail & Related papers (2020-01-14T19:00:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.