Learning the action for long-time-step simulations of molecular dynamics
- URL: http://arxiv.org/abs/2508.01068v1
- Date: Fri, 01 Aug 2025 21:01:05 GMT
- Title: Learning the action for long-time-step simulations of molecular dynamics
- Authors: Filippo Bigi, Michele Ceriotti
- Abstract summary: We show that an action-derived ML integrator eliminates the pathological behavior of non-structure-preserving ML predictors. We show that the method can be applied iteratively, serving as a correction to computationally cheaper direct predictors.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The equations of classical mechanics can be used to model the time evolution of countless physical systems, from the astrophysical to the atomic scale. Accurate numerical integration requires small time steps, which limits the computational efficiency -- especially in cases such as molecular dynamics that span wildly different time scales. Using machine-learning (ML) algorithms to predict trajectories allows one to greatly extend the integration time step, at the cost of introducing artifacts such as lack of energy conservation and loss of equipartition between different degrees of freedom of a system. We propose learning data-driven structure-preserving (symplectic and time-reversible) maps to generate long-time-step classical dynamics, showing that this method is equivalent to learning the mechanical action of the system of interest. We show that an action-derived ML integrator eliminates the pathological behavior of non-structure-preserving ML predictors, and that the method can be applied iteratively, serving as a correction to computationally cheaper direct predictors.
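The benefit of structure preservation that the abstract describes can be illustrated with a minimal classical example (a generic sketch of the symplectic/time-reversible property, not the authors' ML method): velocity Verlet is a symplectic, time-reversible map, while explicit Euler is neither, and the difference shows up directly as long-time energy drift for a harmonic oscillator.

```python
def euler_step(q, p, dt, k=1.0):
    # Explicit Euler: not symplectic, not time-reversible; energy drifts.
    return q + dt * p, p - dt * k * q

def verlet_step(q, p, dt, k=1.0):
    # Velocity Verlet: symplectic and time-reversible.
    p_half = p - 0.5 * dt * k * q
    q_new = q + dt * p_half
    return q_new, p_half - 0.5 * dt * k * q_new

def energy(q, p, k=1.0):
    return 0.5 * p * p + 0.5 * k * q * q

def final_energy(step, q=1.0, p=0.0, dt=0.1, n=10000):
    for _ in range(n):
        q, p = step(q, p, dt)
    return energy(q, p)

e0 = energy(1.0, 0.0)               # 0.5 initially
e_euler = final_energy(euler_step)    # blows up exponentially
e_verlet = final_energy(verlet_step)  # stays bounded near e0
```

For the Euler map each step inflates phase-space volume by a factor 1 + dt^2, so the energy grows without bound, whereas the Verlet trajectory conserves a nearby "shadow" energy and its error stays O(dt^2) for arbitrarily long runs; this is the pathology that a symplectic, time-reversible learned map is designed to avoid.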
Related papers
- Enhancing Computational Efficiency in Multiscale Systems Using Deep Learning of Coordinates and Flow Maps [0.0]
This paper showcases how deep learning techniques can be used to develop a precise time-stepping approach for multiscale systems.
The resulting framework achieves state-of-the-art predictive accuracy while incurring lower computational cost.
arXiv Detail & Related papers (2024-04-28T14:05:13Z) - Electronic excited states from physically-constrained machine learning [0.0]
We present an integrated modeling approach, in which a symmetry-adapted ML model of an effective Hamiltonian is trained to reproduce electronic excitations from a quantum-mechanical calculation.
The resulting model can make predictions for molecules that are much larger and more complex than those that it is trained on.
arXiv Detail & Related papers (2023-11-01T20:49:59Z) - Learning Interatomic Potentials at Multiple Scales [1.2162698943818964]
The need to use a short time step is a key limit on the speed of molecular dynamics (MD) simulations.
This work introduces a method to learn a scale separation in complex interatomic interactions by co-training two MLIPs.
arXiv Detail & Related papers (2023-10-20T18:34:32Z) - Stabilizing Machine Learning Prediction of Dynamics: Noise and
Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z) - Dynamic Bayesian Learning for Spatiotemporal Mechanistic Models [5.658544381300127]
We show an approach for Bayesian learning of mechanistic dynamical models. Such learning consists of statistical emulation of the mechanistic system. The emulated learner can then be used to train the system from noisy data.
arXiv Detail & Related papers (2022-08-12T23:17:46Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Using Machine Learning to Anticipate Tipping Points and Extrapolate to
Post-Tipping Dynamics of Non-Stationary Dynamical Systems [0.0]
We consider the machine learning task of predicting tipping point transitions and long-term post-tipping-point behavior.
We investigate the extent to which ML methods are capable of accomplishing useful results for this task, as well as conditions under which they fail.
The main conclusion of this paper is that ML-based approaches are promising tools for predicting the behavior of non-stationary dynamical systems.
arXiv Detail & Related papers (2022-07-01T16:06:12Z) - Structure-Preserving Learning Using Gaussian Processes and Variational
Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
arXiv Detail & Related papers (2021-12-10T11:09:29Z) - Reduced Dynamics of Full Counting Statistics [0.0]
We present a theory of modified reduced dynamics in the presence of counting fields.
We show that the long-lived full counting statistics can be efficiently obtained from the reduced dynamics.
arXiv Detail & Related papers (2021-11-16T14:55:30Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data is noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z) - Fast and differentiable simulation of driven quantum systems [58.720142291102135]
We introduce a semi-analytic method based on the Dyson expansion that allows us to time-evolve driven quantum systems much faster than standard numerical methods.
We show results of the optimization of a two-qubit gate using transmon qubits in the circuit QED architecture.
arXiv Detail & Related papers (2020-12-16T21:43:38Z) - Time-Reversal Symmetric ODE Network [138.02741983098454]
Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well our ordinary differential equation (ODE) networks comply with this time-reversal symmetry.
We show that, even for systems that do not possess the full time-reversal symmetry, TRS-ODENs can achieve better predictive performances over baselines.
arXiv Detail & Related papers (2020-07-22T12:19:40Z)
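The time-reversal-symmetry criterion behind TRS-ODENs can be sketched as a consistency check (a hypothetical minimal version, not the paper's actual loss): step the map forward, flip the momentum, step forward again, flip back, and measure how far the result lands from the starting point. For a time-reversible map this round trip is the identity, so the penalty is (near) zero; for a non-reversible map it is not.

```python
def euler_step(q, p, dt, k=1.0):
    # Explicit Euler: not time-reversible.
    return q + dt * p, p - dt * k * q

def verlet_step(q, p, dt, k=1.0):
    # Velocity Verlet: time-reversible (and symplectic).
    p_half = p - 0.5 * dt * k * q
    q_new = q + dt * p_half
    return q_new, p_half - 0.5 * dt * k * q_new

def reversal_loss(step, q0, p0, dt):
    # With R(q, p) = (q, -p), time-reversal symmetry means
    # R o Phi_dt o R o Phi_dt = identity; penalize the deviation.
    q1, p1 = step(q0, p0, dt)     # forward step
    q2, p2 = step(q1, -p1, dt)    # reverse momentum, step again
    return (q2 - q0) ** 2 + (-p2 - p0) ** 2

loss_euler = reversal_loss(euler_step, 1.0, 0.0, 0.1)    # nonzero
loss_verlet = reversal_loss(verlet_step, 1.0, 0.0, 0.1)  # ~0
```

In a learned setting, `step` would be the ODE network's flow map and this residual would be added to the training loss as a soft time-reversal-symmetry penalty.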
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its contents (including all information) and is not responsible for any consequences.