Stabilized Neural Differential Equations for Learning Dynamics with
Explicit Constraints
- URL: http://arxiv.org/abs/2306.09739v3
- Date: Thu, 15 Feb 2024 16:47:31 GMT
- Title: Stabilized Neural Differential Equations for Learning Dynamics with
Explicit Constraints
- Authors: Alistair White, Niki Kilbertus, Maximilian Gelbrecht, Niklas Boers
- Abstract summary: We propose stabilized neural differential equations (SNDEs) to enforce arbitrary manifold constraints for neural differential equations.
Our approach is based on a stabilization term that, when added to the original dynamics, renders the constraint manifold provably asymptotically stable.
Due to its simplicity, our method is compatible with all common neural differential equation (NDE) models and broadly applicable.
- Score: 4.656302602746229
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many successful methods to learn dynamical systems from data have recently
been introduced. However, ensuring that the inferred dynamics preserve known
constraints, such as conservation laws or restrictions on the allowed system
states, remains challenging. We propose stabilized neural differential
equations (SNDEs), a method to enforce arbitrary manifold constraints for
neural differential equations. Our approach is based on a stabilization term
that, when added to the original dynamics, renders the constraint manifold
provably asymptotically stable. Due to its simplicity, our method is compatible
with all common neural differential equation (NDE) models and broadly
applicable. In extensive empirical evaluations, we demonstrate that SNDEs
outperform existing methods while broadening the types of constraints that can
be incorporated into NDE training.
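To make the construction concrete, here is a minimal sketch in Python, assuming a constraint function c(x) = 0 with Jacobian Jc, a scalar gain gamma, and the particular choice F(x) = Jc(x)^T; the paper's stabilization term is more general, so treat this as an illustration rather than the exact method.

```python
import numpy as np
from scipy.integrate import solve_ivp

def stabilized_rhs(f, c, jac_c, gamma=1.0):
    """Wrap a (learned) vector field f with a stabilization term.

    Sketch of the SNDE idea from the abstract: for a constraint c(x) = 0,
    the modified dynamics
        x_dot = f(x) - gamma * Jc(x).T @ c(x)
    drive trajectories back toward the constraint manifold. The choice
    F(x) = Jc(x).T and the scalar gain gamma are illustrative assumptions.
    """
    def rhs(t, x):
        return f(t, x) - gamma * jac_c(x).T @ c(x)
    return rhs

# Toy example: keep trajectories on the unit circle, c(x) = |x|^2 - 1.
c = lambda x: np.array([x @ x - 1.0])
jac_c = lambda x: 2.0 * x[None, :]           # shape (1, 2)
f = lambda t, x: np.array([-x[1], x[0]])     # stand-in for a learned field

sol = solve_ivp(stabilized_rhs(f, c, jac_c, gamma=5.0),
                (0.0, 10.0), np.array([1.2, 0.0]), rtol=1e-8)
print(abs(sol.y[:, -1] @ sol.y[:, -1] - 1.0))  # distance to manifold ~ 0
```

On the manifold c(x) = 0 the extra term vanishes, so the learned dynamics are unchanged there; off the manifold, d/dt c = -gamma * Jc Jc^T c for this toy field, which decays whenever Jc Jc^T is positive definite.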
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
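The model class at issue can be illustrated with a short Euler-Maruyama simulation; the drift b, diffusion sigma, and control u below are hypothetical stand-ins (the open-source library is not named in the abstract, so none of its API is assumed here).

```python
import numpy as np

def euler_maruyama(b, sigma, u, x0, dt=1e-3, n_steps=5000, seed=0):
    """Simulate dX_t = b(X_t, u_t) dt + sigma(X_t) dW_t with Euler-Maruyama.

    This only illustrates the controlled-SDE model class; estimating b and
    sigma from such trajectories is what the cited paper addresses.
    """
    rng = np.random.default_rng(seed)
    x = np.empty((n_steps + 1, len(x0)))
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=len(x0))
        x[k + 1] = x[k] + b(x[k], u(k * dt)) * dt + sigma(x[k]) * dw
    return x

# Hypothetical 1-D example with state-dependent (non-uniform) diffusion.
b = lambda x, u: -x + u                 # linear drift plus additive control
sigma = lambda x: 0.1 * (1.0 + x**2)    # non-uniform, state-dependent noise
u = lambda t: np.sin(t)                 # known control input
traj = euler_maruyama(b, sigma, u, x0=np.array([0.5]))
```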
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Projected Neural Differential Equations for Learning Constrained Dynamics [3.570367665112327]
We introduce a new method for constraining neural differential equations, based on projecting the learned vector field onto the tangent space of the constraint manifold.
Projected neural differential equations (PNDEs) outperform existing methods while requiring fewer hyperparameters.
The proposed approach demonstrates significant potential for enhancing the modeling of constrained dynamical systems.
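One way to realize such a projection, sketched under the assumption of a constraint c(x) = 0 with full-rank Jacobian J = dc/dx (the paper's exact operator may differ), is the orthogonal projector P = I - J^T (J J^T)^{-1} J applied to the learned vector field:

```python
import numpy as np

def project_to_tangent(f_x, J):
    """Project a vector field value onto the tangent space of {x : c(x) = 0},
    where J = dc/dx at x: f_proj = f - J^T (J J^T)^{-1} J f. After projection,
    c is constant along trajectories, so the manifold is preserved.
    """
    return f_x - J.T @ np.linalg.solve(J @ J.T, J @ f_x)

# Unit-circle constraint c(x) = |x|^2 - 1, so J = 2 x^T at the point x.
x = np.array([0.8, 0.6])
J = 2.0 * x[None, :]
f_x = np.array([1.0, 1.0])       # stand-in for a learned vector field value
f_proj = project_to_tangent(f_x, J)
print(J @ f_proj)                # ~ 0: the projected field is tangent
```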
arXiv Detail & Related papers (2024-10-31T06:32:43Z) - Deep Generative Modeling for Identification of Noisy, Non-Stationary Dynamical Systems [3.1484174280822845]
We focus on finding parsimonious ordinary differential equation (ODE) models for nonlinear, noisy, and non-autonomous dynamical systems.
Our method, dynamic SINDy, combines variational inference with SINDy (sparse identification of nonlinear dynamics) to model time-varying coefficients of sparse ODEs.
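Dynamic SINDy itself adds variational inference on top of SINDy to obtain time-varying coefficients; the sketch below shows only the plain SINDy building block, using the pysindy library on simulated Lorenz data (an autonomous stand-in for the harder non-autonomous setting the paper targets).

```python
import numpy as np
import pysindy as ps
from scipy.integrate import solve_ivp

# Simulate lightly noisy Lorenz data.
def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t = np.arange(0.0, 10.0, 0.002)
X = solve_ivp(lorenz, (t[0], t[-1]), [1.0, 1.0, 1.0], t_eval=t).y.T
X += 1e-3 * np.random.default_rng(0).normal(size=X.shape)

# Plain SINDy: sparse regression of x_dot onto a polynomial feature library.
model = ps.SINDy(optimizer=ps.STLSQ(threshold=0.1),
                 feature_library=ps.PolynomialLibrary(degree=2))
model.fit(X, t=t)
model.print()   # recovers the sparse Lorenz coefficients
```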
arXiv Detail & Related papers (2024-10-02T23:00:00Z) - Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS).
Our method builds on previous work modeling the latent state evolution via a differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs).
Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
arXiv Detail & Related papers (2024-07-19T15:32:15Z) - Guaranteed Conservation of Momentum for Learning Particle-based Fluid
Dynamics [96.9177297872723]
We present a novel method for guaranteeing conservation of linear momentum in learned physics simulations.
We enforce conservation of momentum with a hard constraint, which we realize via antisymmetrical continuous convolutional layers.
In combination, the proposed method allows us to increase the physical accuracy of the learned simulator substantially.
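Why antisymmetry yields a hard momentum constraint can be seen in a few lines: if every pairwise contribution satisfies k(x_i - x_j) = -k(x_j - x_i), the pairs cancel in the total, so (with unit masses) the net momentum change is exactly zero no matter what is learned. The radial kernel below is a hypothetical stand-in for the paper's antisymmetrical continuous convolutions.

```python
import numpy as np

def antisymmetric_accelerations(pos, phi):
    """Pairwise accelerations a_i = sum_j k(x_i - x_j) with the odd kernel
    k(d) = phi(|d|) * d, so k(-d) = -k(d). Every pair (i, j) cancels in the
    total, so linear momentum (unit masses assumed) is conserved exactly,
    for any learned radial profile phi.
    """
    diff = pos[:, None, :] - pos[None, :, :]        # (N, N, D) displacements
    r = np.linalg.norm(diff, axis=-1, keepdims=True)
    return (phi(r) * diff).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.normal(size=(64, 3))
phi = lambda r: np.exp(-r**2)            # stand-in for a learned network
acc = antisymmetric_accelerations(pos, phi)
print(np.abs(acc.sum(axis=0)).max())     # ~ 1e-15: zero net momentum change
```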
arXiv Detail & Related papers (2022-10-12T09:12:59Z) - Decimation technique for open quantum systems: a case study with
driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of the (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
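The claim that message passing representationally contains finite differences can be made concrete: on a 1-D grid graph, a one-hop message pass with the hand-set message m_ij = (u_j - u_i)/dx^2 reproduces the standard Laplacian stencil. The neural solver learns the message and update functions instead; the sketch below is only this fixed-stencil special case.

```python
import numpy as np

def message_passing_step(u, dx, message, aggregate):
    """One message-passing update on a periodic 1-D grid graph, where each
    node exchanges messages with its two neighbors. A learned solver would
    replace `message` and `aggregate` with neural networks.
    """
    left, right = np.roll(u, 1), np.roll(u, -1)
    msgs = message(u, left, dx) + message(u, right, dx)
    return aggregate(u, msgs)

# Fixed-stencil special case: heat equation u_t = u_xx with explicit Euler.
dx, dt = 0.05, 1e-4
message = lambda u, v, dx: (v - u) / dx**2   # sums to the FD Laplacian
aggregate = lambda u, m: u + dt * m          # explicit Euler node update
u = np.exp(-((np.arange(0.0, 1.0, dx) - 0.5) ** 2) / 0.01)
for _ in range(1000):
    u = message_passing_step(u, dx, message, aggregate)
```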
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Structure-Preserving Learning Using Gaussian Processes and Variational
Integrators [62.31425348954686]
We propose combining a variational integrator for the nominal dynamics of a mechanical system with Gaussian process regression for learning the residual dynamics.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
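A minimal sketch of the split between a structure-preserving nominal integrator and a learned residual, with two stated substitutions: symplectic Euler stands in for the paper's variational integrator, and a fixed damping term stands in for the Gaussian-process residual model.

```python
import numpy as np

def symplectic_euler_step(q, p, dt, grad_V, residual):
    """One structure-preserving step for H = p^2/2 + V(q) (unit mass), with
    an additive residual force. Symplectic Euler is used here in place of a
    variational integrator; `residual` is a placeholder for a GP posterior
    mean over the unmodeled dynamics.
    """
    p_new = p + dt * (-grad_V(q) + residual(q, p))
    q_new = q + dt * p_new
    return q_new, p_new

# Pendulum nominal dynamics V(q) = -cos(q), plus a small damping residual.
grad_V = lambda q: np.sin(q)
residual = lambda q, p: -0.05 * p        # stand-in for a learned GP mean
q, p = 1.0, 0.0
for _ in range(10_000):
    q, p = symplectic_euler_step(q, p, 1e-3, grad_V, residual)
```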
arXiv Detail & Related papers (2021-12-10T11:09:29Z) - Training Generative Adversarial Networks by Solving Ordinary
Differential Equations [54.23691425062034]
We study the continuous-time dynamics induced by GAN training.
From this perspective, we hypothesise that instabilities in training GANs arise from the integration error.
We experimentally verify that well-known ODE solvers (such as Runge-Kutta) can stabilise training.
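Under this reading, simultaneous gradient descent is explicit Euler on the joint gradient field, and a higher-order solver integrates the same field with much smaller truncation error. A toy sketch on the bilinear game min_x max_y x*y, whose Euler iterates famously spiral outward (Heun's method, a second-order Runge-Kutta scheme, stands in for the solvers studied in the paper):

```python
import numpy as np

def gan_vector_field(z):
    """Joint gradient field of the bilinear game min_x max_y x*y under
    simultaneous gradient descent/ascent: x_dot = -y, y_dot = x."""
    x, y = z
    return np.array([-y, x])

def euler_step(z, h):
    return z + h * gan_vector_field(z)

def heun_step(z, h):
    """Heun's method (RK2): per-step error O(h^3) instead of O(h^2)."""
    k1 = gan_vector_field(z)
    k2 = gan_vector_field(z + h * k1)
    return z + 0.5 * h * (k1 + k2)

z_euler = z_heun = np.array([1.0, 0.0])
for _ in range(1000):
    z_euler, z_heun = euler_step(z_euler, 0.1), heun_step(z_heun, 0.1)
print(np.linalg.norm(z_euler))   # ~ 145: Euler spirals outward
print(np.linalg.norm(z_heun))    # ~ 1.01: RK2 stays near the true orbit
```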
arXiv Detail & Related papers (2020-10-28T15:23:49Z) - Constrained Neural Ordinary Differential Equations with Stability
Guarantees [1.1086440815804224]
We show how to model discrete ordinary differential equations with algebraic nonlinearities as deep neural networks.
We derive stability guarantees for the network layers based on the implicit constraints imposed on the weights' eigenvalues.
We demonstrate the prediction accuracy of learned neural ODEs evaluated on open-loop simulations.
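A hypothetical construction illustrating eigenvalue-constrained weights (not the paper's exact factorization): build W = Q diag(lam) Q^T with Q orthogonal and each lam squashed into a chosen band by a sigmoid, so the spectrum is confined by construction; keeping it inside the unit disc yields a contractive, hence stable, linear map.

```python
import numpy as np

def bounded_eigenvalue_weight(raw_lam, raw_Q, lam_min=0.2, lam_max=0.9):
    """Parameterize a weight matrix whose eigenvalues lie in [lam_min,
    lam_max] by construction: W = Q diag(lam) Q^T with Q orthogonal.
    A hypothetical scheme illustrating implicit eigenvalue constraints.
    """
    lam = lam_min + (lam_max - lam_min) / (1.0 + np.exp(-raw_lam))
    Q, _ = np.linalg.qr(raw_Q)              # orthogonal factor
    return Q @ np.diag(lam) @ Q.T

rng = np.random.default_rng(0)
W = bounded_eigenvalue_weight(rng.normal(size=4), rng.normal(size=(4, 4)))
print(np.sort(np.linalg.eigvalsh(W)))      # all eigenvalues in [0.2, 0.9]
```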
arXiv Detail & Related papers (2020-04-22T22:07:57Z)