Constrained Neural Ordinary Differential Equations with Stability
Guarantees
- URL: http://arxiv.org/abs/2004.10883v1
- Date: Wed, 22 Apr 2020 22:07:57 GMT
- Title: Constrained Neural Ordinary Differential Equations with Stability
Guarantees
- Authors: Aaron Tuor, Jan Drgona, Draguna Vrabie
- Abstract summary: We show how to model discrete ordinary differential equations with algebraic nonlinearities as deep neural networks.
We derive the stability guarantees of the network layers based on the implicit constraints imposed on the weights' eigenvalues.
We demonstrate the prediction accuracy of learned neural ODEs evaluated on open-loop simulations.
- Score: 1.1086440815804224
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Differential equations are frequently used in engineering domains, such as
modeling and control of industrial systems, where safety and performance
guarantees are of paramount importance. Traditional physics-based modeling
approaches require domain expertise and are often difficult to tune or adapt to
new systems. In this paper, we show how to model discrete ordinary differential
equations (ODE) with algebraic nonlinearities as deep neural networks with
varying degrees of prior knowledge. We derive the stability guarantees of the
network layers based on the implicit constraints imposed on the weights'
eigenvalues. Moreover, we show how to use barrier methods to generically handle
additional inequality constraints. We demonstrate the prediction accuracy of
learned neural ODEs evaluated on open-loop simulations compared to ground truth
dynamics with bi-linear terms.
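As a rough illustration of the eigenvalue-constraint idea (a minimal sketch, not the paper's exact parameterization): bounding the singular values of a layer's weight matrix also bounds its eigenvalue magnitudes, which keeps the discrete linear map stable.

```python
import numpy as np

def clip_spectral_norm(W, s_max=0.9):
    """Rescale W so its largest singular value is at most s_max.

    Bounding the singular values also bounds the eigenvalue magnitudes,
    so the discrete linear map x_{k+1} = W x_k is guaranteed stable.
    """
    s = np.linalg.svd(W, compute_uv=False)[0]
    return W * (s_max / s) if s > s_max else W

rng = np.random.default_rng(0)
W = clip_spectral_norm(rng.normal(size=(4, 4)))
x = rng.normal(size=4)
for _ in range(200):
    x = W @ x                     # contracts by at least s_max per step
print(np.linalg.norm(x) < 1e-3)   # True: the trajectory decays to the origin
```

The same rescaling can be applied after each gradient step, or built into the parameterization itself so the constraint holds throughout training.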
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for L^2, L^infinity, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z)
- Projected Neural Differential Equations for Learning Constrained Dynamics [3.570367665112327]
We introduce a new method for constraining neural differential equations based on projection of the learned vector field to the tangent space of the constraint manifold.
PNDEs outperform existing methods while requiring fewer hyperparameters.
The proposed approach demonstrates significant potential for enhancing the modeling of constrained dynamical systems.
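A minimal sketch of the projection idea, assuming a unit-circle constraint g(x) = |x|^2 - 1 and a stand-in vector field (hypothetical names, not the paper's code): subtracting the component of the learned field along the constraint gradient leaves a field tangent to the constraint manifold.

```python
import numpy as np

def project_to_tangent(f, grad_g):
    # Remove the component of f along the constraint gradient, leaving
    # a field tangent to the manifold {x : g(x) = 0}.
    n = grad_g / np.linalg.norm(grad_g)
    return f - np.dot(f, n) * n

x = np.array([0.6, 0.8])         # a point on the unit circle, g(x) = 0
f = np.array([1.0, -0.5])        # stand-in for a learned vector field
f_proj = project_to_tangent(f, 2 * x)   # grad g(x) = 2x
print(abs(np.dot(f_proj, x)))    # ~0: the projected field is tangent
```

Integrating the projected field keeps trajectories on the manifold up to discretization error, without tuning a penalty weight.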
arXiv Detail & Related papers (2024-10-31T06:32:43Z)
- Graph Neural PDE Solvers with Conservation and Similarity-Equivariance [6.077284832583712]
This study introduces a novel machine-learning architecture that is highly generalizable and adheres to conservation laws and physical symmetries.
The foundation of this architecture is graph neural networks (GNNs), which are adept at accommodating a variety of shapes and forms.
arXiv Detail & Related papers (2024-05-25T11:18:27Z)
- Stabilized Neural Differential Equations for Learning Dynamics with Explicit Constraints [4.656302602746229]
We propose stabilized neural differential equations (SNDEs) to enforce arbitrary manifold constraints for neural differential equations.
Our approach is based on a stabilization term that, when added to the original dynamics, renders the constraint manifold provably stable.
Due to its simplicity, our method is compatible with all common neural differential equation (NDE) models and broadly applicable.
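A toy sketch of the stabilization-term idea under assumed names (unit-circle constraint, a rotation as the stand-in "learned" field): adding a correction proportional to the constraint violation makes the manifold attractive, so trajectories started off the manifold converge onto it.

```python
import numpy as np

def stabilized_field(f_x, x, g, grad_g, gamma=5.0):
    # Augment the learned dynamics with a term that makes the
    # constraint manifold {x : g(x) = 0} attractive.
    return f_x - gamma * grad_g(x) * g(x)

g = lambda x: x @ x - 1.0               # unit-circle constraint
grad_g = lambda x: 2.0 * x
f = lambda x: np.array([-x[1], x[0]])   # stand-in "learned" rotation field

x = np.array([1.5, 0.0])                # start off the manifold
dt = 0.01
for _ in range(1000):
    x = x + dt * stabilized_field(f(x), x, g, grad_g)
print(abs(g(x)) < 1e-2)                 # True: state pulled onto the circle
```

Unlike a hard projection, this only needs the extra term in the vector field, which is why it composes with any NDE model and off-the-shelf solvers.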
arXiv Detail & Related papers (2023-06-16T10:16:59Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Generalized Neural Closure Models with Interpretability [28.269731698116257]
We develop a novel and versatile methodology of unified neural partial delay differential equations.
We augment existing/low-fidelity dynamical models directly in their partial differential equation (PDE) forms with both Markovian and non-Markovian neural network (NN) closure parameterizations.
We demonstrate the new generalized neural closure models (gnCMs) framework using four sets of experiments based on advecting nonlinear waves, shocks, and ocean acidification models.
arXiv Detail & Related papers (2023-01-15T21:57:43Z)
- Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects such as identifiability and the properties of statistical estimation remain obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
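To make the setting concrete, a small sketch (my own construction, not the paper's algorithm): for error-free, equally spaced samples of a linear system dx/dt = Ax, the one-step map is exp(A*dt), so A can be recovered by a least-squares fit of the transition matrix followed by a matrix logarithm — provided the trajectory excites all directions and the principal log is unambiguous.

```python
import numpy as np
from scipy.linalg import expm, logm

# Recover A of dx/dt = A x from equally spaced, error-free samples of a
# single trajectory: x_{k+1} = expm(A*dt) @ x_k.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
dt = 0.1
X = [np.array([1.0, 0.0])]
for _ in range(10):
    X.append(expm(A * dt) @ X[-1])
X = np.array(X)

# Fit the one-step map M with X[:-1] @ M.T = X[1:], then invert the
# matrix exponential via the principal matrix logarithm.
M = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T
A_hat = logm(M).real / dt
print(np.allclose(A_hat, A, atol=1e-6))   # True for this well-posed case
```

The principal-log step is exactly where identifiability can fail: if A*dt has eigenvalues with imaginary parts at or beyond pi, multiple generators produce the same one-step map.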
arXiv Detail & Related papers (2022-10-12T06:46:38Z)
- On the Forward Invariance of Neural ODEs [92.07281135902922]
We propose a new method to ensure neural ordinary differential equations (ODEs) satisfy output specifications.
Our approach uses a class of control barrier functions to transform output specifications into constraints on the parameters and inputs of the learning system.
arXiv Detail & Related papers (2022-10-10T15:18:28Z)
- Structure-Preserving Learning Using Gaussian Processes and Variational Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
arXiv Detail & Related papers (2021-12-10T11:09:29Z)
- OnsagerNet: Learning Stable and Interpretable Dynamics using a Generalized Onsager Principle [19.13913681239968]
We learn stable and physically interpretable dynamical models using sampled trajectory data from physical processes based on a generalized Onsager principle.
We further apply this method to study Rayleigh-Benard convection and learn Lorenz-like low dimensional autonomous reduced order models.
arXiv Detail & Related papers (2020-09-06T07:30:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.