Neural Differential Algebraic Equations
- URL: http://arxiv.org/abs/2403.12938v1
- Date: Tue, 19 Mar 2024 17:43:57 GMT
- Title: Neural Differential Algebraic Equations
- Authors: James Koch, Madelyn Shapiro, Himanshu Sharma, Draguna Vrabie, Jan Drgona
- Abstract summary: We present Neural Differential-Algebraic Equations (NDAEs) suitable for data-driven modeling of Differential-Algebraic Equations (DAEs).
We show that the proposed NDAEs abstraction is suitable for relevant system-theoretic data-driven modeling tasks.
- Score: 6.100037457394823
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Differential-Algebraic Equations (DAEs) describe the temporal evolution of systems that obey both differential and algebraic constraints. Of particular interest are systems that contain implicit relationships between their components, such as conservation relationships. Here, we present Neural Differential-Algebraic Equations (NDAEs) suitable for data-driven modeling of DAEs. This methodology is built upon the concept of the Universal Differential Equation; that is, a model constructed as a system of Neural Ordinary Differential Equations informed by theory from particular science domains. In this work, we show that the proposed NDAEs abstraction is suitable for relevant system-theoretic data-driven modeling tasks. Presented examples include (i) the inverse problem of tank-manifold dynamics and (ii) discrepancy modeling of a network of pumps, tanks, and pipes. Our experiments demonstrate the proposed method's robustness to noise and extrapolation ability to (i) learn the behaviors of the system components and their interaction physics and (ii) disambiguate between data trends and mechanistic relationships contained in the system.
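To make the semi-explicit structure concrete, the following is a minimal sketch (not the authors' implementation) of an NDAE forward rollout on a toy two-tank/manifold system: the flow coefficient is parameterized by a small neural network, while a known conservation constraint is enforced algebraically at every step. The MLP `k_theta`, the two-tank setup, and all parameter values are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a neural DAE rollout in semi-explicit form (NOT the
# authors' implementation):
#   dh_i/dt = q_i / A_i                    differential states: tank levels h_i
#   q_i     = k_theta(h_i) * (z - h_i)     flow law with a neural-net coefficient
#   0       = q_in - sum_i q_i             known algebraic conservation constraint
# The MLP k_theta, the two-tank setup, and all parameter values are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP mapping a tank level to a positive flow coefficient.
W1, b1 = 0.5 * rng.normal(size=(8, 1)), np.zeros(8)
W2, b2 = 0.5 * rng.normal(size=(1, 8)), np.zeros(1)

def k_theta(h):
    u = np.tanh(W1 @ np.array([h]) + b1)
    return np.log1p(np.exp(W2 @ u + b2)).item()   # softplus keeps k > 0

A = np.array([1.0, 2.0])   # tank cross-sectional areas
q_in = 0.5                 # constant inflow into the shared manifold

def step(h, dt=0.01):
    k = np.array([k_theta(hi) for hi in h])
    # Solve the algebraic constraint 0 = q_in - sum_i k_i * (z - h_i) for the
    # manifold head z; the constraint is linear in z, so it has a closed form.
    z = (q_in + np.dot(k, h)) / k.sum()
    q = k * (z - h)                      # flows delivered to each tank
    return h + dt * q / A, z             # explicit Euler on the ODE part

h = np.array([0.2, 0.1])
for _ in range(500):
    h, z = step(h)
print("tank levels:", h, "manifold head:", z)
```

Fitting such a model to data would amount to adjusting the network parameters so that rollouts match observed trajectories, e.g. by differentiating through the rollout; this is an assumption about usage, not a description of the paper's exact training procedure.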
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both the drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled stochastic differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
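For orientation only (an illustrative assumption, not that paper's estimator), a controlled SDE of the form dX_t = b(X_t, u_t) dt + sigma(X_t, u_t) dW_t can be simulated with the Euler-Maruyama scheme; drift/diffusion estimation then amounts to fitting b and sigma from such sample paths. The drift, diffusion, and control choices below are made up for illustration.

```python
# Euler-Maruyama simulation of a toy controlled SDE
#   dX_t = b(X_t, u_t) dt + sigma(X_t, u_t) dW_t,
# shown only to fix notation for "drift" and "diffusion"; the drift, diffusion,
# and control signal below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def b(x, u):      # drift: mean reversion plus additive control
    return -1.5 * x + u

def sigma(x, u):  # state-dependent (non-uniform) diffusion
    return 0.2 + 0.1 * np.abs(x)

dt, n_steps = 0.01, 1000
x = np.zeros(n_steps + 1)
for k in range(n_steps):
    u = np.sin(0.02 * k)                      # known control input
    dW = rng.normal(scale=np.sqrt(dt))        # Brownian increment
    x[k + 1] = x[k] + b(x[k], u) * dt + sigma(x[k], u) * dW

print("final state:", x[-1])
```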
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Deep Generative Modeling for Identification of Noisy, Non-Stationary Dynamical Systems [3.1484174280822845]
We focus on finding parsimonious ordinary differential equation (ODE) models for nonlinear, noisy, and non-autonomous dynamical systems.
Our method, dynamic SINDy, combines variational inference with SINDy (sparse identification of nonlinear dynamics) to model time-varying coefficients of sparse ODEs.
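As background for the SINDy component (a hedged sketch, not dynamic SINDy itself), plain SINDy fits sparse coefficients of a candidate-function library to estimated derivatives via sequentially thresholded least squares; the toy system, library, and threshold below are illustrative assumptions.

```python
# Sketch of plain SINDy (sparse identification of nonlinear dynamics), the
# building block that dynamic SINDy extends with variational inference to
# handle time-varying coefficients. Toy system and library are illustrative.
import numpy as np

# Simulate a toy 2-D linear system dx/dt = A x and collect data.
A_true = np.array([[-0.1, 2.0], [-2.0, -0.1]])
dt, T = 0.01, 10.0
t = np.arange(0.0, T, dt)
X = np.zeros((len(t), 2))
X[0] = [2.0, 0.0]
for k in range(len(t) - 1):
    X[k + 1] = X[k] + dt * (A_true @ X[k])          # explicit Euler
dX = np.gradient(X, dt, axis=0)                     # numerical derivatives

# Candidate-function library: [1, x1, x2, x1^2, x1*x2, x2^2]
def library(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])

Theta = library(X)

# Sequentially thresholded least squares: dX ~ Theta @ Xi with sparse Xi.
Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
for _ in range(10):
    small = np.abs(Xi) < 0.05                        # sparsity threshold
    Xi[small] = 0.0
    for j in range(dX.shape[1]):
        big = ~small[:, j]
        Xi[big, j] = np.linalg.lstsq(Theta[:, big], dX[:, j], rcond=None)[0]

print(np.round(Xi, 3))   # nonzero entries should recover A_true's structure
```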
arXiv Detail & Related papers (2024-10-02T23:00:00Z) - Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects, e.g., identifiability and the properties of statistical estimation, are still obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
arXiv Detail & Related papers (2022-10-12T06:46:38Z) - D-CIPHER: Discovery of Closed-form Partial Differential Equations [80.46395274587098]
We propose D-CIPHER, which is robust to measurement artifacts and can uncover a new and very general class of differential equations.
We further design a novel optimization procedure, CoLLie, to help D-CIPHER search through this class efficiently.
arXiv Detail & Related papers (2022-06-21T17:59:20Z) - Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
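For context (illustrative notation, not the paper's exact parameterization), recovering a time-domain trajectory from a learned Laplace-domain representation F(s) amounts to a numerical inverse Laplace transform; quadrature of the Bromwich integral turns this into a finite sum of complex exponentials:

```latex
% Bromwich inversion and its quadrature as a sum of complex exponentials;
% F(s) denotes a learned Laplace-domain representation (illustrative notation),
% and s_k, w_k are quadrature nodes and weights along the inversion contour.
x(t) \;=\; \frac{1}{2\pi i}\int_{\sigma - i\infty}^{\sigma + i\infty} F(s)\, e^{st}\, \mathrm{d}s
\;\approx\; \sum_{k=1}^{K} w_k\, F(s_k)\, e^{s_k t}
```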
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - On Neural Differential Equations [13.503274710499971]
In particular, neural differential equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin.
NDEs are suitable for tackling generative problems, dynamical systems, and time series.
NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of available theory on both sides.
arXiv Detail & Related papers (2022-02-04T23:32:29Z) - Continuous Convolutional Neural Networks: Coupled Neural PDE and ODE [1.1897857181479061]
This work proposes a variant of Convolutional Neural Networks (CNNs) that can learn the hidden dynamics of a physical system.
Instead of treating a physical signal such as an image or time series as a stack of multiple layers, this new technique models the system in the form of differential equations (DEs).
arXiv Detail & Related papers (2021-10-30T21:45:00Z) - Multi-objective discovery of PDE systems using evolutionary approach [77.34726150561087]
In the paper, a multi-objective co-evolution algorithm is described.
The individual equations and the system as a whole are evolved simultaneously to discover the system.
In contrast to a single vector equation, a component-wise system is more suitable for expert interpretation and, therefore, for applications.
arXiv Detail & Related papers (2021-03-11T15:37:52Z) - Physics-informed learning of governing equations from scarce data [14.95055620484844]
This work introduces a physics-informed deep learning framework to discover governing partial differential equations (PDEs) from scarce and noisy representation data.
The efficacy and robustness of this method are demonstrated, both numerically and experimentally, on discovering a variety of PDE systems.
The resulting computational framework shows the potential for closed-form model discovery in practical applications.
arXiv Detail & Related papers (2020-05-05T22:13:22Z)