Embedding Capabilities of Neural ODEs
- URL: http://arxiv.org/abs/2308.01213v2
- Date: Thu, 28 Sep 2023 08:52:11 GMT
- Title: Embedding Capabilities of Neural ODEs
- Authors: Christian Kuehn and Sara-Viola Kuntz
- Abstract summary: We study input-output relations of neural ODEs using dynamical systems theory.
We prove several results about the exact embedding of maps in different neural ODE architectures in low and high dimension.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A class of neural networks that has gained particular interest in
recent years is neural ordinary differential equations (neural ODEs). We study input-output
relations of neural ODEs using dynamical systems theory and prove several
results about the exact embedding of maps in different neural ODE architectures
in low and high dimension. The embedding capability of a neural ODE
architecture can be increased by, for example, adding a linear layer or
augmenting the phase space. Yet, there is currently no systematic theory
available, and our work contributes towards this goal by developing various
embedding results as well as identifying situations where no embedding is
possible. The mathematical techniques used include, as main components,
iterative functional equations, Morse functions, and suspension flows, as well
as several further ideas from analysis. Although in practice mainly universal
approximation theorems are used, our geometric dynamical systems viewpoint on
universal embedding provides a fundamental understanding of why certain neural
ODE architectures perform better than others.
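A minimal sketch of the architecture family compared above, assuming a fixed-step Euler integrator, random toy weights, and illustrative dimensions (none of this is the paper's code): the input is augmented with an extra phase-space coordinate, flowed along a learned vector field, and read out by a linear layer.
```python
import numpy as np

rng = np.random.default_rng(0)

# Vector field f(h) of the neural ODE: a one-hidden-layer tanh network on the
# augmented phase space R^2 (1 input dimension + 1 augmented dimension).
W1, b1 = rng.normal(size=(16, 2)), np.zeros(16)
W2, b2 = rng.normal(size=(2, 16)) / np.sqrt(16.0), np.zeros(2)

def f(h):
    return W2 @ np.tanh(W1 @ h + b1) + b2

def neural_ode_map(x, T=1.0, steps=100):
    """Input-output map L ∘ Phi_T ∘ A: augment, flow, linear readout."""
    h = np.array([x, 0.0])         # A: embed the input into R^2 (augmentation)
    dt = T / steps
    for _ in range(steps):         # Phi_T: explicit Euler steps of h' = f(h)
        h = h + dt * f(h)
    w_out = np.array([1.0, -0.5])  # L: linear layer appended after the flow
    return float(w_out @ h)

print(neural_ode_map(0.7))
```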
Related papers
- Interpretable Spatio-Temporal Embedding for Brain Structural-Effective Network with Ordinary Differential Equation [56.34634121544929]
In this study, we first construct the brain-effective network via the dynamic causal model.
We then introduce an interpretable graph learning framework termed Spatio-Temporal Embedding ODE (STE-ODE).
This framework incorporates specifically designed directed node embedding layers, aiming at capturing the dynamic interplay between structural and effective networks.
arXiv Detail & Related papers (2024-05-21T20:37:07Z)
- Neural Fractional Differential Equations [2.812395851874055]
Fractional Differential Equations (FDEs) are essential tools for modelling complex systems in science and engineering.
We propose the Neural FDE, a novel deep neural network architecture that adjusts an FDE to the dynamics of the data.
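For reference, one classical discretization of a fractional derivative is the Grünwald-Letnikov sum; the NumPy sketch below (a toy, not necessarily the scheme inside the Neural FDE) shows how the derivative order alpha enters as a continuous parameter that such a model can adjust.
```python
import numpy as np

def gl_fractional_derivative(y, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha derivative of a
    uniformly sampled signal y (step h), evaluated at the last sample:
        D^alpha y(t_n) ≈ h^(-alpha) * sum_k w_k * y(t_n - k*h),
    with weights w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    w = np.empty(len(y))
    w[0] = 1.0
    for k in range(1, len(y)):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return (w @ y[::-1]) / h**alpha

# Sanity check: for alpha = 1 the sum collapses to a backward difference.
t = np.linspace(0.0, 1.0, 201)
print(gl_fractional_derivative(t**2, alpha=1.0, h=t[1] - t[0]))  # ≈ d/dt t^2 = 2
```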
arXiv Detail & Related papers (2024-03-05T07:45:29Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
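The reduction this builds on can be sketched in a few lines, here with an illustrative trapezoidal discretization and SciPy's off-the-shelf LP solver (NeuRLP itself is a relaxed, differentiable variant): the samples of the ODE solution become decision variables tied together by linear equality constraints.
```python
import numpy as np
from scipy.optimize import linprog

# Solve y' = a*y, y(0) = 1 on [0, T] as a linear program: trapezoidal
# discretization turns the ODE into linear equality constraints in y_0..y_n.
a, T, n = -1.0, 1.0, 100
dt = T / n
A_eq = np.zeros((n + 1, n + 1))
b_eq = np.zeros(n + 1)
A_eq[0, 0], b_eq[0] = 1.0, 1.0              # initial condition y_0 = 1
for k in range(n):
    # y_{k+1} - y_k = dt * a * (y_k + y_{k+1}) / 2
    A_eq[k + 1, k] = -1.0 - dt * a / 2.0
    A_eq[k + 1, k + 1] = 1.0 - dt * a / 2.0

res = linprog(c=np.zeros(n + 1), A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * (n + 1), method="highs")
print(res.x[-1], np.exp(a * T))             # both ≈ e^{-1} ≈ 0.3679
```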
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Uncertainty and Structure in Neural Ordinary Differential Equations [28.12033356095061]
We show that basic and lightweight Bayesian deep learning techniques like the Laplace approximation can be applied to neural ODEs.
We explore how mechanistic knowledge and uncertainty quantification interact on two recently proposed neural ODE frameworks.
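In its simplest form, the Laplace approximation fits a Gaussian at the MAP estimate whose covariance is the inverse Hessian of the negative log-posterior. A minimal scalar sketch with a toy conjugate-Gaussian model of our choosing (not the neural ODE frameworks studied in the paper):
```python
import numpy as np

def neg_log_posterior(w, data, prior_prec=1.0):
    # Toy model: observations = w + unit Gaussian noise, Gaussian prior on w.
    return 0.5 * np.sum((data - w) ** 2) + 0.5 * prior_prec * w**2

data = np.array([0.9, 1.1, 1.0, 1.2])
w_map = data.sum() / (len(data) + 1.0)  # MAP, closed form for this objective
hessian = len(data) + 1.0               # d^2/dw^2 of the negative log-posterior
sigma2 = 1.0 / hessian                  # Laplace posterior variance
print(w_map, sigma2)                    # approximate posterior N(w_map, sigma2)
```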
arXiv Detail & Related papers (2023-05-22T17:50:42Z)
- Interpretable Polynomial Neural Ordinary Differential Equations [3.04585143845864]
We introduce the polynomial neural ODE, which is a deep polynomial neural network inside the neural ODE framework.
We demonstrate the capability of polynomial neural ODEs to predict outside of the training region, as well as to perform direct symbolic regression without additional tools such as SINDy.
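A minimal sketch of the idea with a one-dimensional, hand-picked polynomial right-hand side (illustrative values, not the paper's implementation): because the learned vector field is itself a polynomial, the trained model can be read off as a symbolic expression.
```python
import numpy as np

# Polynomial neural ODE in 1D: dx/dt = c0 + c1*x + c2*x^2 + c3*x^3, where the
# coefficients c are the trainable parameters. After training, the model IS a
# symbolic formula; no separate regression step (e.g. SINDy) is needed.
coeffs = np.array([0.1, 1.5, 0.0, -1.5])          # illustrative c0..c3

def rhs(x):
    return np.polyval(coeffs[::-1], x)            # polyval wants c3..c0

def integrate(x0, T=5.0, steps=5000):
    x, dt = x0, T / steps
    for _ in range(steps):
        x += dt * rhs(x)                          # explicit Euler for brevity
    return x

print(integrate(0.2))   # trajectory of the symbolic model 0.1 + 1.5x - 1.5x^3
```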
arXiv Detail & Related papers (2022-08-09T23:23:37Z)
- Reachability Analysis of a General Class of Neural Ordinary Differential Equations [7.774796410415387]
Continuous deep learning models, referred to as Neural Ordinary Differential Equations (Neural ODEs), have received considerable attention over the last several years.
Despite their burgeoning impact, there is a lack of formal analysis techniques for these systems.
We introduce a novel reachability framework that allows for the formal analysis of their behavior.
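As a caricature of the problem (an assumed interval/box method, not necessarily the set representation used in the paper), one can propagate an axis-aligned box of initial states through the Euler steps of a linear system and obtain an over-approximation of the reachable set:
```python
import numpy as np

A = np.array([[0.0, 1.0], [-1.0, -0.2]])          # toy linear dynamics x' = A x
dt, steps = 0.01, 300
M = np.eye(2) + dt * A                            # one Euler step, a linear map
Mp, Mn = np.clip(M, 0.0, None), np.clip(M, None, 0.0)

lo = np.array([0.9, -0.1])                        # initial set: box [lo, hi]
hi = np.array([1.1, 0.1])
for _ in range(steps):
    # Tight coordinate-wise box image of [lo, hi] under the linear map M.
    lo, hi = Mp @ lo + Mn @ hi, Mp @ hi + Mn @ lo

print(lo, hi)            # box over-approximating the time-3.0 reachable set
```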
arXiv Detail & Related papers (2022-07-13T22:05:52Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs).
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
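The underlying representation is easy to visualize: a trajectory is written as a sum of complex exponentials exp(s_k t), with the poles s_k living in the Laplace domain. The sketch below uses hand-picked poles and residues (Neural Laplace instead learns the Laplace-domain representation and inverts it numerically):
```python
import numpy as np

# A damped oscillation as a sum of complex exponentials: a conjugate pole
# pair s = -0.3 ± 2i with conjugate residues yields a real-valued signal.
poles = np.array([-0.3 + 2.0j, -0.3 - 2.0j])      # illustrative pole locations
residues = np.array([0.5 - 0.1j, 0.5 + 0.1j])     # illustrative weights

def trajectory(t):
    # x(t) = sum_k r_k * exp(s_k * t); real by conjugate symmetry
    return np.real(residues @ np.exp(np.outer(poles, t)))

t = np.linspace(0.0, 10.0, 5)
print(trajectory(t))    # the same poles extrapolate smoothly beyond the data
```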
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Neuromorphic Artificial Intelligence Systems [58.1806704582023]
Modern AI systems, based on von Neumann architecture and classical neural networks, have a number of fundamental limitations in comparison with the brain.
This article discusses such limitations and the ways they can be mitigated.
It presents an overview of currently available neuromorphic AI projects in which these limitations are overcome.
arXiv Detail & Related papers (2022-05-25T20:16:05Z)
- Realization Theory Of Recurrent Neural ODEs Using Polynomial System Embeddings [0.802904964931021]
We show that neural ODE analogs of recurrent (ODE-RNN) and Long Short-Term Memory (ODE-LSTM) networks can be algorithmically embedded into the class of polynomial systems.
This embedding preserves input-output behavior and can be extended to other neural DE architectures.
We then use realization theory of polynomial systems to provide necessary conditions for an input-output map to be realizable by an ODE-LSTM, and sufficient conditions for minimality of such systems.
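The mechanism behind such embeddings can be shown on a toy one-dimensional case of our choosing (not the paper's construction): because tanh satisfies tanh' = 1 - tanh^2, augmenting the state with the activation value turns a tanh neural ODE into a polynomial system with identical input-output behavior.
```python
import numpy as np

a, x0, dt, steps = 1.3, 0.4, 1e-3, 2000

# Original, non-polynomial system: x' = tanh(a*x).
x = x0
for _ in range(steps):
    x += dt * np.tanh(a * x)

# Polynomial embedding: with z = tanh(a*x), tanh' = 1 - tanh^2 gives
#   x' = z,   z' = a * (1 - z^2) * x' = a * z * (1 - z^2),
# a polynomial system, started from the matched state z(0) = tanh(a*x0).
u, z = x0, np.tanh(a * x0)
for _ in range(steps):
    u, z = u + dt * z, z + dt * a * z * (1.0 - z**2)

print(x, u)             # the two trajectories agree up to discretization error
```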
arXiv Detail & Related papers (2022-05-24T11:36:18Z)
- Universal approximation property of invertible neural networks [76.95927093274392]
Invertible neural networks (INNs) are neural network architectures with invertibility by design.
Thanks to their invertibility and the tractability of their Jacobians, INNs have various machine learning applications such as probabilistic modeling, generative modeling, and representation learning.
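A standard building block with both properties is the affine coupling layer (RealNVP-style). The sketch below, with toy scale and shift networks of our choosing, shows why the inverse is available in closed form and the log-determinant of the Jacobian is cheap:
```python
import numpy as np

rng = np.random.default_rng(1)
W_s = rng.normal(size=(2, 2)) * 0.1               # toy scale network weights
W_t = rng.normal(size=(2, 2)) * 0.1               # toy shift network weights

def s(x1): return W_s @ np.tanh(x1)
def t(x1): return W_t @ np.tanh(x1)

def forward(x):
    # Affine coupling: x1 passes through; x2 is scaled/shifted via x1 only.
    x1, x2 = x[:2], x[2:]
    return np.concatenate([x1, x2 * np.exp(s(x1)) + t(x1)])

def inverse(y):
    # Closed-form inverse; log|det J| = sum(s(x1)) falls out of the same terms.
    y1, y2 = y[:2], y[2:]
    return np.concatenate([y1, (y2 - t(y1)) * np.exp(-s(y1))])

x = rng.normal(size=4)
print(np.allclose(inverse(forward(x)), x))        # True: invertible by design
```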
arXiv Detail & Related papers (2022-04-15T10:45:26Z)
- On Second Order Behaviour in Augmented Neural ODEs [69.8070643951126]
We consider Second Order Neural ODEs (SONODEs).
We show how the adjoint sensitivity method can be extended to SONODEs.
We extend the theoretical understanding of the broader class of Augmented NODEs (ANODEs).
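The reduction underlying SONODEs is the classical order-lowering trick: any second-order ODE x'' = f(x, x') is a first-order system on the augmented state (x, v). A minimal sketch with a toy damped-oscillator acceleration field (illustrative, not the paper's experiments):
```python
import numpy as np

def f(x, v):
    return -x - 0.1 * v            # toy "learned" acceleration x'' = f(x, x')

def sonode_flow(x0, v0, T=10.0, steps=10000):
    """Integrate x'' = f(x, x') as the augmented first-order system
        x' = v,   v' = f(x, v),
    which is exactly how second-order dynamics fit the (A)NODE framework."""
    x, v, dt = x0, v0, T / steps
    for _ in range(steps):
        x, v = x + dt * v, v + dt * f(x, v)
    return x, v

print(sonode_flow(1.0, 0.0))       # damped oscillation after time T
```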
arXiv Detail & Related papers (2020-06-12T14:25:31Z)