Learning ODEs via Diffeomorphisms for Fast and Robust Integration
- URL: http://arxiv.org/abs/2107.01650v1
- Date: Sun, 4 Jul 2021 14:32:16 GMT
- Title: Learning ODEs via Diffeomorphisms for Fast and Robust Integration
- Authors: Weiming Zhi, Tin Lai, Lionel Ott, Edwin V. Bonilla, Fabio Ramos
- Abstract summary: Differentiable solvers are central for learning Neural ODEs.
We propose an alternative approach to learning ODEs from data.
We observe improvements of up to two orders of magnitude when integrating learned ODEs with GPU computation.
- Score: 40.52862415144424
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Advances in differentiable numerical integrators have enabled the use of
gradient descent techniques to learn ordinary differential equations (ODEs). In
the context of machine learning, differentiable solvers are central for Neural
ODEs (NODEs), a class of deep learning models with continuous depth, rather
than discrete layers. However, these integrators can be unsatisfactorily slow
and inaccurate when learning systems of ODEs from long sequences, or when
solutions of the system vary at widely different timescales in each dimension.
In this paper we propose an alternative approach to learning ODEs from data: we
represent the underlying ODE as a vector field that is related to another base
vector field by a differentiable bijection, modelled by an invertible neural
network. By restricting the base ODE to be amenable to integration, we can
drastically speed up and improve the robustness of integration. We demonstrate
the efficacy of our method in training and evaluating continuous neural
network models, as well as in learning benchmark ODE systems. We observe
improvements of up to two orders of magnitude when integrating learned ODEs
with GPU computation.
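The core construction can be sketched in a few lines. Below is a minimal, hedged illustration assuming a linear base ODE z' = A z (whose flow exp(tA) is available in closed form) and a single RealNVP-style affine coupling layer as the invertible network; the paper's actual choice of base field and invertible architecture may differ.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style affine coupling layer: a simple invertible network g."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, z):  # x = g(z)
        z1, z2 = z[..., :self.d], z[..., self.d:]
        s, t = self.net(z1).chunk(2, dim=-1)
        return torch.cat([z1, z2 * torch.exp(s) + t], dim=-1)

    def inverse(self, x):  # z = g^{-1}(x)
        x1, x2 = x[..., :self.d], x[..., self.d:]
        s, t = self.net(x1).chunk(2, dim=-1)
        return torch.cat([x1, (x2 - t) * torch.exp(-s)], dim=-1)

class DiffeoODE(nn.Module):
    """Trajectories of the learned ODE are images under g of the base flow
    z(t) = exp(tA) z(0) of a linear ODE z' = A z, available in closed form."""
    def __init__(self, dim):
        super().__init__()
        self.g = AffineCoupling(dim)
        self.A = nn.Parameter(0.01 * torch.randn(dim, dim))

    def trajectory(self, x0, ts):
        z0 = self.g.inverse(x0)  # pull the initial state to base coordinates
        zs = torch.stack([torch.matrix_exp(t * self.A) @ z0 for t in ts])
        return self.g(zs)        # push the base solutions forward through g

model = DiffeoODE(dim=4)
xs = model.trajectory(torch.randn(4), torch.linspace(0.0, 1.0, 50))  # (50, 4)
```

Training would then fit g and A to observed trajectories by gradient descent; no sequential numerical solver appears in the loop, which is where the reported speedups would come from.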
Related papers
- Faster Training of Neural ODEs Using Gauß-Legendre Quadrature [68.9206193762751]
We propose an alternative way to speed up the training of neural ODEs.
We use Gauss-Legendre quadrature to solve integrals faster than ODE-based methods.
We also extend the idea to training SDEs using the Wong-Zakai theorem, by training a corresponding ODE and transferring the parameters.
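As a point of reference, the quadrature rule the title refers to is standard; a minimal sketch, independent of the paper's training pipeline:

```python
import numpy as np

def gauss_legendre_integral(f, a, b, n=16):
    """n-point Gauss-Legendre approximation of the integral of f over [a, b];
    exact for polynomials of degree up to 2n - 1."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)  # map nodes from [-1, 1] to [a, b]
    return 0.5 * (b - a) * np.sum(weights * f(x))

# All n evaluations of f are independent and can be batched, unlike the
# inherently sequential steps of an ODE solver.
print(gauss_legendre_integral(np.sin, 0.0, np.pi))  # ~2.0
```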
arXiv Detail & Related papers (2023-08-21T11:31:15Z)
- Experimental study of Neural ODE training with adaptive solver for dynamical systems modeling [72.84259710412293]
So-called adaptive ODE solvers can adapt their evaluation strategy to the complexity of the problem at hand.
This paper describes a simple set of experiments showing why adaptive solvers cannot be seamlessly leveraged as a black box for dynamical systems modelling.
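For concreteness, a minimal sketch of what "adaptive" means here, using a hypothetical Euler/Heun embedded pair and tolerance rather than anything from the paper's experiments:

```python
import numpy as np

def adaptive_heun(f, x, t, t_end, h=0.1, tol=1e-5):
    """Heun's method with embedded Euler as a lower-order error estimate;
    the step size h is adapted to keep the local error near tol."""
    while t < t_end:
        h = min(h, t_end - t)
        k1 = f(t, x)
        k2 = f(t + h, x + h * k1)
        x_low = x + h * k1                # order-1 (Euler) estimate
        x_high = x + 0.5 * h * (k1 + k2)  # order-2 (Heun) estimate
        err = np.max(np.abs(x_high - x_low))
        if err <= tol:                    # accept the step
            t, x = t + h, x_high
        h *= min(2.0, max(0.2, 0.9 * np.sqrt(tol / (err + 1e-16))))
    return x

# Stiff dynamics force thousands of tiny steps: the hidden, input-dependent
# cost that makes adaptive solvers awkward as a black box during training.
print(adaptive_heun(lambda t, x: -50.0 * x, np.array([1.0]), 0.0, 1.0))
```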
arXiv Detail & Related papers (2022-11-13T17:48:04Z)
- On Numerical Integration in Neural Ordinary Differential Equations [0.0]
We propose the inverse modified differential equations (IMDE) to clarify the influence of numerical integration on training Neural ODE models.
It is shown that training a Neural ODE model actually returns a close approximation of the IMDE, rather than the true ODE.
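The flavour of this result can be reproduced with classical backward error analysis. The worked example below uses only the first-order truncation of the modified equation for explicit Euler on a scalar linear ODE; the paper's IMDE is a more general construction:

```python
import numpy as np

# True dynamics x' = lam * x; the data give the exact one-step flow
# x -> exp(lam * h) * x.
lam, h = -2.0, 0.1

# "Training" a one-parameter Euler model x -> x + h * mu * x to match the
# data exactly means 1 + h * mu = exp(lam * h):
mu = (np.exp(lam * h) - 1.0) / h

print(mu)                        # -1.8127: the learned coefficient
print(lam + 0.5 * h * lam ** 2)  # -1.8: first-order modified-equation prediction
print(lam)                       # -2.0: the true ODE is *not* recovered
```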
arXiv Detail & Related papers (2022-06-15T07:39:01Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs), including ordinary, delay, and integro-differential equations.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
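A minimal sketch of the representation this relies on, with hypothetical poles and residues rather than anything learned by Neural Laplace:

```python
import numpy as np

# In the Laplace domain, a trajectory with poles s_i and residues c_i is
# F(s) = sum_i c_i / (s - s_i); in the time domain it is the sum of
# complex exponentials x(t) = Re(sum_i c_i * exp(s_i * t)).
poles = np.array([-0.1 + 2.0j, -0.1 - 2.0j])  # a damped oscillation
residues = np.array([0.5 + 0.0j, 0.5 + 0.0j])

t = np.linspace(0.0, 10.0, 200)
x = np.real(np.exp(np.outer(t, poles)) @ residues)

# Matches exp(-0.1 t) cos(2 t); a step input would just add a pole at s = 0
# and a delay a factor exp(-s * tau), which is why history effects and
# discontinuities are natural in this domain.
print(np.allclose(x, np.exp(-0.1 * t) * np.cos(2.0 * t)))  # True
```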
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- A memory-efficient neural ODE framework based on high-level adjoint differentiation [4.063868707697316]
We present a new neural ODE framework, PNODE, based on high-level discrete algorithmic differentiation.
We show that PNODE achieves the highest memory efficiency when compared with other reverse-accurate methods.
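PNODE itself builds on discrete adjoints with checkpointing; the sketch below only illustrates that store-versus-recompute trade-off using PyTorch's generic activation checkpointing, not PNODE's actual implementation:

```python
import torch
from torch.utils.checkpoint import checkpoint

def euler_step(f, x, h):
    return x + h * f(x)

def integrate(f, x, h, n, use_checkpoint=False):
    """Unrolled Euler integration. With checkpointing, intermediate states
    are recomputed during backprop instead of stored: the same
    (reverse-accurate) gradients at lower memory but higher compute."""
    for _ in range(n):
        if use_checkpoint:
            x = checkpoint(lambda y: euler_step(f, y, h), x, use_reentrant=False)
        else:
            x = euler_step(f, x, h)
    return x

f = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 2))
x0 = torch.randn(1, 2, requires_grad=True)
loss = integrate(f, x0, 0.01, 100, use_checkpoint=True).pow(2).sum()
loss.backward()  # exact discrete gradient without storing all 100 states
```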
arXiv Detail & Related papers (2022-06-02T20:46:26Z)
- Proximal Implicit ODE Solvers for Accelerating Learning Neural ODEs [16.516974867571175]
This paper considers learning neural ODEs with implicit ODE solvers of different orders that leverage proximal operators.
The proximal implicit solver guarantees superiority over explicit solvers in numerical stability and computational efficiency.
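A minimal sketch of a proximal implicit step, restricted to a gradient flow where backward Euler is exactly a proximal map; the paper's solvers are more general:

```python
import numpy as np

def prox_implicit_euler(grad_V, x, h, n_steps, inner=50, lr=1e-2):
    """Backward Euler for the gradient flow x' = -grad_V(x). Each step is
    the proximal map  x_{k+1} = argmin_y V(y) + ||y - x_k||^2 / (2h),
    solved here by plain gradient iterations on the proximal objective."""
    for _ in range(n_steps):
        y = x.copy()
        for _ in range(inner):  # lr must be small relative to the stiffness
            y = y - lr * (grad_V(y) + (y - x) / h)
        x = y
    return x

# V(x) = 50 x^2 is stiff: explicit Euler with h = 0.1 diverges
# (|1 - 100 h| = 9), while the implicit step contracts by 1/11 per step.
print(prox_implicit_euler(lambda x: 100.0 * x, np.array([1.0]), 0.1, 10))
```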
arXiv Detail & Related papers (2022-04-19T02:55:10Z)
- Accelerating Neural ODEs Using Model Order Reduction [0.0]
We show that mathematical model order reduction methods can be used for compressing and accelerating Neural ODEs.
We implement our novel compression method by developing Neural ODEs that integrate the necessary subspace-projection and interpolation operations as layers of the neural network.
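A minimal sketch of the projection idea (POD/Galerkin reduction on a linear system); how the paper embeds these operations as network layers is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 100, 5

# A full-order linear system x' = A x whose dynamics live in an
# r-dimensional subspace (a stand-in for trained Neural ODE dynamics).
basis = np.linalg.qr(rng.standard_normal((n, r)))[0]
A = basis @ (-np.diag(np.arange(1.0, r + 1.0))) @ basis.T

# POD: the leading left singular vectors of a snapshot matrix span the
# projection subspace.
snapshots = basis @ rng.standard_normal((r, 200))
U = np.linalg.svd(snapshots, full_matrices=False)[0][:, :r]
A_r = U.T @ A @ U  # r x r reduced operator

# One explicit Euler step: full order vs. reduce, step, lift.
x0 = basis @ rng.standard_normal(r)
h = 0.01
x_full = x0 + h * (A @ x0)
x_red = U @ (U.T @ x0 + h * (A_r @ (U.T @ x0)))
print(np.max(np.abs(x_full - x_red)))  # ~1e-15, with 5x5 instead of 100x100 work
```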
arXiv Detail & Related papers (2021-05-28T19:27:09Z)
- Meta-Solver for Neural Ordinary Differential Equations [77.8918415523446]
We investigate how variability in the space of solvers can improve the performance of neural ODEs.
We show that the right choice of solver parameterization can significantly affect the robustness of neural ODE models to adversarial attacks.
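For intuition, the classical one-parameter family of explicit second-order Runge-Kutta methods is exactly the kind of solver parameterization that can be searched over; alpha below is the free parameter, with midpoint and Heun as special cases:

```python
import numpy as np

def rk2_step(f, x, t, h, alpha):
    """One step of the one-parameter family of explicit second-order
    Runge-Kutta methods: alpha = 0.5 is midpoint, alpha = 1.0 is Heun."""
    k1 = f(t, x)
    k2 = f(t + alpha * h, x + alpha * h * k1)
    b2 = 1.0 / (2.0 * alpha)
    return x + h * ((1.0 - b2) * k1 + b2 * k2)

# Every alpha gives the same order of accuracy, but stability and
# robustness differ; a meta-solver treats such parameters as tunable.
for alpha in (0.5, 1.0):
    x, t, h = np.array([1.0]), 0.0, 0.1
    for _ in range(10):
        x, t = rk2_step(lambda t, y: -y, x, t, h, alpha), t + h
    print(alpha, x)  # both near exp(-1) = 0.3679
```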
arXiv Detail & Related papers (2021-03-15T17:26:34Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suited to neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)