Neural Laplace: Learning diverse classes of differential equations in
the Laplace domain
- URL: http://arxiv.org/abs/2206.04843v3
- Date: Tue, 14 Jun 2022 09:48:45 GMT
- Title: Neural Laplace: Learning diverse classes of differential equations in
the Laplace domain
- Authors: Samuel Holt, Zhaozhi Qian, Mihaela van der Schaar
- Abstract summary: We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
- Score: 86.52703093858631
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Ordinary Differential Equations model dynamical systems with ODEs
learned by neural networks. However, ODEs are fundamentally inadequate to model
systems with long-range dependencies or discontinuities, which are common in
engineering and biological systems. Broader classes of differential equations
(DE) have been proposed as remedies, including delay differential equations and
integro-differential equations. Furthermore, Neural ODE suffers from numerical
instability when modelling stiff ODEs and ODEs with piecewise forcing
functions. In this work, we propose Neural Laplace, a unified framework for
learning diverse classes of DEs including all the aforementioned ones. Instead
of modelling the dynamics in the time domain, we model it in the Laplace
domain, where the history-dependencies and discontinuities in time can be
represented as summations of complex exponentials. To make learning more
efficient, we use the geometrical stereographic map of a Riemann sphere to
induce more smoothness in the Laplace domain. In the experiments, Neural
Laplace shows superior performance in modelling and extrapolating the
trajectories of diverse classes of DEs, including the ones with complex history
dependency and abrupt changes.
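To make the Laplace-domain idea concrete: the Laplace transform F(s) = \int_0^\infty f(t) e^{-st} dt maps a trajectory f(t) to a function of a complex variable s, and for rational F(s) with simple poles p_k and residues r_k the inverse transform is exactly a sum of complex exponentials, f(t) = \sum_k r_k e^{p_k t}. The minimal sketch below is ours, not the authors' implementation; it illustrates this pole-residue view together with a stereographic map of s onto Riemann-sphere angles, the kind of bounded reparameterization the abstract alludes to.

```python
import numpy as np

def stereographic_coords(s):
    """Map complex Laplace coordinates s onto Riemann-sphere angles
    (theta, phi) via the inverse stereographic projection.  Bounded
    coordinates like these are one way to make the Laplace domain
    smoother for a network to learn (illustrative, not the paper's
    exact parameterization)."""
    r2 = np.abs(s) ** 2
    theta = np.angle(s)                    # longitude
    phi = np.arcsin((r2 - 1) / (r2 + 1))   # latitude
    return theta, phi

def trajectory_from_poles(residues, poles, t):
    """Inverse Laplace transform of a rational F(s) with simple poles:
    f(t) = sum_k r_k * exp(p_k * t).  History effects and abrupt
    changes in time become sums of complex exponentials."""
    t = np.asarray(t)[:, None]
    return (residues[None, :] * np.exp(poles[None, :] * t)).sum(-1).real

# Example: poles at -0.1 +/- 2j with residues 0.5 give the damped
# oscillation f(t) = exp(-0.1 t) * cos(2 t).
poles = np.array([-0.1 + 2j, -0.1 - 2j])
residues = np.array([0.5 + 0j, 0.5 + 0j])
t = np.linspace(0.0, 10.0, 200)
f = trajectory_from_poles(residues, poles, t)
```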
Related papers
- Neural Laplace for learning Stochastic Differential Equations [0.0]
Neural Laplace is a unified framework for learning diverse classes of differential equations (DEs).
Across different classes of DEs, the framework outperforms other neural-network approaches that are restricted to learning ordinary differential equations (ODEs).
arXiv Detail & Related papers (2024-06-07T14:29:30Z)
- Neural Fractional Differential Equations [2.812395851874055]
Fractional Differential Equations (FDEs) are essential tools for modelling complex systems in science and engineering.
We propose the Neural FDE, a novel deep neural network architecture that adjusts an FDE to the dynamics of the data.
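For background (our addition; the summary above does not specify which definition the paper uses), the fractional derivative most commonly used in such models is the Caputo derivative, which for order 0 < alpha < 1 is

```latex
{}^{C}\!D^{\alpha}_{t} f(t)
  = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}} \, d\tau ,
  \qquad 0 < \alpha < 1 .
```

The kernel (t - \tau)^{-\alpha} weights the entire history of f, which is what makes FDEs natural models for systems with memory.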
arXiv Detail & Related papers (2024-03-05T07:45:29Z)
- On Numerical Integration in Neural Ordinary Differential Equations [0.0]
We propose the inverse modified differential equations (IMDE) to clarify the influence of numerical integration on training Neural ODE models.
It is shown that training a Neural ODE model actually returns a close approximation of the IMDE, rather than the true ODE.
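As a concrete first-order illustration (our worked example, following standard backward error analysis rather than the paper's derivation): if y' = f(y) is fit through explicit Euler steps of size h, the vector field that best reproduces the data is not f but approximately

```latex
f_h(y) = f(y) + \frac{h}{2}\, f'(y)\, f(y) + \mathcal{O}(h^{2}) ,
```

so the learned network absorbs the integrator's truncation error. This is the sense in which training returns an approximation of the IMDE rather than the true ODE.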
arXiv Detail & Related papers (2022-06-15T07:39:01Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
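To illustrate the DCT-encoding idea (a minimal SciPy sketch of ours, not the paper's code): keeping only the lowest DCT coefficients gives a compact spectral encoding of a smooth spatial field.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_encode(field, modes=16):
    """Keep only the lowest modes x modes DCT coefficients of a 2D
    field: a compact spectral encoding of its spatial structure."""
    coeffs = dctn(field, norm="ortho")
    return coeffs[:modes, :modes]

def dct_decode(coeffs, shape):
    """Zero-pad the truncated coefficients and invert the transform."""
    full = np.zeros(shape)
    m, n = coeffs.shape
    full[:m, :n] = coeffs
    return idctn(full, norm="ortho")

# Example: a smooth field survives aggressive truncation.
x = np.linspace(0.0, 1.0, 64)
field = np.sin(2 * np.pi * x)[:, None] * np.cos(2 * np.pi * x)[None, :]
recon = dct_decode(dct_encode(field, modes=8), field.shape)
print(np.abs(field - recon).max())  # small reconstruction error
```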
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
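The claim that message passing representationally contains finite differences can be seen on a 1D chain graph (a minimal sketch of ours, not the paper's solver): with the fixed message m_{j->i} = (u_j - u_i) / dx^2 summed over a node's two neighbors, one aggregation step reproduces the centered discrete Laplacian.

```python
import numpy as np

def message_passing_step(u, dx):
    """One round of message passing on a 1D chain graph where the
    message from neighbor j to node i is (u_j - u_i) / dx**2.
    Aggregating the two neighbor messages reproduces the standard
    centered finite-difference Laplacian (interior nodes only)."""
    left = (u[:-2] - u[1:-1]) / dx**2    # message from left neighbor
    right = (u[2:] - u[1:-1]) / dx**2    # message from right neighbor
    return left + right  # == (u[i-1] - 2*u[i] + u[i+1]) / dx**2
```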
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- On Neural Differential Equations [13.503274710499971]
In particular, neural differential equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin.
NDEs are suitable for tackling generative problems, dynamical systems, and time series.
NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of available theory on both sides.
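For readers new to the area, a minimal neural ODE looks like the sketch below. It assumes the third-party torchdiffeq package, whose odeint integrates dy/dt = f_theta(t, y) differentiably; this is a generic illustration, not code from the survey.

```python
import torch
from torchdiffeq import odeint  # pip install torchdiffeq (assumed available)

class ODEFunc(torch.nn.Module):
    """Neural network parameterizing the vector field dy/dt = f_theta(t, y)."""
    def __init__(self, dim=2, hidden=32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, dim))

    def forward(self, t, y):
        return self.net(y)

func = ODEFunc()
y0 = torch.randn(2)
t = torch.linspace(0.0, 1.0, 20)
traj = odeint(func, y0, t)  # differentiable trajectory, shape (20, 2)
```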
arXiv Detail & Related papers (2022-02-04T23:32:29Z)
- Stiff Neural Ordinary Differential Equations [0.0]
We first show the challenges of learning a neural ODE on the classical stiff ODE system of Robertson's problem.
We then present successful demonstrations in stiff systems of Robertson's problem and an air pollution problem.
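For reference, Robertson's problem is the following three-species kinetics system, whose rate constants span nine orders of magnitude; the SciPy sketch below (ours, for context) shows why it is a standard stiffness benchmark: an implicit solver such as Radau handles it, while explicit methods would need prohibitively small steps.

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson's chemical kinetics problem, a classic stiff ODE."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1e4 * y2 * y3,
             0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2**2,
             3e7 * y2**2]

# Radau (implicit) copes with the stiffness; RK45 (explicit) stalls.
sol = solve_ivp(robertson, (0.0, 1e5), [1.0, 0.0, 0.0],
                method="Radau", rtol=1e-6, atol=1e-8)
print(sol.y[:, -1])  # final concentrations of the three species
```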
arXiv Detail & Related papers (2021-03-29T05:24:56Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
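A minimal sketch of the core mechanism, a spectral convolution that keeps only low Fourier modes and multiplies them by learned complex weights (our simplified 1D reading, not the authors' implementation):

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Toy 1D spectral convolution: FFT, multiply the lowest Fourier
    modes by learned complex weights, inverse FFT -- a simplified
    instance of parameterizing the integral kernel in Fourier space."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = torch.nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):                 # x: (batch, channels, n)
        x_ft = torch.fft.rfft(x)          # (batch, channels, n//2 + 1)
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))
```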
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Stochasticity in Neural ODEs: An Empirical Study [68.8204255655161]
Regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
We show that data augmentation during training improves the performance of both deterministic and stochastic versions of the same model.
However, the improvements obtained with data augmentation completely absorb the empirical regularization gains, making the difference in performance between neural ODE and neural SDE negligible.
arXiv Detail & Related papers (2020-02-22T22:12:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.