On Neural Differential Equations
- URL: http://arxiv.org/abs/2202.02435v1
- Date: Fri, 4 Feb 2022 23:32:29 GMT
- Title: On Neural Differential Equations
- Authors: Patrick Kidger
- Abstract summary: In particular, neural differential equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin.
NDEs are suitable for tackling generative problems, dynamical systems, and time series.
NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of available theory on both sides.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The conjoining of dynamical systems and deep learning has become a topic of
great interest. In particular, neural differential equations (NDEs) demonstrate
that neural networks and differential equations are two sides of the same coin.
Traditional parameterised differential equations are a special case. Many
popular neural network architectures, such as residual networks and recurrent
networks, are discretisations of differential equations.
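The residual-network case can be made concrete: a ResNet update y ← y + step · f(y) is exactly the explicit Euler discretisation of dy/dt = f(y). A minimal sketch (the one-layer tanh vector field and its parameters are illustrative assumptions, not taken from the thesis):

```python
import numpy as np

def f(y, theta):
    """A small 'neural' vector field: a single tanh layer.
    (Hypothetical example parameters, not from the thesis.)"""
    W, b = theta
    return np.tanh(W @ y + b)

def residual_network(y0, theta, num_layers, step):
    """Residual network: y_{k+1} = y_k + step * f(y_k, theta).
    This is the explicit Euler scheme for dy/dt = f(y, theta)
    with step size `step`."""
    y = y0
    for _ in range(num_layers):
        y = y + step * f(y, theta)
    return y

rng = np.random.default_rng(0)
theta = (rng.normal(size=(3, 3)) * 0.1, np.zeros(3))
y0 = np.ones(3)

# More layers with a smaller step integrate the same ODE to t = 1
# with smaller Euler error, so the two outputs nearly agree.
coarse = residual_network(y0, theta, num_layers=10, step=0.1)
fine = residual_network(y0, theta, num_layers=1000, step=0.001)
print(coarse, fine)
```

In this view, network depth plays the role of integration time, and taking the layer count to infinity (with step size to zero) recovers a neural ODE.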
NDEs are suitable for tackling generative problems, dynamical systems, and
time series (particularly in physics, finance, ...) and are thus of interest to
both modern machine learning and traditional mathematical modelling. NDEs offer
high-capacity function approximation, strong priors on model space, the ability
to handle irregular data, memory efficiency, and a wealth of available theory
on both sides.
This doctoral thesis provides an in-depth survey of the field.
Topics include: neural ordinary differential equations (e.g. for hybrid
neural/mechanistic modelling of physical systems); neural controlled
differential equations (e.g. for learning functions of irregular time series);
and neural stochastic differential equations (e.g. to produce generative models
capable of representing complex stochastic dynamics, or sampling from complex
high-dimensional distributions).
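As a rough illustration of the neural SDE idea, one can draw samples by integrating the SDE with the Euler-Maruyama scheme, using a small network as the drift. A hedged sketch (the one-layer tanh drift and the constant diffusion coefficient are simplifying assumptions; in a real neural SDE both drift and diffusion are learned networks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 'neural' drift: one tanh layer with random weights.
W = rng.normal(size=(2, 2)) * 0.5
b = np.zeros(2)
def drift(y):
    # Vectorised over a batch of states with shape (n_paths, 2).
    return np.tanh(y @ W.T + b)

SIGMA = 0.2  # constant diffusion coefficient (assumed, not learned)

def sample_paths(n_paths, n_steps=100, dt=0.01):
    """Euler-Maruyama: y_{k+1} = y_k + drift(y_k) dt + SIGMA dW_k,
    with Brownian increments dW_k ~ Normal(0, dt). Each path is one
    sample from the model's distribution over trajectories."""
    y = np.zeros((n_paths, 2))
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt), size=y.shape)
        y = y + drift(y) * dt + SIGMA * dW
    return y

samples = sample_paths(n_paths=500)
print(samples.mean(axis=0), samples.std(axis=0))
```

Training such a model (e.g. as an SDE-GAN or via a variational objective) then amounts to fitting the drift and diffusion networks so that these sampled trajectories match the data distribution.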
Further topics include: numerical methods for NDEs (e.g. reversible
differential equations solvers, backpropagation through differential equations,
Brownian reconstruction); symbolic regression for dynamical systems (e.g. via
regularised evolution); and deep implicit models (e.g. deep equilibrium models,
differentiable optimisation).
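Backpropagation through a differential equation solver can be illustrated in its simplest "discretise-then-optimise" form: unroll an explicit Euler solve and apply the chain rule backwards through the stored steps. A toy sketch on the scalar ODE dy/dt = θy, chosen so the exact discrete gradient is available in closed form for comparison (this is an illustrative example, not the thesis's adjoint method):

```python
import numpy as np

def euler_forward(theta, y0, n_steps, dt):
    """Explicit Euler on dy/dt = theta * y, storing every state."""
    ys = [y0]
    for _ in range(n_steps):
        ys.append(ys[-1] + dt * theta * ys[-1])
    return ys

def backprop_through_solve(theta, y0, n_steps, dt):
    """Reverse-mode differentiation through the unrolled solve,
    for the loss L = y(T). The adjoint a_k = dL/dy_k is propagated
    backwards through each Euler step y_{k+1} = y_k * (1 + dt*theta)."""
    ys = euler_forward(theta, y0, n_steps, dt)
    a = 1.0          # dL/dy_N
    grad_theta = 0.0
    for k in reversed(range(n_steps)):
        grad_theta += a * dt * ys[k]   # d y_{k+1} / d theta = dt * y_k
        a = a * (1.0 + dt * theta)     # d y_{k+1} / d y_k
    return ys[-1], grad_theta

theta, y0, n, dt = 0.5, 2.0, 100, 0.01
yT, g = backprop_through_solve(theta, y0, n, dt)

# Closed form for the discrete solve: y_N = y0 * (1 + dt*theta)^N,
# so dy_N/dtheta = y0 * N * dt * (1 + dt*theta)^(N-1).
exact = y0 * n * dt * (1 + dt * theta) ** (n - 1)
print(yT, g, exact)
```

Continuous-adjoint methods instead integrate a second ODE backwards in time to obtain the same gradients without storing every forward step, trading memory for extra computation; reversible solvers push this trade-off further.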
We anticipate this thesis will be of interest to anyone working at the
marriage of deep learning with dynamical systems, and hope it will provide a
useful reference for the current state of the art.
Related papers
- Neural Fractional Differential Equations [2.812395851874055]
Fractional Differential Equations (FDEs) are essential tools for modelling complex systems in science and engineering.
We propose the Neural FDE, a novel deep neural network architecture that adjusts a FDE to the dynamics of data.
arXiv Detail & Related papers (2024-03-05T07:45:29Z)
- Fourier Neural Differential Equations for learning Quantum Field Theories [57.11316818360655]
A Quantum Field Theory is defined by its interaction Hamiltonian, and linked to experimental data by the scattering matrix.
In this paper, NDE models are used to learn φ⁴ theory, Scalar-Yukawa theory and Scalar Quantum Electrodynamics.
The interaction Hamiltonian of a theory can be extracted from network parameters.
arXiv Detail & Related papers (2023-11-28T22:11:15Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- On the balance between the training time and interpretability of neural ODE for time series modelling [77.34726150561087]
The paper shows that modern neural ODEs cannot be reduced to simpler models for time-series modelling applications.
The complexity of neural ODEs is comparable to or exceeds that of conventional time-series modelling tools.
We propose a new view on time-series modelling that combines neural networks with an ODE system.
arXiv Detail & Related papers (2022-06-07T13:49:40Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to model the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- Climate Modeling with Neural Diffusion Equations [3.8521112392276]
We design a novel climate model based on the neural ordinary differential equation (NODE) and the diffusion equation.
Our method consistently outperforms existing baselines by non-trivial margins.
arXiv Detail & Related papers (2021-11-11T01:48:46Z)
- HyperPINN: Learning parameterized differential equations with physics-informed hypernetworks [32.095262903584725]
We propose the HyperPINN, which uses hypernetworks to learn to generate neural networks that can solve a differential equation from a given parameterization.
We demonstrate with experiments on both a PDE and an ODE that this type of model can lead to neural network solutions to differential equations that maintain a small size.
arXiv Detail & Related papers (2021-10-28T17:50:09Z)
- Partial Differential Equations is All You Need for Generating Neural Architectures -- A Theory for Physical Artificial Intelligence Systems [40.20472268839781]
We generalize the reaction-diffusion equation in statistical physics, the Schrödinger equation in quantum mechanics, and the Helmholtz equation in paraxial optics.
We use the finite difference method to discretize the NPDE and obtain numerical solutions.
Basic building blocks of deep neural network architectures, including multi-layer perceptrons, convolutional neural networks and recurrent neural networks, are generated.
arXiv Detail & Related papers (2021-03-10T00:05:46Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations: physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.