Neural Integral Equations
- URL: http://arxiv.org/abs/2209.15190v4
- Date: Thu, 18 May 2023 22:45:20 GMT
- Title: Neural Integral Equations
- Authors: Emanuele Zappala, Antonio Henrique de Oliveira Fonseca, Josue Ortega
Caro and David van Dijk
- Abstract summary: We introduce Neural Integral Equations (NIE), a method that learns an unknown integral operator from data through an IE solver.
We also introduce Attentional Neural Integral Equations (ANIE), where the integral is replaced by self-attention, which improves scalability, capacity, and results in an interpretable model.
- Score: 2.485182034310304
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Integral equations (IEs) are equations that model spatiotemporal systems with
non-local interactions. They have found important applications throughout
theoretical and applied sciences, including in physics, chemistry, biology, and
engineering. While efficient algorithms exist for solving given IEs, no method
exists that can learn an IE and its associated dynamics from data alone. In
this paper, we introduce Neural Integral Equations (NIE), a method that learns
an unknown integral operator from data through an IE solver. We also introduce
Attentional Neural Integral Equations (ANIE), where the integral is replaced by
self-attention, which improves scalability, capacity, and results in an
interpretable model. We demonstrate that (A)NIE outperforms other methods in
both speed and accuracy on several benchmark tasks in ODE, PDE, and IE systems
of synthetic and real-world data.
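The core object in (A)NIE is a learned integral operator inside an IE solver. As a hedged illustration (not the authors' implementation), the abstract's setup can be sketched by solving a Fredholm-type equation y(t) = f(t) + ∫ G(t, s) y(s) ds with Picard (fixed-point) iteration, where G stands in for the learned kernel:

```python
import numpy as np

# Hypothetical sketch of the NIE setup: solve y(t) = f(t) + \int_0^1 G(t,s) y(s) ds
# by fixed-point (Picard) iteration. In (A)NIE the kernel would be a trained
# neural network (or self-attention); here G is a fixed smooth stand-in.

n = 64                                # quadrature points on [0, 1]
t = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / n)               # uniform quadrature weights

f = np.sin(2 * np.pi * t)             # known forcing term

# Stand-in for a learned kernel, scaled so the integral operator is a
# contraction (this guarantees the Picard iteration converges).
G = 0.3 * np.cos(np.pi * np.subtract.outer(t, t))

y = np.zeros(n)                       # initial guess
for _ in range(50):                   # Picard iterations
    y = f + G @ (w * y)               # y <- f + \int G(t,s) y(s) ds

residual = np.max(np.abs(y - (f + G @ (w * y))))
print(residual)                       # ~0 at the fixed point
```

In the actual method, gradients would flow through such a solver loop so the kernel parameters can be fit to data.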
Related papers
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Spectral methods for Neural Integral Equations [0.6993026261767287]
We introduce a framework for neural integral equations based on spectral methods.
We show various theoretical guarantees regarding the approximation capabilities of the model.
We provide numerical experiments to demonstrate the practical effectiveness of the resulting model.
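One reason spectral/quadrature methods suit integral operators is that high-order quadrature evaluates smooth integrands to near machine precision with very few nodes. A small illustration (not the paper's code; the kernel and function are made up):

```python
import numpy as np

# Gauss-Legendre quadrature applied to \int_{-1}^{1} K(t0, s) y(s) ds for a
# smooth kernel: 12 nodes already match a dense brute-force sum very closely.

nodes, weights = np.polynomial.legendre.leggauss(12)  # 12-point rule

K = lambda t, s: np.exp(-(t - s) ** 2)   # smooth illustrative kernel
y = lambda s: np.cos(s)

t0 = 0.3
approx = np.sum(weights * K(t0, nodes) * y(nodes))

# brute-force reference: dense midpoint rule
N = 200_000
ds = 2.0 / N
s = -1.0 + (np.arange(N) + 0.5) * ds
ref = np.sum(K(t0, s) * y(s)) * ds

diff = abs(approx - ref)
print(diff)  # tiny for a smooth integrand
```

This efficiency is what makes discretizing a learned integral operator on spectral nodes attractive.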
arXiv Detail & Related papers (2023-12-09T19:42:36Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
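The noise-regulating effect of self-attention described above can be illustrated with a toy sketch (hypothetical, not the Attr paper's code): attention scores computed from positions make each noisy sample a weighted average of its neighbours, which suppresses zero-mean noise.

```python
import numpy as np

# Toy illustration of attention as a noise regulator: position-based
# attention weights average nearby samples of a noisy trajectory.

rng = np.random.default_rng(1)
n = 200
t = np.linspace(0, 2 * np.pi, n)
clean = np.sin(t)
noisy = clean + 0.1 * rng.standard_normal(n)

# Attention scores from positions: similarity decays with distance,
# then a softmax over keys normalizes each row.
scale = 20.0
scores = -scale * np.subtract.outer(t, t) ** 2
attn = np.exp(scores)
attn /= attn.sum(axis=1, keepdims=True)

smoothed = attn @ noisy                   # attention-weighted value average

err_noisy = np.mean((noisy - clean) ** 2)
err_smoothed = np.mean((smoothed - clean) ** 2)
print(err_noisy, err_smoothed)            # smoothing lowers the error
```

In the paper, this mechanism is added to the numerical solution itself rather than applied as a post-hoc filter.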
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Neural Integro-Differential Equations [2.001149416674759]
Integro-Differential Equations (IDEs) are generalizations of differential equations that comprise both an integral and a differential component.
NIDE is a framework that models the ordinary and integral components of IDEs using neural networks.
We show that NIDE can decompose dynamics into its Markovian and non-Markovian constituents.
arXiv Detail & Related papers (2022-06-28T20:39:35Z)
- D-CIPHER: Discovery of Closed-form Partial Differential Equations [80.46395274587098]
We propose D-CIPHER, which is robust to measurement artifacts and can uncover a new and very general class of differential equations.
We further design a novel optimization procedure, CoLLie, to help D-CIPHER search through this class efficiently.
arXiv Detail & Related papers (2022-06-21T17:59:20Z)
- On Numerical Integration in Neural Ordinary Differential Equations [0.0]
We propose the inverse modified differential equations (IMDE) to clarify the influence of numerical integration on training Neural ODE models.
It is shown that training a Neural ODE model actually returns a close approximation of the IMDE, rather than the true ODE.
arXiv Detail & Related papers (2022-06-15T07:39:01Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
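The claim that time-domain dynamics become sums of complex exponentials in the Laplace domain can be seen in a minimal example (illustrative only, not Neural Laplace's implementation): a damped oscillation corresponds to a single complex pole s = -a + ib.

```python
import numpy as np

# A damped oscillation e^{-at} cos(bt) is exactly the real part of the
# complex exponential e^{st} with Laplace-domain pole s = -a + ib.

a, b = 0.5, 3.0                     # damping rate and frequency
s = complex(-a, b)                  # the corresponding pole

t = np.linspace(0.0, 5.0, 500)
time_domain = np.exp(-a * t) * np.cos(b * t)   # classic time-domain form
from_pole = np.exp(s * t).real                 # same signal from the pole

max_diff = np.max(np.abs(time_domain - from_pole))
print(max_diff)  # identical up to floating-point error
```

Modelling in the Laplace domain thus amounts to learning where such poles (and their weights) sit, which handles history dependence and discontinuities more gracefully than stepping through time.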
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- AutoIP: A United Framework to Integrate Physics into Gaussian Processes [15.108333340471034]
We propose a framework that can integrate all kinds of differential equations into Gaussian processes.
Our method shows improvement upon vanilla GPs in both simulation and several real-world applications.
arXiv Detail & Related papers (2022-02-24T19:02:14Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
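The claim that message passing representationally contains finite differences can be checked directly in a toy setting (illustrative, not the paper's code): on a 1D chain graph, summing messages (u_j - u_i)/dx² from each node's two neighbours reproduces the standard second-order Laplacian stencil.

```python
import numpy as np

# Finite differences as message passing on a 1D chain graph: each interior
# node aggregates messages from its left and right neighbours.

n, dx = 100, 0.1
x = np.arange(n) * dx
u = np.sin(x)

lap = np.zeros(n)
for i in range(1, n - 1):
    for j in (i - 1, i + 1):          # neighbours of node i on the chain
        lap[i] += (u[j] - u[i]) / dx**2   # message from j to i

# For u = sin(x), the Laplacian is -sin(x); the stencil matches it to O(dx^2).
err = np.max(np.abs(lap[1:-1] + np.sin(x[1:-1])))
print(err)
```

A learned message function generalizes the fixed stencil, which is the sense in which the neural solver subsumes classical schemes.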
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Learning ODEs via Diffeomorphisms for Fast and Robust Integration [40.52862415144424]
Differentiable solvers are central for learning Neural ODEs.
We propose an alternative approach to learning ODEs from data.
We observe improvements of up to two orders of magnitude when integrating the learned ODEs.
arXiv Detail & Related papers (2021-07-04T14:32:16Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
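The FNO idea of parameterizing the integral kernel in Fourier space can be sketched in 1D (a hedged stand-in, not the authors' code): a convolution-type integral operator becomes a pointwise product on a few Fourier modes, which is why it is so cheap.

```python
import numpy as np

# Minimal 1D "Fourier layer": FFT the input, multiply the lowest modes by
# learnable complex weights (random here), zero the rest, inverse FFT.

rng = np.random.default_rng(2)
n, modes = 128, 16                   # grid size, number of retained modes

x = np.sin(np.linspace(0, 2 * np.pi, n, endpoint=False))

u_hat = np.fft.rfft(x)                         # to Fourier space
weights = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)

out_hat = np.zeros_like(u_hat)
out_hat[:modes] = u_hat[:modes] * weights      # kernel as pointwise product
y = np.fft.irfft(out_hat, n=n)                 # back to physical space

print(y.shape)  # same grid as the input
```

Stacking such layers with pointwise nonlinearities between them gives the full operator architecture, and the FFT keeps each layer at O(n log n) regardless of the kernel's support.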
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.