Neural Integral Equations
- URL: http://arxiv.org/abs/2209.15190v4
- Date: Thu, 18 May 2023 22:45:20 GMT
- Title: Neural Integral Equations
- Authors: Emanuele Zappala, Antonio Henrique de Oliveira Fonseca, Josue Ortega
Caro and David van Dijk
- Abstract summary: We introduce Neural Integral Equations (NIE), a method that learns an unknown integral operator from data through an IE solver.
We also introduce Attentional Neural Integral Equations (ANIE), where the integral is replaced by self-attention, which improves scalability, capacity, and results in an interpretable model.
- Score: 2.485182034310304
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Integral equations (IEs) are equations that model spatiotemporal systems with
non-local interactions. They have found important applications throughout
theoretical and applied sciences, including in physics, chemistry, biology, and
engineering. While efficient algorithms exist for solving given IEs, no method
exists that can learn an IE and its associated dynamics from data alone. In
this paper, we introduce Neural Integral Equations (NIE), a method that learns
an unknown integral operator from data through an IE solver. We also introduce
Attentional Neural Integral Equations (ANIE), where the integral is replaced by
self-attention, which improves scalability, capacity, and results in an
interpretable model. We demonstrate that (A)NIE outperforms other methods in
both speed and accuracy on several benchmark tasks in ODE, PDE, and IE systems
of synthetic and real-world data.
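The solver loop that (A)NIE wraps a learning procedure around can be pictured as fixed-point iteration on a second-kind integral equation. The sketch below is a minimal, hypothetical illustration: the kernel network, grid, and forcing term are placeholder choices, not the paper's architecture.

```python
import numpy as np

# Hypothetical sketch of the solver loop inside NIE: solve the
# second-kind integral equation
#     u(t) = f(t) + \int_0^1 G_theta(t, s) u(s) ds
# by fixed-point (Picard) iteration on a uniform grid. G_theta is a tiny
# randomly initialized MLP standing in for the learned kernel; in NIE the
# loss of the converged solution would drive updates of theta.
rng = np.random.default_rng(0)
n = 64
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]

# Placeholder kernel network G_theta(t, s): R^2 -> R.
W1 = 0.5 * rng.standard_normal((2, 16))
b1 = np.zeros(16)
W2 = 0.1 * rng.standard_normal((16, 1))

def kernel(ts, ss):
    x = np.stack([ts.ravel(), ss.ravel()], axis=1)
    h = np.tanh(x @ W1 + b1)
    return (h @ W2).reshape(ts.shape)

T, S = np.meshgrid(t, t, indexing="ij")
G = kernel(T, S)                    # discretized kernel matrix
f = np.sin(2 * np.pi * t)           # placeholder forcing term

u = f.copy()
for _ in range(50):                 # Picard iteration: u <- f + (G u) dt
    u_new = f + (G @ u) * dt
    if np.max(np.abs(u_new - u)) < 1e-10:
        u = u_new
        break
    u = u_new

residual = np.max(np.abs(u - (f + (G @ u) * dt)))
```

Because the kernel is small and the interval is short, the iteration is a contraction and converges to a solution of the discretized equation; differentiating through such a solve (or replacing the quadrature with self-attention, as in ANIE) is where learning enters.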
Related papers
- Mathematical artificial data for operator learning [1.4579344926652846]
We present the Mathematical Artificial Data (MAD) framework, a new paradigm that integrates physical laws with data-driven learning to facilitate large-scale operator discovery.
We show MAD's generalizability and superior efficiency and accuracy across various differential equation scenarios.
arXiv Detail & Related papers (2025-07-09T11:23:05Z)
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- PICL: Physics Informed Contrastive Learning for Partial Differential Equations [7.136205674624813]
We develop a novel contrastive pretraining framework that improves neural operator generalization across multiple governing equations simultaneously.
A combination of physics-informed system evolution and latent-space model output is anchored to the input data and used in our distance function.
We find that physics-informed contrastive pretraining improves accuracy for the Fourier Neural Operator in fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
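The contrastive objective described above can be illustrated with a generic InfoNCE loss. This is a stand-in, not PICL's actual objective: the paper anchors physics-informed system evolution in its distance function, which the sketch replaces with plain cosine similarity over placeholder embeddings.

```python
import numpy as np

# Generic InfoNCE contrastive loss: views of the same governing equation
# should embed close together, views of different equations far apart.
rng = np.random.default_rng(0)

def info_nce(anchor, positive, negatives, tau=0.1):
    """-log softmax score of the positive among one positive and k negatives."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, neg) for neg in negatives]) / tau
    logits -= logits.max()                      # numerical stabilization
    return -np.log(np.exp(logits[0]) / np.exp(logits).sum())

anchor = rng.standard_normal(8)
positive = anchor + 0.01 * rng.standard_normal(8)   # "same equation" view
negatives = [rng.standard_normal(8) for _ in range(5)]

loss_aligned = info_nce(anchor, positive, negatives)    # correct pairing
loss_shuffled = info_nce(anchor, negatives[0], [positive] + negatives[1:])
```

Minimizing this loss pulls the anchor toward its positive and away from the negatives; the correctly paired loss is lower than the mismatched one.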
arXiv Detail & Related papers (2024-01-29T17:32:22Z)
- Spectral methods for Neural Integral Equations [0.6993026261767287]
We introduce a framework for neural integral equations based on spectral methods.
We show various theoretical guarantees regarding the approximation capabilities of the model.
We provide numerical experiments to demonstrate the practical effectiveness of the resulting model.
arXiv Detail & Related papers (2023-12-09T19:42:36Z)
- Learning nonlinear integral operators via Recurrent Neural Networks and its application in solving Integro-Differential Equations [4.011446845089061]
We learn and represent nonlinear integral operators that appear in nonlinear integro-differential equations (IDEs).
The LSTM-RNN representation of the nonlinear integral operator allows us to turn a system of nonlinear integro-differential equations into a system of ordinary differential equations.
We show how this methodology can effectively solve Dyson's equation for quantum many-body systems.
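The summary's central idea, that a learned representation of the integral operator converts an IDE into an ODE system, can be seen in closed form when the memory kernel is exponential. The sketch below uses that analytically tractable kernel as a stand-in for the paper's LSTM-RNN representation; all constants are placeholder choices.

```python
import numpy as np

# Illustration (not the paper's LSTM): for an exponential memory kernel
# K(t - s) = c * exp(-lam * (t - s)), the integral term
#     m(t) = \int_0^t K(t - s) u(s) ds
# satisfies m'(t) = c * u(t) - lam * m(t). The integro-differential
# equation u'(t) = a * u(t) + m(t) therefore becomes the plain ODE system
#     (u, m)' = (a * u + m, c * u - lam * m),
# which is the IDE-to-ODE conversion the summary describes.
a, c, lam = -1.0, 0.5, 2.0
dt, steps = 1e-3, 2000

# Forward-Euler integration of the equivalent ODE system.
u, m = 1.0, 0.0
for _ in range(steps):
    u, m = u + dt * (a * u + m), m + dt * (c * u - lam * m)

# Cross-check: integrate the original IDE directly, approximating the
# memory integral with the rectangle rule at every step.
v = np.empty(steps + 1)
v[0] = 1.0
for k in range(steps):
    s = np.arange(k + 1) * dt
    memory = np.sum(c * np.exp(-lam * (k * dt - s)) * v[:k + 1]) * dt
    v[k + 1] = v[k] + dt * (a * v[k] + memory)

err = abs(u - v[-1])
```

The two trajectories agree up to discretization error, while the ODE route avoids re-evaluating the full history integral at every step, which is the computational point of the conversion.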
arXiv Detail & Related papers (2023-10-13T22:57:46Z)
- CoNO: Complex Neural Operator for Continuous Dynamical Systems [10.326780211731263]
We introduce a Complex Neural Operator (CoNO) that parameterizes the integral kernel in the complex fractional Fourier domain.
We show that the model effectively captures the underlying partial differential equation with a single complex fractional Fourier transform.
arXiv Detail & Related papers (2023-10-03T14:38:12Z)
- Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
An AI framework known as Neural Operators provides a principled approach for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
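How self-attention can regulate noise may be pictured with a positional-similarity attention pass over noisy derivative samples. This is a hand-constructed Nadaraya-Watson-style simplification with a fixed bandwidth, not the paper's Attr architecture; the signal, noise level, and bandwidth below are placeholder choices.

```python
import numpy as np

# Softmax attention whose scores depend only on the time positions:
# each output is a convex combination of nearby samples, so i.i.d.
# noise is averaged down while the smooth signal is preserved.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
f_true = np.cos(2 * np.pi * t)                     # clean derivative values
f_noisy = f_true + 0.3 * rng.standard_normal(t.size)

sigma = 0.02                                       # attention bandwidth
scores = -(t[:, None] - t[None, :]) ** 2 / (2 * sigma ** 2)
A = np.exp(scores)
A /= A.sum(axis=1, keepdims=True)                  # row-wise softmax
f_att = A @ f_noisy                                # attention-smoothed samples

mse_noisy = np.mean((f_noisy - f_true) ** 2)
mse_att = np.mean((f_att - f_true) ** 2)
```

The attention-smoothed samples sit closer to the clean signal than the raw noisy ones; a learned attention mechanism adapts this trade-off instead of fixing the bandwidth by hand.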
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and can interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Neural Integro-Differential Equations [2.001149416674759]
Integro-Differential Equations (IDEs) are generalizations of differential equations that comprise both an integral and a differential component.
NIDE is a framework that models the ordinary and integral components of IDEs using neural networks.
We show that NIDE can decompose dynamics into its Markovian and non-Markovian constituents.
arXiv Detail & Related papers (2022-06-28T20:39:35Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
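The Laplace-domain representation mentioned above rests on the fact that time-domain trajectories correspond to summations of complex exponentials determined by Laplace-domain poles. A minimal worked example:

```python
import numpy as np

# The damped oscillation u(t) = exp(-t) * cos(5 t), a solution of
# u'' + 2 u' + 26 u = 0, corresponds to the Laplace-domain pole pair
# s = -1 +/- 5i, i.e. a summation of two complex exponentials in time.
t = np.linspace(0.0, 3.0, 300)
s = -1.0 + 5.0j                                   # Laplace-domain pole
u_from_poles = 0.5 * np.exp(s * t) + 0.5 * np.exp(np.conj(s) * t)
u_time = np.exp(-t) * np.cos(5.0 * t)             # classical time-domain form

err = np.max(np.abs(u_from_poles.real - u_time))
```

Delays and discontinuities likewise map to simple Laplace-domain factors, which is why modelling in that domain can capture history dependence that is awkward to represent directly in time.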
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster than traditional PDE solvers.
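The Fourier-space parameterization at the heart of FNO can be sketched as a single 1D spectral-convolution layer; the mode count and random weights below are placeholders for what training would learn, and the full architecture adds pointwise linear paths and nonlinearities between such layers.

```python
import numpy as np

# Minimal 1D spectral convolution in the spirit of FNO: transform to
# Fourier space, multiply the lowest modes by learned complex weights,
# truncate the rest, and transform back to physical space.
rng = np.random.default_rng(0)
n, modes = 128, 16
W = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)

def spectral_conv(u):
    u_hat = np.fft.rfft(u)                  # to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = W * u_hat[:modes]     # learned multiplier on low modes
    return np.fft.irfft(out_hat, n=n)       # back to physical space

x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = np.sin(3.0 * x)
v = spectral_conv(u)
```

Because the weights act mode-by-mode, the layer is resolution-independent: the same `W` applies on finer grids, which underlies the reported speedups over grid-bound solvers.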
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.