Neural Operator: Graph Kernel Network for Partial Differential Equations
- URL: http://arxiv.org/abs/2003.03485v1
- Date: Sat, 7 Mar 2020 01:56:20 GMT
- Title: Neural Operator: Graph Kernel Network for Partial Differential Equations
- Authors: Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu,
Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
- Abstract summary: This work generalizes neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We approximate the infinite-dimensional mapping by composing nonlinear activation functions with a class of integral operators.
Experiments confirm that the proposed graph kernel network has the desired properties and performs competitively with state-of-the-art solvers.
- Score: 57.90284928158383
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The classical development of neural networks has been primarily for mappings
between a finite-dimensional Euclidean space and a set of classes, or between
two finite-dimensional Euclidean spaces. The purpose of this work is to
generalize neural networks so that they can learn mappings between
infinite-dimensional spaces (operators). The key innovation in our work is that
a single set of network parameters, within a carefully designed network
architecture, may be used to describe mappings between infinite-dimensional
spaces and between different finite-dimensional approximations of those spaces.
We formulate approximation of the infinite-dimensional mapping by composing
nonlinear activation functions and a class of integral operators. The kernel
integration is computed by message passing on graph networks. This approach has
substantial practical consequences, which we will illustrate in the context of
mappings from the input data of partial differential equations (PDEs) to their
solutions. In this context, such learned networks can generalize among
different approximation methods for the PDE (such as finite difference or
finite element methods) and among approximations corresponding to different
underlying levels of resolution and discretization. Experiments confirm that
the proposed graph kernel network does have the desired properties and show
competitive performance compared to state-of-the-art solvers. (A minimal
sketch of one kernel-integration layer follows.)
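The layer described in the abstract updates each node's state with a local linear term plus a kernel-weighted average over graph neighbors, where the kernel is itself a small network acting on edge features. Below is a minimal PyTorch sketch of one such layer; the class name, kernel MLP shape, and edge-feature layout are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class KernelMessagePassingLayer(nn.Module):
    """One kernel-integration layer realized as graph message passing:
    v(x) <- relu(W v(x) + mean over neighbors y of kappa(x, y) v(y)),
    where kappa is a small MLP mapping edge features to a matrix, so the
    neighbor average is a Monte Carlo estimate of the integral operator."""

    def __init__(self, width: int, edge_dim: int):
        super().__init__()
        self.width = width
        self.linear = nn.Linear(width, width)       # local term W v(x)
        self.kernel = nn.Sequential(                # kappa: edge features -> matrix
            nn.Linear(edge_dim, 64), nn.ReLU(),
            nn.Linear(64, width * width),
        )

    def forward(self, v, edge_index, edge_attr):
        # v: (n, width) node states; edge_index: (2, m) long tensor of
        # (source, target) ids; edge_attr: (m, edge_dim) features built
        # from (x, y, a(x), a(y)) for each edge.
        src, dst = edge_index
        k = self.kernel(edge_attr).view(-1, self.width, self.width)
        msg = torch.bmm(k, v[src].unsqueeze(-1)).squeeze(-1)  # kappa * v(y)
        agg = torch.zeros_like(v).index_add_(0, dst, msg)
        deg = torch.zeros(v.size(0), 1).index_add_(
            0, dst, torch.ones(src.size(0), 1))
        return torch.relu(self.linear(v) + agg / deg.clamp(min=1.0))

# Example: 100 grid nodes, 400 random edges, hidden width 32
layer = KernelMessagePassingLayer(width=32, edge_dim=6)
v = torch.randn(100, 32)
edge_index = torch.randint(0, 100, (2, 400))
edge_attr = torch.randn(400, 6)
v_next = layer(v, edge_index, edge_attr)            # (100, 32)
```

In the paper's setting the edge features encode the spatial points x, y and the input values a(x), a(y), and stacking a few such layers (with lifting and projection maps) yields the full graph kernel network.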
Related papers
- Operator Learning with Neural Fields: Tackling PDEs on General Geometries [15.65577053925333]
Machine learning approaches for solving partial differential equations require learning mappings between function spaces.
The new CORAL method leverages coordinate-based networks to solve PDEs on general geometries (a minimal sketch of such a network follows).
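For context, a coordinate-based network (neural field) is simply a network queried at spatial coordinates, which is what makes the representation mesh-free and applicable to general geometries. A minimal sketch, with an arbitrarily chosen architecture:

```python
import torch
import torch.nn as nn

# A coordinate-based network (neural field): an MLP that maps a spatial
# coordinate to the field value at that point, so it can be queried at
# arbitrary locations of any geometry rather than on a fixed grid.
field = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)

xy = torch.rand(1024, 2)   # arbitrary query points in a 2-D domain
u = field(xy)              # predicted field values, shape (1024, 1)
```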
arXiv Detail & Related papers (2023-06-12T17:52:39Z)
- Predictions Based on Pixel Data: Insights from PDEs and Finite Differences [0.0]
This paper deals with the approximation of time sequences in which each observation is a matrix.
We show that relatively small networks can exactly represent a class of numerical discretizations of PDEs based on the method of lines (a worked instance follows).
Our network architecture is inspired by those typically adopted for approximating time sequences.
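To make the method-of-lines connection concrete: semi-discretizing the 1-D heat equation u_t = u_xx in space turns the second derivative into a fixed convolution stencil, exactly the kind of operation a small convolutional network can represent. A NumPy illustration with arbitrarily chosen grid size and step count:

```python
import numpy as np

# Method of lines for u_t = u_xx on [0, 1) with periodic boundaries:
# the spatial Laplacian becomes convolution with the stencil [1, -2, 1]/h^2.
n = 128
h = 1.0 / n
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2 * np.pi * x)                    # initial condition
stencil = np.array([1.0, -2.0, 1.0]) / h**2

dt = 0.4 * h**2                              # explicit Euler, stable for dt <= h^2/2
for _ in range(1000):
    lap = np.convolve(np.pad(u, 1, mode="wrap"), stencil, mode="valid")
    u = u + dt * lap                         # one forward-Euler time step

# u now approximates exp(-4 * pi^2 * t) * sin(2 * pi * x) at t = 1000 * dt
```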
arXiv Detail & Related papers (2023-05-01T08:54:45Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM), an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistently state-of-the-art results, with an average relative gain of 11.5% across seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Discretization Invariant Networks for Learning Maps between Neural Fields [3.09125960098955]
We present a new framework for understanding and designing discretization-invariant neural networks (DI-Nets).
Our analysis establishes upper bounds on the deviation in model outputs under different finite discretizations.
We prove by construction that DI-Nets universally approximate a large class of maps between integrable function spaces.
arXiv Detail & Related papers (2022-06-02T17:44:03Z)
- A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes [78.120734120667]
We construct, in the input space, the preimage of a point on the output manifold.
For simplicity, we focus on the case of neural network maps from n-dimensional to (n - 1)-dimensional real spaces.
arXiv Detail & Related papers (2021-12-17T11:47:45Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite-dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space (a minimal spectral-layer sketch follows).
We perform experiments on Burgers' equation, Darcy flow, and the Navier-Stokes equations.
It is up to three orders of magnitude faster than traditional PDE solvers.
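The Fourier parameterization replaces spatial kernel integration with pointwise multiplication of low-frequency Fourier modes by learned weights, followed by an inverse transform. A minimal 1-D sketch in PyTorch; the function name, shapes, and truncation convention are illustrative assumptions rather than the released implementation:

```python
import torch

def spectral_conv_1d(v, weights, modes):
    # v: (batch, channels, n) real signal.
    # weights: (channels, channels, modes) complex tensor of learned weights.
    v_hat = torch.fft.rfft(v)                    # (batch, channels, n//2 + 1)
    out_hat = torch.zeros_like(v_hat)
    # Mix channels independently at each retained frequency; higher
    # frequencies are truncated (left at zero).
    out_hat[..., :modes] = torch.einsum(
        "bim,iom->bom", v_hat[..., :modes], weights)
    return torch.fft.irfft(out_hat, n=v.size(-1))

# Example: batch of 4 signals, 8 channels, 64 grid points, keep 12 modes
w = torch.randn(8, 8, 12, dtype=torch.cfloat)
v = torch.randn(4, 8, 64)
out = spectral_conv_1d(v, w, modes=12)           # (4, 8, 64)
```

Because the weights live on a fixed set of modes rather than on a grid, the same layer can be evaluated at any resolution, which is one source of the reported speedups.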
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Model Reduction and Neural Networks for Parametric PDEs [9.405458160620533]
We develop a framework for data-driven approximation of input-output maps between infinite-dimensional spaces (a toy sketch of the general pattern is given below).
The proposed approach is motivated by the recent successes of neural networks and deep learning.
For a class of input-output maps, and suitably chosen probability measures on the inputs, we prove convergence of the proposed approximation methodology.
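A common instance of this reduce-then-map pattern is to project inputs and outputs onto truncated PCA bases and learn a map between the coefficient vectors. The sketch below uses random placeholder data and a linear least-squares map standing in for the paper's neural network; all names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_in, n_out, k = 200, 256, 256, 16
A = rng.standard_normal((N, n_in))    # inputs a_i sampled on a grid (placeholder)
U = rng.standard_normal((N, n_out))   # outputs u_i = G(a_i) (placeholder)

# Truncated PCA bases for the input and output function spaces.
_, _, Va = np.linalg.svd(A - A.mean(0), full_matrices=False)
_, _, Vu = np.linalg.svd(U - U.mean(0), full_matrices=False)
Pa, Pu = Va[:k].T, Vu[:k].T           # (grid points, k) projection bases

# Learn a map between the two k-dimensional coefficient spaces
# (linear least squares here, where the paper would use a neural network).
Ca = (A - A.mean(0)) @ Pa             # input PCA coefficients,  (N, k)
Cu = (U - U.mean(0)) @ Pu             # output PCA coefficients, (N, k)
W, *_ = np.linalg.lstsq(Ca, Cu, rcond=None)

def predict(a_new):
    # Project onto the input basis, map coefficients, reconstruct on the grid.
    c = (a_new - A.mean(0)) @ Pa
    return U.mean(0) + (c @ W) @ Pu.T
```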
arXiv Detail & Related papers (2020-05-07T00:09:27Z)
- Solving inverse-PDE problems with physics-aware neural networks [0.0]
We propose a novel framework to find unknown fields in the context of inverse problems for partial differential equations.
We blend the high expressibility of deep neural networks as universal function estimators with the accuracy and reliability of existing numerical algorithms.
arXiv Detail & Related papers (2020-01-10T18:46:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.