Fourier Neural Operator for Parametric Partial Differential Equations
- URL: http://arxiv.org/abs/2010.08895v3
- Date: Mon, 17 May 2021 03:12:33 GMT
- Title: Fourier Neural Operator for Parametric Partial Differential Equations
- Authors: Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu,
Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
- Abstract summary: We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
- Score: 57.90284928158383
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The classical development of neural networks has primarily focused on
learning mappings between finite-dimensional Euclidean spaces. Recently, this
has been generalized to neural operators that learn mappings between function
spaces. For partial differential equations (PDEs), neural operators directly
learn the mapping from any functional parametric dependence to the solution.
Thus, they learn an entire family of PDEs, in contrast to classical methods
which solve one instance of the equation. In this work, we formulate a new
neural operator by parameterizing the integral kernel directly in Fourier
space, allowing for an expressive and efficient architecture. We perform
experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation. The
Fourier neural operator is the first ML-based method to successfully model
turbulent flows with zero-shot super-resolution. It is up to three orders of
magnitude faster compared to traditional PDE solvers. Additionally, it achieves
superior accuracy compared to previous learning-based solvers under fixed
resolution.
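The core mechanism in the abstract, parameterizing the integral kernel as a learnable multiplication on a truncated set of Fourier modes, can be sketched in one dimension as follows. This is a minimal illustrative sketch, not the authors' implementation; the function name, shapes, and random weights are assumptions.

```python
import numpy as np

def fourier_layer(u, weights, modes):
    """One spectral-convolution layer in the FNO style (1D sketch).

    u:       real input signal sampled on a uniform grid, shape (n,)
    weights: complex multipliers for the lowest `modes` frequencies
    modes:   number of retained Fourier modes (higher modes are truncated)
    """
    u_hat = np.fft.rfft(u)                       # transform to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = weights * u_hat[:modes]    # learnable pointwise multiply
    return np.fft.irfft(out_hat, n=len(u))       # back to physical space

rng = np.random.default_rng(0)
n, modes = 64, 8
u = np.sin(2 * np.pi * np.arange(n) / n)
w = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)
v = fourier_layer(u, w, modes)
print(v.shape)  # (64,)
```

Because the layer acts on Fourier coefficients rather than grid values, the same learned weights can be applied at any sampling resolution, which is what enables the zero-shot super-resolution claimed above.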
Related papers
- Learning Partial Differential Equations with Deep Parallel Neural Operators [11.121415128908566]
A novel methodology learns an operator as a means of approximating the mapping between inputs and outputs.
In practical physical science problems, the numerical solutions of partial differential equations are complex.
We propose a deep parallel operator model (DPNO) for efficiently and accurately solving partial differential equations.
arXiv Detail & Related papers (2024-09-30T06:04:04Z)
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
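The claim that CNN kernels yield differential operators under an appropriate scaling can be illustrated with a fixed stencil: scaling the 3-point kernel [1, -2, 1] by 1/h² recovers the second-derivative operator. A minimal sketch (the test function and grid spacing are illustrative assumptions, not the paper's setup):

```python
import numpy as np

h = 0.01
x = np.arange(0.0, 1.0, h)
u = np.sin(2 * np.pi * x)

# A fixed 3-point convolution kernel, scaled by 1/h^2, acts as the
# discrete second-derivative (Laplacian) operator.
kernel = np.array([1.0, -2.0, 1.0]) / h**2
d2u = np.convolve(u, kernel, mode="valid")   # interior points only

exact = -(2 * np.pi) ** 2 * np.sin(2 * np.pi * x[1:-1])
print(np.max(np.abs(d2u - exact)))           # O(h^2) discretization error
```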
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
- PICL: Physics Informed Contrastive Learning for Partial Differential Equations [7.136205674624813]
We develop a novel contrastive pretraining framework that improves neural operator generalization across multiple governing equations simultaneously.
A combination of physics-informed system evolution and latent-space model output is anchored to input data and used in our distance function.
We find that physics-informed contrastive pretraining improves accuracy for the Fourier Neural Operator in fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
arXiv Detail & Related papers (2024-01-29T17:32:22Z)
- Operator-learning-inspired Modeling of Neural Ordinary Differential Equations [38.17903151426809]
We present a neural operator-based method to define the time-derivative term.
In our experiments with general downstream tasks, our method significantly outperforms existing methods.
arXiv Detail & Related papers (2023-12-16T00:29:15Z)
- Hyena Neural Operator for Partial Differential Equations [9.438207505148947]
Recent advances in deep learning have provided a new approach to solving partial differential equations that involves the use of neural operators.
This study utilizes a neural operator called Hyena, which employs a long convolutional filter that is parameterized by a multilayer perceptron.
Our findings indicate that Hyena can serve as an efficient and accurate model for learning the solution operator of partial differential equations.
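The long MLP-parameterized filter described above can be sketched as a tiny network that maps filter positions to filter weights, so a sequence-length filter is generated from only a handful of parameters. This is a toy illustration; the sizes and the single-hidden-layer MLP are assumptions, not the Hyena architecture.

```python
import numpy as np

def mlp_filter(positions, W1, b1, w2, b2):
    # Tiny MLP mapping a filter position t in [0, 1] to a filter weight:
    # the "implicit" parameterization of a long convolutional filter.
    hidden = np.tanh(np.outer(positions, W1) + b1)
    return hidden @ w2 + b2

rng = np.random.default_rng(0)
L = 128                                   # filter as long as the sequence
t = np.linspace(0.0, 1.0, L)
W1, b1 = rng.standard_normal(16), rng.standard_normal(16)
w2, b2 = rng.standard_normal(16), 0.0
h = mlp_filter(t, W1, b1, w2, b2)         # long filter from few MLP weights

u = rng.standard_normal(L)
y = np.convolve(u, h, mode="same")        # long convolution with that filter
print(y.shape)  # (128,)
```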
arXiv Detail & Related papers (2023-06-28T19:45:45Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
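The POD step described above, extracting a reduced-order basis from solution snapshots via an SVD, can be sketched as follows. The snapshot family and mode count are illustrative assumptions; the paper's regression and branch networks are omitted.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 100)
# Snapshots: solutions at several parameter values (here, shifted sines,
# which exactly span a 2D space: sin and cos components).
shifts = np.linspace(0.0, 0.5, 20)
snapshots = np.stack([np.sin(2 * np.pi * (x - s)) for s in shifts], axis=1)

# POD basis = leading left singular vectors of the snapshot matrix
U, S, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :2]

# Project one snapshot onto the reduced basis and reconstruct it
coeffs = basis.T @ snapshots[:, 0]
recon = basis @ coeffs
print(np.max(np.abs(recon - snapshots[:, 0])))  # near zero for rank-2 data
```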
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
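The Laplace-domain view described above can be illustrated with a single term: a damped oscillation is the real part of one complex exponential c·exp(st), whose pole s encodes both the decay rate and the frequency. The numerical values below are illustrative.

```python
import numpy as np

t = np.linspace(0.0, 5.0, 200)
s = -0.5 + 2.0j          # pole: decay rate -0.5, angular frequency 2.0
c = 1.0 + 0.0j           # complex coefficient
f = np.real(c * np.exp(s * t))

# The same trajectory written directly in the time domain:
reference = np.exp(-0.5 * t) * np.cos(2.0 * t)
print(np.allclose(f, reference))  # True
```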
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.