Physics-Informed Neural Operators
- URL: http://arxiv.org/abs/2207.05748v1
- Date: Fri, 8 Jul 2022 12:29:09 GMT
- Title: Physics-Informed Neural Operators
- Authors: Somdatta Goswami, Aniruddha Bora, Yue Yu, and George Em Karniadakis
- Abstract summary: Neural networks can approximate general nonlinear operators, represented either explicitly by a combination of mathematical operators, e.g., in an advection-diffusion-reaction partial differential equation, or simply as a black box.
The first neural operator was the Deep Operator Network (DeepONet), proposed in 2019 based on rigorous approximation theory.
For black-box systems, training of neural operators is purely data-driven, but if the governing equations are known, they can be incorporated into the loss function during training to develop physics-informed neural operators.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Standard neural networks can approximate general nonlinear operators,
represented either explicitly by a combination of mathematical operators, e.g.,
in an advection-diffusion-reaction partial differential equation, or simply as
a black box, e.g., a system-of-systems. The first neural operator was the Deep
Operator Network (DeepONet), proposed in 2019 based on rigorous approximation
theory. Since then, a few other less general operators have been published,
e.g., based on graph neural networks or Fourier transforms. For black-box
systems, training of neural operators is purely data-driven, but if the governing
equations are known, they can be incorporated into the loss function during
training to develop physics-informed neural operators. Neural operators can be
used as surrogates in design problems, uncertainty quantification, autonomous
systems, and in almost any application requiring real-time inference. Moreover,
independently pre-trained DeepONets can be used as components of a complex
multi-physics system by coupling them together with relatively light training.
Here, we present a review of DeepONet, the Fourier neural operator, and the
graph neural operator, as well as appropriate extensions with feature
expansions, and highlight their usefulness in diverse applications in
computational mechanics, including porous media, fluid mechanics, and solid
mechanics.
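As a concrete illustration of two ideas from the abstract, the DeepONet architecture and the incorporation of known governing equations into the training loss, the following is a minimal sketch in PyTorch. The branch network encodes the input function sampled at fixed sensor locations, the trunk network encodes the query coordinate, and their dot product gives the operator output; a residual of an assumed toy equation, -u''(y) = f(y), is added to the data misfit to make training physics-informed. The network sizes, sensor grid, and example equation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Minimal DeepONet-style operator G: f(x) -> u(y).
# The branch net encodes the input function f sampled at fixed sensor
# locations; the trunk net encodes the query coordinate y; the operator
# output is their dot product plus a bias, as in the original DeepONet.
class DeepONet(nn.Module):
    def __init__(self, n_sensors: int = 100, width: int = 64, p: int = 32):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.trunk = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, f_sensors: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # f_sensors: (batch, n_sensors), y: (batch, 1)
        b = self.branch(f_sensors)           # (batch, p)
        t = self.trunk(y)                    # (batch, p)
        return (b * t).sum(dim=-1, keepdim=True) + self.bias


def physics_informed_loss(model, f_sensors, y_data, u_data, y_colloc, f_colloc):
    """Data term plus PDE-residual term for the assumed toy equation -u''(y) = f(y).

    The residual is evaluated at collocation points with autograd; the specific
    PDE is only an illustrative stand-in for known governing equations being
    folded into the training loss.
    """
    # Supervised data misfit (used when measurements of u are available).
    data_loss = ((model(f_sensors, y_data) - u_data) ** 2).mean()

    # PDE residual at collocation points (no labels needed).
    y = y_colloc.clone().requires_grad_(True)
    u = model(f_sensors, y)
    du = torch.autograd.grad(u, y, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, y, torch.ones_like(du), create_graph=True)[0]
    pde_loss = ((-d2u - f_colloc) ** 2).mean()

    return data_loss + pde_loss
```

Because the collocation points need no labels, the physics term reduces the amount of paired input-output data required, which is the practical appeal of physics-informed neural operators.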
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve performance gains of up to 48% on the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Linearization Turns Neural Operators into Function-Valued Gaussian Processes [23.85470417458593]
We introduce a new framework for approximate Bayesian uncertainty quantification in neural operators.
Our approach can be interpreted as a probabilistic analogue of the concept of currying from functional programming.
We showcase the efficacy of our approach through applications to different types of partial differential equations.
arXiv Detail & Related papers (2024-06-07T16:43:54Z) - Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators provide a principled AI framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - Scattering with Neural Operators [0.0]
Recent advances in machine learning establish the ability of certain neural-network architectures to approximate maps between function spaces.
Motivated by a prospect of employing them in fundamental physics, we examine applications to scattering processes in quantum mechanics.
arXiv Detail & Related papers (2023-08-28T18:00:00Z) - In-Context Operator Learning with Data Prompts for Differential Equation Problems [12.61842281581773]
This paper introduces a new neural-network-based approach, namely In-Context Operator Networks (ICON).
ICON simultaneously learns operators from the prompted data and applies them to new questions during the inference stage, without any weight updates.
Our numerical results show the neural network's capability as a few-shot operator learner for a diverse range of differential equation problems.
arXiv Detail & Related papers (2023-04-17T05:22:26Z) - Neural networks trained with SGD learn distributions of increasing complexity [78.30235086565388]
We show that neural networks trained using gradient descent initially classify their inputs using lower-order input statistics.
They exploit higher-order statistics only later during training.
We discuss the relation of this distributional simplicity bias (DSB) to other simplicity biases and consider its implications for the principle of universality in learning.
arXiv Detail & Related papers (2022-11-21T15:27:22Z) - MIONet: Learning multiple-input operators via tensor product [2.5426761219054312]
We study the operator regression via neural networks for multiple-input operators defined on the product of Banach spaces.
Based on our theory and a low-rank approximation, we propose a novel neural operator, MIONet, to learn multiple-input operators.
arXiv Detail & Related papers (2022-02-12T20:37:04Z) - Dynamic Inference with Neural Interpreters [72.90231306252007]
We present Neural Interpreters, an architecture that factorizes inference in a self-attention network as a system of modules.
Inputs to the model are routed through a sequence of functions in a way that is learned end-to-end.
We show that Neural Interpreters perform on par with the vision transformer while using fewer parameters, and are transferable to new tasks in a sample-efficient manner.
arXiv Detail & Related papers (2021-10-12T23:22:45Z) - Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
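The Fourier neural operator reviewed in the main paper, and the general view of neural operators as maps between function spaces in the last entry, share one core building block: a kernel integral operator realized as a spectral convolution that keeps only a few learnable Fourier modes, combined with a pointwise linear path. The sketch below is a simplified one-dimensional illustration in PyTorch with assumed channel and mode counts, not a reference implementation of any of the cited papers.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Kernel integral operator realized as a truncated Fourier multiplier."""
    def __init__(self, channels: int = 32, modes: int = 16):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (channels * channels)
        # Learnable complex weights acting on the lowest `modes` frequencies.
        self.weights = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, n_grid) -- a function sampled on a uniform grid.
        x_hat = torch.fft.rfft(x)                        # (batch, c, n//2 + 1)
        out_hat = torch.zeros_like(x_hat)
        # Multiply the retained low-frequency modes by the learned weights.
        out_hat[..., : self.modes] = torch.einsum(
            "bim,iom->bom", x_hat[..., : self.modes], self.weights
        )
        return torch.fft.irfft(out_hat, n=x.size(-1))    # back to grid values


class FourierLayer(nn.Module):
    """One neural-operator block: spectral convolution plus pointwise linear path."""
    def __init__(self, channels: int = 32, modes: int = 16):
        super().__init__()
        self.spectral = SpectralConv1d(channels, modes)
        self.pointwise = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.spectral(x) + self.pointwise(x))
```

A full operator of this kind stacks several such layers between a lifting and a projection network; when the governing equations are known, a residual-based loss like the one sketched after the abstract can be applied to its output in the same way.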
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.