MIONet: Learning multiple-input operators via tensor product
- URL: http://arxiv.org/abs/2202.06137v1
- Date: Sat, 12 Feb 2022 20:37:04 GMT
- Title: MIONet: Learning multiple-input operators via tensor product
- Authors: Pengzhan Jin, Shuai Meng, Lu Lu
- Abstract summary: We study the operator regression via neural networks for multiple-input operators defined on the product of Banach spaces.
Based on our theory and a low-rank approximation, we propose a novel neural operator, MIONet, to learn multiple-input operators.
- Score: 2.5426761219054312
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As an emerging paradigm in scientific machine learning, neural operators aim
to learn operators, via neural networks, that map between infinite-dimensional
function spaces. Several neural operators have been recently developed.
However, all the existing neural operators are only designed to learn operators
defined on a single Banach space, i.e., the input of the operator is a single
function. Here, for the first time, we study the operator regression via neural
networks for multiple-input operators defined on the product of Banach spaces.
We first prove a universal approximation theorem of continuous multiple-input
operators. We also provide a detailed theoretical analysis, including the
approximation error, which guides the design of the network architecture.
Based on our theory and a low-rank approximation, we propose a
novel neural operator, MIONet, to learn multiple-input operators. MIONet
consists of several branch nets for encoding the input functions and a trunk
net for encoding the domain of the output function. We demonstrate that MIONet
can learn solution operators involving systems governed by ordinary and partial
differential equations. In our computational examples, we also show that we can
endow MIONet with prior knowledge of the underlying system, such as linearity
and periodicity, to further improve the accuracy.
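
To make the tensor-product construction concrete, here is a minimal sketch of a two-input MIONet in PyTorch, assuming both input functions are sampled at fixed sensor locations. The layer widths, the rank p, and the `mlp` helper are illustrative choices, not the paper's exact configuration; the element-wise product followed by a sum is the low-rank approximation of the tensor product described in the abstract.

```python
import torch
import torch.nn as nn

def mlp(sizes):
    """Plain fully connected network with tanh activations."""
    layers = []
    for a, b in zip(sizes[:-1], sizes[1:]):
        layers += [nn.Linear(a, b), nn.Tanh()]
    return nn.Sequential(*layers[:-1])  # no activation on the output layer

class MIONet(nn.Module):
    """Two-input MIONet: one branch net per input function (each sampled
    at fixed sensor locations), a trunk net for the query point y, and a
    Hadamard-product-plus-sum merge that realizes the low-rank
    tensor-product approximation."""
    def __init__(self, n_sensors1, n_sensors2, dim_y, p=64):
        super().__init__()
        self.branch1 = mlp([n_sensors1, 128, 128, p])
        self.branch2 = mlp([n_sensors2, 128, 128, p])
        self.trunk = mlp([dim_y, 128, 128, p])
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, v1, v2, y):
        # v1: (batch, n_sensors1), v2: (batch, n_sensors2), y: (batch, dim_y)
        merged = self.branch1(v1) * self.branch2(v2) * self.trunk(y)
        return merged.sum(dim=-1, keepdim=True) + self.bias
```

A forward pass evaluates G(v1, v2)(y); handling more input functions amounts to adding further branch nets to the product.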
Related papers
- Learning Partial Differential Equations with Deep Parallel Neural Operators [11.121415128908566]
A novel methodology is to learn an operator as a means of approximating the mapping between inputs and outputs.
In practical physical science problems, the numerical solutions of partial differential equations are complex.
We propose a deep parallel operator model (DPNO) for efficiently and accurately solving partial differential equations.
arXiv Detail & Related papers (2024-09-30T06:04:04Z)
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
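
As a toy illustration of the differential-kernel claim in the entry above: a fixed convolution stencil scaled by the grid spacing recovers a classical finite-difference derivative. This is a hand-constructed sketch of the scaling idea only, not the paper's learned construction; the grid, test function, and step size h are arbitrary choices.

```python
import torch
import torch.nn.functional as F

# A fixed 3-point stencil; dividing by h**2 turns the discrete
# convolution into an approximation of the second derivative,
# illustrating how CNN kernel values scaled with the grid spacing
# yield a differential operator.
h = 0.01
x = torch.arange(0, 1, h)
u = torch.sin(2 * torch.pi * x)                     # test function
kernel = torch.tensor([[[1.0, -2.0, 1.0]]]) / h**2  # (out_ch, in_ch, width)
d2u = F.conv1d(u.view(1, 1, -1), kernel)            # interior points only
# exact second derivative: -(2*pi)^2 * sin(2*pi*x)
exact = -(2 * torch.pi) ** 2 * torch.sin(2 * torch.pi * x[1:-1])
print(torch.max(torch.abs(d2u.flatten() - exact)))  # small discretization error
```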
- In-Context Operator Learning with Data Prompts for Differential Equation Problems [12.61842281581773]
This paper introduces a new neural-network-based approach, In-Context Operator Networks (ICON).
ICON simultaneously learns operators from the prompted data and applies them to new questions during the inference stage, without any weight updates.
Our numerical results show the neural network's capability as a few-shot operator learner for a diversified type of differential equation problems.
arXiv Detail & Related papers (2023-04-17T05:22:26Z)
- Join-Chain Network: A Logical Reasoning View of the Multi-head Attention in Transformer [59.73454783958702]
We propose a symbolic reasoning architecture that chains many join operators together to model output logical expressions.
In particular, we demonstrate that such an ensemble of join-chains can express a broad subset of "tree-structured" first-order logical expressions, named FOET.
We find that the widely used multi-head self-attention module in transformer can be understood as a special neural operator that implements the union bound of the join operator in probabilistic predicate space.
arXiv Detail & Related papers (2022-10-06T07:39:58Z)
- Physics-Informed Neural Operators [3.9181541460605116]
Neural networks can approximate general nonlinear operators, represented either explicitly by a combination of mathematical operators, e.g., in an advection-diffusion-reaction partial differential equation, or simply as a black box.
The first neural operator was the Deep Operator Network (DeepONet), proposed in 2019 based on rigorous approximation theory.
For black-box systems, training of neural operators is data-driven only, but if the governing equations are known, they can be incorporated into the loss function during training to develop physics-informed neural operators.
arXiv Detail & Related papers (2022-07-08T12:29:09Z)
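
The physics-informed idea in the entry above can be sketched in a few lines: when the governing equation is known, its residual is computed with automatic differentiation and added to the training loss. The tiny network, the 1D Poisson equation u_xx = f, and the forcing term below are illustrative placeholders, not the setup used in the paper.

```python
import torch
import torch.nn as nn

# Minimal sketch of a physics-informed loss: the residual of a known
# governing equation (here the 1D Poisson problem u_xx = f) is computed
# with automatic differentiation and combined with the data-fitting loss.
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
f = lambda x: -torch.sin(x)                  # assumed forcing term

x = torch.rand(128, 1, requires_grad=True)   # collocation points
u = net(x)
du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
physics_loss = ((d2u - f(x)) ** 2).mean()
# total loss = data_loss + weight * physics_loss, then backprop as usual
```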
- Pseudo-Differential Neural Operator: Generalized Fourier Neural Operator for Learning Solution Operators of Partial Differential Equations [14.43135909469058]
We propose a novel pseudo-differential integral operator (PDIO) to analyze and generalize the Fourier integral operator in FNO.
We experimentally validate the effectiveness of the proposed model by utilizing Darcy flow and the Navier-Stokes equation.
arXiv Detail & Related papers (2022-01-28T07:22:32Z)
- Dynamic Inference with Neural Interpreters [72.90231306252007]
We present Neural Interpreters, an architecture that factorizes inference in a self-attention network as a system of modules.
Inputs to the model are routed through a sequence of functions in a way that is learned end-to-end.
We show that Neural Interpreters perform on par with the vision transformer using fewer parameters, while being transferable to a new task in a sample-efficient manner.
arXiv Detail & Related papers (2021-10-12T23:22:45Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
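
A generic neural-operator layer of the kind described above can be sketched as a pointwise linear map plus a learned kernel integral, with the integral approximated by a quadrature sum over grid points. The MLP kernel, channel width, and uniform quadrature weights below are illustrative assumptions, not the paper's specific parameterization.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KernelIntegralLayer(nn.Module):
    """One neural-operator layer: a pointwise linear term plus a learned
    kernel integral, approximated by a quadrature sum over grid points."""
    def __init__(self, channels, grid):
        super().__init__()
        self.grid = grid                        # (n, d) grid coordinates
        self.w = nn.Linear(channels, channels)  # local (pointwise) term
        d = grid.shape[1]
        self.kappa = nn.Sequential(             # kernel k(x, y) as an MLP
            nn.Linear(2 * d, 64), nn.GELU(), nn.Linear(64, 1))

    def forward(self, v):
        # v: (batch, n, channels) -- function values on the grid
        n = self.grid.shape[0]
        x = self.grid.unsqueeze(1).expand(n, n, -1)            # all pairs (x, y)
        y = self.grid.unsqueeze(0).expand(n, n, -1)
        k = self.kappa(torch.cat([x, y], dim=-1)).squeeze(-1)  # (n, n)
        integral = torch.einsum("mn,bnc->bmc", k, v) / n       # quadrature sum
        return F.gelu(self.w(v) + integral)
```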
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
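
The Fourier parameterization in the entry above can be sketched as a spectral convolution: transform to Fourier space, weight a truncated set of frequencies with learned complex parameters, and transform back. Channel counts and the number of retained modes are illustrative; a full FNO stacks several such layers with pointwise linear paths and nonlinearities.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Sketch of an FNO layer: FFT, multiply the lowest `modes`
    frequencies by learned complex weights, inverse FFT.
    Requires modes <= n // 2 + 1 for grid size n."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, v):
        # v: (batch, channels, n) -- function values on a uniform grid
        v_hat = torch.fft.rfft(v)                       # to Fourier space
        out_hat = torch.zeros_like(v_hat)
        out_hat[:, :, :self.modes] = torch.einsum(      # mix kept modes
            "bim,iom->bom", v_hat[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_hat, n=v.shape[-1])  # back to physical space
```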
- Neural Arithmetic Units [84.65228064780744]
Neural networks can approximate complex functions, but they struggle to perform exact arithmetic operations over real numbers.
We present two new neural network components: the Neural Addition Unit (NAU), which can learn exact addition and subtraction, and the Neural Multiplication Unit (NMU), which can multiply subsets of a vector.
Our proposed units NAU and NMU, compared with previous neural units, converge more consistently, have fewer parameters, learn faster, can converge for larger hidden sizes, obtain sparse and meaningful weights, and can extrapolate to negative and small values.
arXiv Detail & Related papers (2020-01-14T19:35:04Z)
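
As a sketch of the multiplication unit described above (the addition unit is a constrained linear layer along the same lines): each NMU output multiplies a gated subset of its inputs. The simple clamping used here stands in for the paper's full regularization and initialization scheme, which this sketch does not reproduce.

```python
import torch
import torch.nn as nn

class NMU(nn.Module):
    """Sketch of a Neural Multiplication Unit: gates w in [0, 1] select
    which inputs each output multiplies (w=1 includes x_i, w=0 contributes
    a neutral factor of 1)."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.w = nn.Parameter(torch.rand(out_features, in_features))

    def forward(self, x):
        w = self.w.clamp(0.0, 1.0)   # keep gates in [0, 1]
        # x: (batch, in) -> factors: (batch, out, in) -> product over inputs
        factors = w * x.unsqueeze(1) + (1.0 - w)
        return factors.prod(dim=-1)
```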
This list is automatically generated from the titles and abstracts of the papers in this site.