Bounding The Rademacher Complexity of Fourier Neural Operator
- URL: http://arxiv.org/abs/2209.05150v1
- Date: Mon, 12 Sep 2022 11:11:43 GMT
- Title: Bounding The Rademacher Complexity of Fourier Neural Operator
- Authors: Taeyoung Kim and Myungjoo Kang
- Abstract summary: A Fourier neural operator (FNO) is one of the physics-inspired machine learning methods.
In this study, we investigated bounds on the Rademacher complexity of the FNO based on specific group norms.
In addition, we investigated the correlation between the empirical generalization error and the proposed capacity of FNO.
- Score: 3.4960814625958787
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A Fourier neural operator (FNO) is one of the physics-inspired machine
learning methods. In particular, it is a neural operator. In recent times,
several types of neural operators have been developed, e.g., deep operator
networks, GNO, and MWTO. Compared with other models, the FNO is computationally
efficient and can learn nonlinear operators between function spaces independent
of a certain finite basis. In this study, we investigated bounds on the
Rademacher complexity of the FNO based on specific group norms. Using a capacity
measure based on these norms, we bound the generalization error of the FNO model. In
addition, we investigated the correlation between the empirical generalization
error and the proposed capacity of FNO. Based on this investigation, we gained
insight into the impact of the model architecture on the generalization error
and estimated the amount of information about FNO models stored in various
types of capacities.
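As context for the capacity bounds, the core FNO building block is a truncated spectral convolution. Below is a minimal single-channel sketch, assuming NumPy; the weight vector `R` is hypothetical (the actual model uses per-mode complex weight matrices plus a pointwise linear skip term):

```python
import numpy as np

# Minimal one-channel sketch of an FNO layer: FFT the input, keep only the
# lowest k_max Fourier modes, multiply them by learned weights R, transform
# back, and add a skip connection. R is a hypothetical complex weight vector.
def fourier_layer(v, R, k_max):
    v_hat = np.fft.rfft(v)                       # forward transform
    out_hat = np.zeros_like(v_hat)
    out_hat[:k_max] = R[:k_max] * v_hat[:k_max]  # truncate and reweight modes
    return np.fft.irfft(out_hat, n=len(v)) + v   # inverse transform + skip

x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
v = np.sin(x)                  # a single Fourier mode as input
R = np.ones(8, dtype=complex)  # identity-like weights for illustration
w = fourier_layer(v, R, k_max=8)
print(np.allclose(w, 2.0 * v))  # with identity weights, output = v + v
```

Because the layer acts mode-by-mode in Fourier space, it is independent of the sampling grid, which is what allows the FNO to learn operators between function spaces rather than between fixed discretizations.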
Related papers
- Operator-learning-inspired Modeling of Neural Ordinary Differential Equations [38.17903151426809]
We present a neural operator-based method to define the time-derivative term.
In our experiments with general downstream tasks, our method significantly outperforms existing methods.
arXiv Detail & Related papers (2023-12-16T00:29:15Z)
- Benign Overfitting in Deep Neural Networks under Lazy Training [72.28294823115502]
We show that when the data distribution is well-separated, DNNs can achieve Bayes-optimal test error for classification.
Our results indicate that interpolating with smoother functions leads to better generalization.
arXiv Detail & Related papers (2023-05-30T19:37:44Z)
- Domain Agnostic Fourier Neural Operators [15.29112632863168]
We introduce domain agnostic Fourier neural operator (DAFNO) for learning surrogates with irregular geometries and evolving domains.
The key idea is to incorporate a smoothed characteristic function in the integral layer architecture of FNOs.
DAFNO has achieved state-of-the-art accuracy as compared to baseline neural operator models.
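The smoothed characteristic function mentioned above can be illustrated with a simple tanh-based indicator. This is a hypothetical one-dimensional sketch of the idea, not DAFNO's actual construction:

```python
import math

# Hypothetical 1-D sketch: a tanh-smoothed indicator of the interval [a, b].
# eps controls how sharply the mask falls off at the domain boundary;
# multiplying the integrand by this mask suppresses contributions from
# outside the (possibly irregular) domain while remaining differentiable.
def smooth_chi(x, a, b, eps=0.05):
    return 0.5 * (math.tanh((x - a) / eps) - math.tanh((x - b) / eps))

print(round(smooth_chi(0.5, 0.0, 1.0), 3))  # near 1 inside the domain
print(round(smooth_chi(2.0, 0.0, 1.0), 3))  # near 0 outside the domain
```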
arXiv Detail & Related papers (2023-04-30T13:29:06Z)
- Nonlocality and Nonlinearity Implies Universality in Operator Learning [8.83910715280152]
Neural operator architectures approximate operators between infinite-dimensional Banach spaces of functions.
It is clear that any general approximation of operators between spaces of functions must be both nonlocal and nonlinear.
We show how these two attributes may be combined in a simple way to deduce universal approximation.
arXiv Detail & Related papers (2023-04-26T01:03:11Z)
- Resolution-Invariant Image Classification based on Fourier Neural Operators [1.3190581566723918]
We investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs).
We derive the FNO architecture as an example of continuous and Fréchet-differentiable neural operators on Lebesgue spaces.
arXiv Detail & Related papers (2023-04-02T10:23:36Z)
- Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs, with provable convergence and without the need for sample splitting.
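The min-max formulation above can be illustrated with a toy simultaneous gradient descent/ascent loop. The saddle objective f(x, y) = x² + xy − y² is purely illustrative, not the paper's SEM loss:

```python
# Toy sketch of a min-max training loop: two scalar "players" updated by
# simultaneous gradient descent (min player x) and ascent (max player y)
# on the illustrative saddle objective f(x, y) = x**2 + x*y - y**2.
def grads(x, y):
    return 2.0 * x + y, x - 2.0 * y   # df/dx, df/dy

x, y, lr = 1.0, 1.0, 0.1
for _ in range(200):
    gx, gy = grads(x, y)
    x, y = x - lr * gx, y + lr * gy   # descend in x, ascend in y
print(abs(x) < 1e-6 and abs(y) < 1e-6)  # iterates converge to the saddle (0, 0)
```

In the paper's setting each scalar player is replaced by a neural network, but the alternating descent/ascent structure of the updates is the same.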
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Self-Organized Operational Neural Networks with Generative Neurons [87.32169414230822]
ONNs are heterogeneous networks with a generalized neuron model that can encapsulate any set of non-linear operators.
We propose Self-organized ONNs (Self-ONNs) with generative neurons that have the ability to adapt (optimize) the nodal operator of each connection.
arXiv Detail & Related papers (2020-04-24T14:37:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.