Resolution-Invariant Image Classification based on Fourier Neural
Operators
- URL: http://arxiv.org/abs/2304.01227v1
- Date: Sun, 2 Apr 2023 10:23:36 GMT
- Title: Resolution-Invariant Image Classification based on Fourier Neural
Operators
- Authors: Samira Kabri, Tim Roith, Daniel Tenbrinck, Martin Burger
- Abstract summary: We investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs).
We derive the FNO architecture as an example of continuous and Fréchet-differentiable neural operators on Lebesgue spaces.
- Score: 1.3190581566723918
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we investigate the use of Fourier Neural Operators (FNOs) for
image classification in comparison to standard Convolutional Neural Networks
(CNNs). Neural operators are a discretization-invariant generalization of
neural networks to approximate operators between infinite dimensional function
spaces. FNOs - which are neural operators with a specific parametrization -
have been applied successfully in the context of parametric PDEs. We derive the
FNO architecture as an example of continuous and Fréchet-differentiable
neural operators on Lebesgue spaces. We further show how CNNs can be converted
into FNOs and vice versa and propose an interpolation-equivariant adaptation of
the architecture.
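
As a rough illustration of the parametrization the abstract refers to, here is a minimal sketch of a single Fourier layer in PyTorch. The class name, the initialization scale, and the choice to keep only one corner of low-frequency modes are simplifications of my own, not the authors' implementation:

```python
import torch
import torch.nn as nn

class SpectralConv2d(nn.Module):
    """Minimal Fourier layer: FFT -> keep low modes -> learned complex
    multiplication per mode -> inverse FFT. Since the parameters live in
    frequency space, the same weights apply to inputs of any spatial
    resolution, which is the source of discretization invariance.
    (Simplified: full FNO implementations also keep the negative-
    frequency corner along the first spatial axis.)"""

    def __init__(self, in_ch, out_ch, modes1, modes2):
        super().__init__()
        self.modes1, self.modes2 = modes1, modes2
        scale = 1.0 / (in_ch * out_ch)
        self.weight = nn.Parameter(
            scale * torch.randn(in_ch, out_ch, modes1, modes2, dtype=torch.cfloat)
        )

    def forward(self, x):                          # x: (batch, in_ch, H, W)
        x_ft = torch.fft.rfft2(x)                  # (batch, in_ch, H, W//2+1)
        out_ft = torch.zeros(
            x.shape[0], self.weight.shape[1], x_ft.shape[-2], x_ft.shape[-1],
            dtype=torch.cfloat, device=x.device,
        )
        keep = (..., slice(self.modes1), slice(self.modes2))
        out_ft[keep] = torch.einsum("bixy,ioxy->boxy", x_ft[keep], self.weight)
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])
```

Applying the same instance to a 28x28 and a 56x56 image yields outputs at the corresponding resolutions with identical parameters, which is what makes resolution-invariant classification conceivable in the first place.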
Related papers
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
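
The second claim can be made concrete with a textbook finite-difference example (my own generic sketch, not code from the paper): a fixed convolution kernel scaled by the grid spacing converges to a differential operator as the resolution grows.

```python
import numpy as np

def conv_derivative(f, h):
    """Discrete convolution with the stencil [-1, 0, 1] / (2h); as the
    grid spacing h -> 0 this approaches the differential operator d/dx,
    which is the kernel-scaling argument in miniature."""
    kernel = np.array([-1.0, 0.0, 1.0]) / (2.0 * h)
    # np.convolve flips its kernel, so reverse it to get the
    # central difference (f(x+h) - f(x-h)) / (2h)
    return np.convolve(f, kernel[::-1], mode="valid")

for n in (64, 256, 1024):
    x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    h = x[1] - x[0]
    err = np.max(np.abs(conv_derivative(np.sin(x), h) - np.cos(x)[1:-1]))
    print(f"n={n:5d}  max error={err:.2e}")    # shrinks like O(h^2)
```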
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
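
The symmetry in question is easy to verify directly; the following sketch (my own, independent of the paper's architecture) shows that permuting the hidden neurons of a two-layer ReLU network, i.e. the rows of the first weight matrix together with the columns of the second, leaves the network function unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 8))     # 8 inputs -> 16 hidden neurons
W2 = rng.standard_normal((4, 16))     # 16 hidden neurons -> 4 outputs
x = rng.standard_normal(8)

def mlp(x, W1, W2):
    return W2 @ np.maximum(W1 @ x, 0.0)        # ReLU network, biases omitted

perm = rng.permutation(16)                      # reorder the hidden neurons
out_orig = mlp(x, W1, W2)
out_perm = mlp(x, W1[perm], W2[:, perm])        # permute rows of W1, cols of W2

assert np.allclose(out_orig, out_perm)          # same function, permuted weights
```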
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
Currently there exists a rather promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes.
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Bounding The Rademacher Complexity of Fourier Neural Operator [3.4960814625958787]
The Fourier neural operator (FNO) is a physics-inspired machine learning method.
In this study, we bound the Rademacher complexity of the FNO based on specific group norms.
In addition, we investigate the correlation between the empirical generalization error and the proposed capacity measure of the FNO.
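
To give a loose sense of what a group-norm-based capacity measure can look like (a toy sketch under my own assumptions; the paper's exact norms and bounds differ), one can combine an inner norm over channels with an outer norm over the retained Fourier modes of an FNO weight tensor:

```python
import torch

def spectral_group_norm(weight):
    """Toy capacity proxy for a Fourier layer with weights of shape
    (in_ch, out_ch, modes1, modes2): take the Frobenius norm of the
    (in_ch x out_ch) matrix at each retained mode (inner norm), then
    sum over modes (outer norm). Illustrative only -- the paper's
    bounds use specific group norms, not necessarily this one."""
    per_mode = torch.linalg.norm(weight.flatten(0, 1), dim=0)  # (modes1, modes2)
    return per_mode.sum()

w = torch.randn(8, 8, 12, 12, dtype=torch.cfloat)
print(spectral_group_norm(w))    # a single real-valued capacity number
```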
arXiv Detail & Related papers (2022-09-12T11:11:43Z)
- Pseudo-Differential Neural Operator: Generalized Fourier Neural Operator for Learning Solution Operators of Partial Differential Equations [14.43135909469058]
We propose a novel pseudo-differential integral operator (PDIO) to analyze and generalize the Fourier integral operator in the FNO.
We experimentally validate the effectiveness of the proposed model on Darcy flow and the Navier-Stokes equation.
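
One way to read this generalization (a sketch under my own assumptions; the PDIO symbol in the paper may also depend on the spatial variable) is to replace the FNO's per-mode weight table with a smooth network over the frequency variable, so the multiplier is defined at every frequency rather than only at the retained modes:

```python
import torch
import torch.nn as nn

class SymbolLayer(nn.Module):
    """Sketch of a pseudo-differential-style layer for 1-D signals:
    instead of a lookup table of complex weights per retained mode
    (as in the FNO), the Fourier multiplier is a smooth network over
    the frequency xi, so it is defined at every frequency and hence
    at every resolution. The symbol network below is an assumed
    stand-in; the paper's PDIO symbol may also depend on x."""

    def __init__(self, hidden=32):
        super().__init__()
        self.symbol = nn.Sequential(
            nn.Linear(1, hidden), nn.GELU(), nn.Linear(hidden, 2)  # Re, Im parts
        )

    def forward(self, u):                              # u: (batch, N) real signal
        u_ft = torch.fft.rfft(u)                       # (batch, N//2 + 1)
        xi = torch.fft.rfftfreq(u.shape[-1], device=u.device).unsqueeze(-1)
        a = self.symbol(xi)                            # (N//2 + 1, 2)
        mult = torch.complex(a[..., 0], a[..., 1])     # learned multiplier a(xi)
        return torch.fft.irfft(u_ft * mult, n=u.shape[-1])
```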
arXiv Detail & Related papers (2022-01-28T07:22:32Z)
- Nonlocal Kernel Network (NKN): a Stable and Resolution-Independent Deep Neural Network [23.465930256410722]
The nonlocal kernel network (NKN) is resolution independent and parameterized by deep neural networks.
It is capable of handling a variety of tasks such as learning governing equations and classifying images.
arXiv Detail & Related papers (2022-01-06T19:19:35Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
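
The prototypical neural operator layer replaces the matrix product of an ordinary layer with a kernel integral, (Ku)(x) = ∫ k(x, y) u(y) dy, discretized by a quadrature sum. The sketch below (the MLP kernel parametrization and uniform quadrature weights are my assumptions) shows why such a layer is independent of the grid:

```python
import torch
import torch.nn as nn

class KernelIntegralLayer(nn.Module):
    """Discretized kernel integral on [0, 1]:
        (K u)(x_i) ~= (1/N) * sum_j k(x_i, x_j) u(x_j),
    with the kernel k given by a small MLP over coordinate pairs.
    Because the sum is a quadrature rule, the same layer is defined
    on any discretization of the domain, not one fixed grid."""

    def __init__(self, hidden=32):
        super().__init__()
        self.kernel = nn.Sequential(
            nn.Linear(2, hidden), nn.GELU(), nn.Linear(hidden, 1)
        )

    def forward(self, u, grid):                        # u, grid: (N,)
        n = grid.shape[0]
        xi = grid.unsqueeze(1).expand(n, n)            # rows: evaluation points x_i
        xj = grid.unsqueeze(0).expand(n, n)            # cols: quadrature points x_j
        k = self.kernel(torch.stack([xi, xj], dim=-1)).squeeze(-1)   # (N, N)
        return (k @ u) / n                             # uniform quadrature weights

layer = KernelIntegralLayer()
for n in (50, 200):                                    # same weights, two resolutions
    grid = torch.linspace(0.0, 1.0, n)
    print(layer(torch.sin(2 * torch.pi * grid), grid).shape)
```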
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each one associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
arXiv Detail & Related papers (2020-10-22T09:10:16Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
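
To illustrate the flavor of the min-max formulation (a toy sketch with my own choice of data, players, and step sizes; not the paper's estimator), the following gradient-descent-ascent loop recovers a linear coefficient from the moment condition E[wX(Y - θX)] = 0 with a linear critic:

```python
import torch

torch.manual_seed(0)
X = torch.randn(2000)
Y = 3.0 * X + 0.1 * torch.randn(2000)          # ground-truth coefficient: 3.0

theta = torch.zeros(1, requires_grad=True)     # learner (minimizer)
w = torch.zeros(1, requires_grad=True)         # critic  (maximizer)
opt_min = torch.optim.SGD([theta], lr=0.05)
opt_max = torch.optim.SGD([w], lr=0.05)

def game():
    # critic is rewarded for exposing violations of E[wX(Y - theta X)] = 0;
    # the quadratic penalty keeps the inner maximization well posed
    residual = Y - theta * X
    return (w * X * residual).mean() - 0.5 * (w * X).pow(2).mean()

for _ in range(500):
    opt_max.zero_grad(); (-game()).backward(); opt_max.step()   # ascent in w
    opt_min.zero_grad(); game().backward(); opt_min.step()      # descent in theta

print(theta.item())                            # converges to roughly 3.0
```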
arXiv Detail & Related papers (2020-07-02T17:55:47Z)