Can neural operators always be continuously discretized?
- URL: http://arxiv.org/abs/2412.03393v1
- Date: Wed, 04 Dec 2024 15:22:54 GMT
- Title: Can neural operators always be continuously discretized?
- Authors: Takashi Furuya, Michael Puthawala, Maarten V. de Hoop, Matti Lassas,
- Abstract summary: We consider the problem of discretization of neural operators between Hilbert spaces in a general framework including skip connections.
We show that bilipschitz neural operators may always be written in the form of an alternating composition of strongly monotone neural operators.
We also show that neural operators of this type may be approximated through the composition of finite-rank residual neural operators.
- Score: 7.37972671531752
- License:
- Abstract: We consider the problem of discretization of neural operators between Hilbert spaces in a general framework including skip connections. We focus on bijective neural operators through the lens of diffeomorphisms in infinite dimensions. Framed using category theory, we give a no-go theorem that shows that diffeomorphisms between Hilbert spaces or Hilbert manifolds may not admit any continuous approximations by diffeomorphisms on finite-dimensional spaces, even if the approximations are nonlinear. The natural way out is the introduction of strongly monotone diffeomorphisms and layerwise strongly monotone neural operators which have continuous approximations by strongly monotone diffeomorphisms on finite-dimensional spaces. For these, one can guarantee discretization invariance, while ensuring that finite-dimensional approximations converge not only as sequences of functions, but that their representations converge in a suitable sense as well. Finally, we show that bilipschitz neural operators may always be written in the form of an alternating composition of strongly monotone neural operators, plus a simple isometry. Thus we realize a rigorous platform for discretization of a generalization of a neural operator. We also show that neural operators of this type may be approximated through the composition of finite-rank residual neural operators, where each block is strongly monotone, and may be inverted locally via iteration. We conclude by providing a quantitative approximation result for the discretization of general bilipschitz neural operators.
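For orientation, the two properties the abstract leans on admit standard Hilbert-space formulations; the notation below is ours and is only a schematic reading of the statements, not taken from the paper.

```latex
% Strong monotonicity: F : H -> H is strongly monotone with constant c > 0 if
\[
  \langle F(u) - F(v),\, u - v \rangle_{H} \;\ge\; c\, \| u - v \|_{H}^{2}
  \qquad \text{for all } u, v \in H .
\]
% Bilipschitz: F is bilipschitz with constant L >= 1 if
\[
  L^{-1}\, \| u - v \|_{H} \;\le\; \| F(u) - F(v) \|_{H} \;\le\; L\, \| u - v \|_{H} .
\]
% Schematically, the decomposition result states
% F = Q \circ F_{k} \circ \cdots \circ F_{1},
% with each F_{i} strongly monotone and Q a simple isometry.
```

The abstract also notes that the strongly monotone residual blocks may be inverted locally via iteration. Below is a minimal sketch of that idea under a stronger assumption than the paper uses, namely that after discretization to R^n the residual map is a contraction; all names and constants are illustrative.

```python
import numpy as np

def invert_residual_block(g, y, x0=None, tol=1e-10, max_iter=200):
    """Solve F(x) = x + g(x) = y by the fixed-point iteration x <- y - g(x).

    Assumes g is a contraction (Lipschitz constant q < 1), which also makes
    F strongly monotone with constant 1 - q; the iteration then converges
    to the unique preimage of y.
    """
    x = np.zeros_like(y) if x0 is None else x0
    for _ in range(max_iter):
        x_next = y - g(x)
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Illustrative residual map with spectral norm 0.4 < 1, so g is a contraction.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A *= 0.4 / np.linalg.norm(A, 2)
g = lambda x: np.tanh(A @ x)        # Lipschitz constant <= 0.4

x_true = rng.standard_normal(5)
y = x_true + g(x_true)              # forward pass F(x_true)
x_rec = invert_residual_block(g, y)
print(np.allclose(x_rec, x_true))   # True: the block is inverted by iteration
```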
Related papers
- Convergence analysis of wide shallow neural operators within the framework of Neural Tangent Kernel [4.313136216120379]
We conduct a convergence analysis of gradient descent for wide shallow neural operators and physics-informed shallow neural operators within the framework of the Neural Tangent Kernel (NTK).
In the over-parametrized setting, gradient descent finds the global minimum in both continuous and discrete time.
arXiv Detail & Related papers (2024-12-07T05:47:28Z) - Interpolating between Rényi entanglement entropies for arbitrary bipartitions via operator geometric means [0.0]
We introduce a new construction of subadditive and submultiplicative monotones in terms of a regularized Rényi divergence.
We show that they can be combined in a nontrivial way using weighted operator geometric means.
In addition, we find lower bounds on the new functionals that are superadditive and supermultiplicative.
arXiv Detail & Related papers (2022-08-30T17:56:53Z) - Neural and spectral operator surrogates: unified construction and expression rate bounds [0.46040036610482665]
We study approximation rates for deep surrogates of maps between infinite-dimensional function spaces.
Operator in- and outputs from function spaces are assumed to be parametrized by stable, affine representation systems.
arXiv Detail & Related papers (2022-07-11T15:35:14Z) - Algebraic function based Banach space valued ordinary and fractional neural network approximations [0.0]
The approximations are pointwise and in the uniform norm.
The related Banach space valued feed-forward neural networks are with one hidden layer.
arXiv Detail & Related papers (2022-02-11T20:08:52Z) - Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non-commutative convolutional neural networks.
We show that non-commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z) - Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z) - The Separation Capacity of Random Neural Networks [78.25060223808936]
We show that a sufficiently large two-layer ReLU network with standard Gaussian weights and uniformly distributed biases can solve this separation problem with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z) - Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators [72.62940905965267]
Invertible neural networks based on coupling flows (CF-INNs) have various machine learning applications such as image synthesis and representation learning.
Are CF-INNs universal approximators for invertible functions?
We prove a general theorem to show the equivalence of the universality for certain diffeomorphism classes.
arXiv Detail & Related papers (2020-06-20T02:07:37Z) - Models of zero-range interaction for the bosonic trimer at unitarity [91.3755431537592]
We present the construction of quantum Hamiltonians for a three-body system consisting of identical bosons mutually coupled by a two-body interaction of zero range.
For a large part of the presentation, infinite scattering length will be considered.
arXiv Detail & Related papers (2020-06-03T17:54:43Z) - Quantum Geometric Confinement and Dynamical Transmission in Grushin Cylinder [68.8204255655161]
We classify the self-adjoint realisations of the Laplace-Beltrami operator minimally defined on an infinite cylinder.
We retrieve those distinguished extensions previously identified in the recent literature, namely the most confining and the most transmitting.
arXiv Detail & Related papers (2020-03-16T11:37:23Z)