Optimal deep learning of holomorphic operators between Banach spaces
- URL: http://arxiv.org/abs/2406.13928v2
- Date: Wed, 30 Oct 2024 15:34:22 GMT
- Title: Optimal deep learning of holomorphic operators between Banach spaces
- Authors: Ben Adcock, Nick Dexter, Sebastian Moraga
- Abstract summary: We tackle the problem of learning operators between Banach spaces, in contrast to the vast majority of past works considering only Hilbert spaces.
We combine arbitrary approximate encoders and decoders with standard feedforward Deep Neural Network (DNN) architectures.
We show that DL is optimal for this problem: no recovery procedure can surpass these generalization bounds up to log terms.
- Score: 0.6554326244334866
- Abstract: Operator learning problems arise in many key areas of scientific computing where Partial Differential Equations (PDEs) are used to model physical systems. In such scenarios, the operators map between Banach or Hilbert spaces. In this work, we tackle the problem of learning operators between Banach spaces, in contrast to the vast majority of past works considering only Hilbert spaces. We focus on learning holomorphic operators - an important class of problems with many applications. We combine arbitrary approximate encoders and decoders with standard feedforward Deep Neural Network (DNN) architectures - specifically, those with constant width exceeding the depth - under standard $\ell^2$-loss minimization. We first identify a family of DNNs such that the resulting Deep Learning (DL) procedure achieves optimal generalization bounds for such operators. For standard fully-connected architectures, we then show that there are uncountably many minimizers of the training problem that yield equivalent optimal performance. The DNN architectures we consider are `problem agnostic', with width and depth only depending on the amount of training data $m$ and not on regularity assumptions of the target operator. Next, we show that DL is optimal for this problem: no recovery procedure can surpass these generalization bounds up to log terms. Finally, we present numerical results demonstrating the practical performance on challenging problems including the parametric diffusion, Navier-Stokes-Brinkman and Boussinesq PDEs.
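A minimal PyTorch sketch of the procedure the abstract describes, under illustrative assumptions (the latent dimensions, the stand-in encoder/decoder, and the synthetic target operator are not the paper's construction): finite-dimensional encodings of the input and output are linked by a fully-connected DNN of constant width exceeding its depth, trained by standard $\ell^2$-loss minimization.
```python
import torch
import torch.nn as nn

# Illustrative sizes (assumptions, not the paper's choices):
# d_in/d_out are encoder/decoder latent dimensions, m the sample count.
d_in, d_out, m = 16, 16, 200
width, depth = 64, 4          # constant width exceeding the depth

# Stand-ins for approximate encoder E_X and decoder D_Y; in practice these
# could be truncated basis projections (e.g. PCA or Fourier coefficients).
encode = lambda u: u[..., :d_in]   # hypothetical: keep leading coefficients
decode = lambda z: z               # hypothetical: identity on coefficients

layers = [nn.Linear(d_in, width), nn.ReLU()]
for _ in range(depth - 1):
    layers += [nn.Linear(width, width), nn.ReLU()]
layers += [nn.Linear(width, d_out)]
dnn = nn.Sequential(*layers)

# Synthetic training pairs (u_i, F(u_i)); F is a placeholder smooth map.
U = torch.randn(m, 32)
Y = torch.tanh(U[..., :d_out])     # hypothetical target operator values

opt = torch.optim.Adam(dnn.parameters(), lr=1e-3)
for step in range(1000):
    opt.zero_grad()
    pred = decode(dnn(encode(U)))
    loss = ((pred - Y) ** 2).mean()   # standard l^2-loss minimization
    loss.backward()
    opt.step()
```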
Related papers
- Physics-Informed Deep Inverse Operator Networks for Solving PDE Inverse Problems [1.9490282165104331]
Inverse problems involving partial differential equations (PDEs) can be seen as discovering a mapping from measurement data to unknown quantities.
Existing methods typically rely on large amounts of labeled training data, which is impractical for most real-world applications.
We propose a novel architecture called Physics-Informed Deep Inverse Operator Networks (PI-DIONs) which can learn the solution operator of PDE-based inverse problems without labeled training data.
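A minimal sketch of the label-free, physics-informed ingredient highlighted above (not the PI-DION architecture itself; the PDE, network sizes, and collocation scheme are assumptions): autograd penalizes a PDE residual in place of labeled solution data.
```python
import torch

# Example PDE (an assumption for illustration): -u''(x) = f(x) on (0, 1),
# u(0) = u(1) = 0, with f(x) = pi^2 sin(pi x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    f = torch.pi ** 2 * torch.sin(torch.pi * x)
    return -d2u - f

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    x_int = torch.rand(128, 1)              # interior collocation points
    x_bnd = torch.tensor([[0.0], [1.0]])    # boundary points
    # Residual loss + boundary loss: no labeled (input, solution) pairs.
    loss = (pde_residual(x_int) ** 2).mean() + (net(x_bnd) ** 2).mean()
    loss.backward()
    opt.step()
```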
arXiv Detail & Related papers (2024-12-04T09:38:58Z)
- Learning Partial Differential Equations with Deep Parallel Neural Operator [11.121415128908566]
A novel methodology is to learn an operator as a means of approximating the mapping between input and output function spaces.
In practical physical science problems, the numerical solutions of partial differential equations are complex.
We propose a deep parallel operator model (DPNO) for efficiently and accurately solving partial differential equations.
arXiv Detail & Related papers (2024-09-30T06:04:04Z)
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
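The CNN-scaling claim can be made concrete with a standard numerical-analysis example (a fixed 5-point stencil rather than a learned kernel; the grid size is an assumption): scaling a 3x3 convolution kernel by $1/h^2$ yields a discrete Laplacian.
```python
import torch
import torch.nn.functional as F

n = 64
h = 2 * torch.pi / n
x = torch.arange(n) * h
X, Y = torch.meshgrid(x, x, indexing="ij")
u = torch.sin(X) * torch.sin(Y)        # test function, Laplacian = -2u

# 5-point stencil scaled by 1/h^2: a convolution acting as a differential
# operator; a learned CNN kernel under the same scaling can do likewise.
stencil = torch.tensor([[0., 1., 0.],
                        [1., -4., 1.],
                        [0., 1., 0.]]) / h**2
lap_u = F.conv2d(u[None, None], stencil[None, None], padding=1)[0, 0]

# Away from the (non-periodic) boundary, lap_u approximates -2*u:
err = (lap_u[2:-2, 2:-2] + 2 * u[2:-2, 2:-2]).abs().max()
print(f"max interior error: {err:.2e}")
```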
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
- GIT-Net: Generalized Integral Transform for Operator Learning [58.13313857603536]
This article introduces GIT-Net, a deep neural network architecture for approximating Partial Differential Equation (PDE) operators.
GIT-Net harnesses the fact that differential operators commonly used for defining PDEs can often be represented parsimoniously when expressed in specialized functional bases.
Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting small test errors and low evaluation costs across a range of PDE problems.
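A generic sketch of the basis-representation idea (not GIT-Net's actual transform; the Fourier basis, mode count, and network sizes are assumptions): once functions are expressed by a few coefficients in a suitable basis, the operator reduces to a small map between coefficient vectors.
```python
import torch

n, k = 256, 16        # grid size and number of retained modes (assumptions)

def to_coeffs(u):     # grid values -> leading Fourier coefficients
    return torch.fft.rfft(u, dim=-1)[..., :k]

def to_grid(c):       # coefficients -> grid values
    full = torch.zeros(*c.shape[:-1], n // 2 + 1, dtype=torch.cfloat)
    full[..., :k] = c
    return torch.fft.irfft(full, n=n, dim=-1)

# The learned map acts on 2k real numbers (Re/Im parts), not n grid points.
net = torch.nn.Sequential(
    torch.nn.Linear(2 * k, 64), torch.nn.GELU(),
    torch.nn.Linear(64, 2 * k),
)

u = torch.randn(8, n)                      # batch of input functions
c = to_coeffs(u)
out = net(torch.cat([c.real, c.imag], dim=-1))
v = to_grid(torch.complex(out[..., :k], out[..., k:]))
print(v.shape)                             # torch.Size([8, 256])
```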
arXiv Detail & Related papers (2023-12-05T03:03:54Z)
- Multi-Grid Tensorized Fourier Neural Operator for High-Resolution PDEs [93.82811501035569]
We introduce MG-TFNO, a new data-efficient and highly parallelizable operator learning approach with reduced memory requirements and better generalization.
MG-TFNO scales to large resolutions by leveraging local and global structures of full-scale, real-world phenomena.
We demonstrate superior performance on the turbulent Navier-Stokes equations where we achieve less than half the error with over 150x compression.
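A toy version of the tensorization idea (not MG-TFNO's exact Tucker factorization; all sizes are assumptions): storing the per-mode spectral weights of a Fourier layer in CP-factorized form cuts parameters roughly by the rank ratio.
```python
import torch

modes, ch, rank = 32, 64, 8          # illustrative sizes

# Dense storage: modes*ch*ch complex entries. Factorized: (modes + 2*ch)*rank.
A = torch.randn(modes, rank, dtype=torch.cfloat) * 0.02
B = torch.randn(ch, rank, dtype=torch.cfloat) * 0.02
C = torch.randn(ch, rank, dtype=torch.cfloat) * 0.02

def spectral_multiply(v_hat):
    # v_hat: (batch, ch, modes) retained Fourier coefficients.
    # Equivalent to contracting with the dense tensor
    # W[m, i, o] = sum_r A[m, r] B[i, r] C[o, r], never materialized.
    return torch.einsum("bim,mr,ir,or->bom", v_hat, A, B, C)

v_hat = torch.randn(4, ch, modes, dtype=torch.cfloat)
out = spectral_multiply(v_hat)
dense = modes * ch * ch
factored = (modes + 2 * ch) * rank
print(out.shape, f"compression: {dense / factored:.0f}x")
```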
arXiv Detail & Related papers (2023-09-29T20:18:52Z)
- Learning Only On Boundaries: a Physics-Informed Neural Operator for Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments show the effectiveness of the method on parametrized complex geometries and unbounded problems.
arXiv Detail & Related papers (2023-08-24T17:29:57Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
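A loose, generic sketch of the "operate in a latent spectral space" pattern the summary describes (LSM's actual blocks differ; the fixed sine basis and all sizes are assumptions): encode the field into a few latent coefficients, transform them with a small network, and decode.
```python
import torch

n, k = 128, 12                 # grid points and latent basis size (assumptions)
x = torch.linspace(0, 1, n)
# Fixed sine basis as a stand-in for a learned spectral basis, shape (k, n).
basis = torch.stack([torch.sin(torch.pi * (j + 1) * x) for j in range(k)])

encoder = lambda u: u @ basis.T * (2 / n)   # field -> spectral coefficients
decoder = lambda c: c @ basis               # coefficients -> field
mixer = torch.nn.Sequential(                # acts only on k coefficients
    torch.nn.Linear(k, 64), torch.nn.GELU(), torch.nn.Linear(64, k),
)

u = torch.randn(8, n)                # batch of input fields
v = decoder(mixer(encoder(u)))       # (8, n) output fields
print(v.shape)
```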
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Deep Operator Learning Lessens the Curse of Dimensionality for PDEs [11.181533339111853]
This paper provides an estimate for the generalization error of learning Lipschitz operators over Banach spaces using DNNs with applications to various PDE solution operators.
Under mild assumptions on data distributions or operator structures, our analysis shows that deep operator learning can have a relaxed dependence on the discretization resolution of PDEs.
arXiv Detail & Related papers (2023-01-28T15:35:52Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
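A minimal kernel-integral layer of the kind neural operators are built from, $v \mapsto \sigma(Wv + \int \kappa(x, y)\,v(y)\,dy)$, with the integral approximated by an average over the sample points (the MLP kernel parameterization and all sizes here are illustrative assumptions).
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KernelIntegralLayer(nn.Module):
    def __init__(self, channels, d=1):
        super().__init__()
        self.W = nn.Linear(channels, channels)
        # kappa: (x, y) -> channels x channels matrix, given by a small MLP.
        self.kappa = nn.Sequential(
            nn.Linear(2 * d, 64), nn.GELU(),
            nn.Linear(64, channels * channels),
        )
        self.channels = channels

    def forward(self, x, v):
        # x: (n, d) sample points, v: (n, channels) function values.
        n = x.shape[0]
        pairs = torch.cat(
            [x[:, None, :].expand(n, n, -1), x[None, :, :].expand(n, n, -1)],
            dim=-1,
        )
        K = self.kappa(pairs).view(n, n, self.channels, self.channels)
        integral = torch.einsum("xyio,yi->xo", K, v) / n  # quadrature average
        return F.gelu(self.W(v) + integral)

layer = KernelIntegralLayer(channels=8)
x = torch.rand(50, 1)
v = torch.randn(50, 8)
print(layer(x, v).shape)     # torch.Size([50, 8])
```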
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data [6.098254376499899]
We focus on approximating functions that are Hilbert-valued, i.e., take values in a separable, but typically infinite-dimensional, Hilbert space.
We present a novel result on DNN training for holomorphic functions with so-called hidden anisotropy.
We show that there exists a procedure for learning Hilbert-valued functions via DNNs that performs as well as, but no better than, current best-in-class schemes.
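One hedged way to make a Hilbert-valued learning problem concrete (the basis truncation, synthetic target, and sizes below are assumptions, not the paper's setup): represent $u(y)$ by its first $K$ coefficients in an orthonormal basis of the Hilbert space, so an $\ell^2$-loss on the coefficients matches the Hilbert norm by Parseval's identity.
```python
import torch

d, K, m = 4, 20, 500                  # parameter dim, modes, samples (assumptions)

net = torch.nn.Sequential(
    torch.nn.Linear(d, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, K),
)

Y = torch.rand(m, d)                  # parameter samples
k = torch.arange(1, K + 1)
# Hypothetical smooth target: coefficients decaying with the mode index.
C = torch.sin(Y @ torch.randn(d, K)) / k**2

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    # Squared Hilbert norm of the error = squared l^2 norm of coefficients.
    loss = ((net(Y) - C) ** 2).sum(dim=1).mean()
    loss.backward()
    opt.step()
```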
arXiv Detail & Related papers (2020-12-11T02:02:14Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster than traditional PDE solvers.
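The core spectral convolution this summary refers to, sketched in one dimension (channel counts and the mode cutoff are illustrative): transform to Fourier space, multiply the lowest modes by learned complex weights, and transform back.
```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        # Learned complex weights for the retained low frequencies.
        self.weight = nn.Parameter(
            0.02 * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, v):                      # v: (batch, channels, n)
        v_hat = torch.fft.rfft(v, dim=-1)      # (batch, channels, n//2 + 1)
        out = torch.zeros_like(v_hat)
        out[..., :self.modes] = torch.einsum(
            "bim,iom->bom", v_hat[..., :self.modes], self.weight
        )
        return torch.fft.irfft(out, n=v.shape[-1], dim=-1)

layer = SpectralConv1d(channels=8, modes=16)
v = torch.randn(4, 8, 128)
print(layer(v).shape)                          # torch.Size([4, 8, 128])
```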
arXiv Detail & Related papers (2020-10-18T00:34:21Z)