Space-time deep neural network approximations for high-dimensional partial differential equations
- URL: http://arxiv.org/abs/2006.02199v2
- Date: Mon, 3 Jun 2024 15:54:30 GMT
- Title: Space-time deep neural network approximations for high-dimensional partial differential equations
- Authors: Fabian Hornung, Arnulf Jentzen, Diyora Salimova
- Abstract summary: Deep learning approximations might have the capacity to overcome the curse of dimensionality.
This article proves for every $a\in\mathbb{R}$, $b\in(a,\infty)$ that solutions of certain Kolmogorov PDEs can be approximated by DNNs without the curse of dimensionality.
- Score: 3.6185342807265415
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: It is one of the most challenging issues in applied mathematics to approximately solve high-dimensional partial differential equations (PDEs) and most of the numerical approximation methods for PDEs in the scientific literature suffer from the so-called curse of dimensionality in the sense that the number of computational operations employed in the corresponding approximation scheme to obtain an approximation precision $\varepsilon>0$ grows exponentially in the PDE dimension and/or the reciprocal of $\varepsilon$. Recently, certain deep learning based approximation methods for PDEs have been proposed and various numerical simulations for such methods suggest that deep neural network (DNN) approximations might have the capacity to indeed overcome the curse of dimensionality in the sense that the number of real parameters used to describe the approximating DNNs grows at most polynomially in both the PDE dimension $d\in\mathbb{N}$ and the reciprocal of the prescribed accuracy $\varepsilon>0$. There are now also a few rigorous results in the scientific literature which substantiate this conjecture by proving that DNNs overcome the curse of dimensionality in approximating solutions of PDEs. Each of these results establishes that DNNs overcome the curse of dimensionality in approximating suitable PDE solutions at a fixed time point $T>0$ and on a compact cube $[a,b]^d$ in space but none of these results provides an answer to the question whether the entire PDE solution on $[0,T]\times [a,b]^d$ can be approximated by DNNs without the curse of dimensionality. It is precisely the subject of this article to overcome this issue. More specifically, the main result of this work in particular proves for every $a\in\mathbb{R}$, $ b\in (a,\infty)$ that solutions of certain Kolmogorov PDEs can be approximated by DNNs on the space-time region $[0,T]\times [a,b]^d$ without the curse of dimensionality.
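For orientation, the main result can be stated schematically as follows. This is a simplified paraphrase of the statement in the abstract: the uniform norm, the constants $c,q$, and the notation $\operatorname{Params}(\phi)$ for the number of real DNN parameters are chosen here for illustration and are not taken verbatim from the paper's theorem.
```latex
% Schematic statement: u_d denotes the solution of the d-dimensional
% Kolmogorov PDE, Params(phi) the number of real parameters of the DNN phi.
\exists\, c,q \in (0,\infty)\ \forall\, d \in \mathbb{N},\
\varepsilon \in (0,1]\ \exists\, \text{DNN } \phi_{d,\varepsilon}:\quad
\operatorname{Params}(\phi_{d,\varepsilon}) \le c\, d^{q} \varepsilon^{-q}
\quad\text{and}\quad
\sup_{(t,x)\in[0,T]\times[a,b]^d}
\bigl| u_d(t,x) - \phi_{d,\varepsilon}(t,x) \bigr| \le \varepsilon .
```
Polynomial growth of the parameter count in both $d$ and $\varepsilon^{-1}$, rather than exponential growth, is precisely what "without the curse of dimensionality" means in the abstract; the novelty here is that the error bound holds on the whole space-time region $[0,T]\times[a,b]^d$ rather than at a single time $T$.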
Related papers
- Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for space-time solutions of semilinear partial differential equations [3.3123773366516645]
It is a challenging topic in applied mathematics to solve high-dimensional nonlinear partial differential equations (PDEs).
Deep learning (DL) based methods, in which deep neural networks (DNNs) are used to approximate solutions of PDEs, are presented.
arXiv Detail & Related papers (2024-06-16T09:59:29Z) - Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver), capable of solving a wide range of PDEs.
Our key finding is that a PDE solution is fundamentally under the control of a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for Kolmogorov partial differential equations with Lipschitz nonlinearities in the $L^p$-sense [3.3123773366516645]
We show that deep neural networks (DNNs) have the expressive power to approximate PDE solutions without the curse of dimensionality (COD).
It is the key contribution of this work to generalize this result by establishing this statement in the $L^p$-sense with $p\in(0,\infty)$.
arXiv Detail & Related papers (2023-09-24T18:58:18Z) - Tackling the Curse of Dimensionality with Physics-Informed Neural Networks [24.86574584293979]
We develop a new method of scaling up physics-informed neural networks (PINNs) to solve arbitrary high-dimensional PDEs.
We demonstrate in various tests that the proposed method can solve many notoriously hard high-dimensional PDEs (a generic PINN sketch appears after this list).
arXiv Detail & Related papers (2023-07-23T12:18:12Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields an average relative gain of 11.5% on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine temporal resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z) - Neural Q-learning for solving PDEs [0.0]
We develop a new numerical method for solving elliptic-type PDEs by adapting the Q-learning algorithm in reinforcement learning.
Our "Q-PDE" algorithm is mesh-free and therefore has the potential to overcome the curse of dimensionality.
The numerical performance of the Q-PDE algorithm is studied for several elliptic PDEs.
arXiv Detail & Related papers (2022-03-31T15:52:44Z) - Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that partially alleviates the need for large training datasets by improving the sample complexity of neural PDE solvers.
In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z) - On the Representation of Solutions to Elliptic PDEs in Barron Spaces [9.875204185976777]
This paper derives complexity estimates of the solutions of $d$-dimensional second-order elliptic PDEs in the Barron space.
As a direct consequence of the complexity estimates, the solution of the PDE can be approximated on any bounded domain by a two-layer neural network.
arXiv Detail & Related papers (2021-06-14T16:05:07Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
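Several entries above (the DL-based methods in the first entry and the PINN scaling paper in particular) rest on the same basic construction: a network $u_\theta(t,x)$ is trained so that the PDE residual vanishes at sampled collocation points. Below is a minimal, generic sketch of this idea, using PyTorch and a 1D heat equation as stand-ins; the architecture, hyperparameters, and the omitted boundary terms are illustrative assumptions, not the method of any specific paper listed here.
```python
# Minimal generic PINN sketch: train u_theta(t, x) so that the residual
# u_t - u_xx of the 1D heat equation vanishes at random collocation points.
# Boundary conditions are omitted for brevity; illustrative only.
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(t, x):
    """Residual u_t - u_xx of the heat equation at collocation points (t, x)."""
    u = net(torch.cat([t, x], dim=1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - u_xx

for step in range(1000):
    # interior collocation points: t in [0, 1], x in [-1, 1]
    t = torch.rand(256, 1, requires_grad=True)
    x = torch.rand(256, 1, requires_grad=True) * 2 - 1
    # initial-condition points: enforce u(0, x) = sin(pi * x) as a toy example
    x0 = torch.rand(256, 1) * 2 - 1
    u0 = net(torch.cat([torch.zeros_like(x0), x0], dim=1))
    loss = (pde_residual(t, x).pow(2).mean()
            + (u0 - torch.sin(torch.pi * x0)).pow(2).mean())
    opt.zero_grad()
    loss.backward()
    opt.step()
```
Note the design choice implicit in this construction: the network is queried at arbitrary points rather than on a grid, which is why such methods are mesh-free and why they are candidates for overcoming the curse of dimensionality discussed in the main paper.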
This list is automatically generated from the titles and abstracts of the papers in this site.