Deep neural network approximation for high-dimensional elliptic PDEs
with boundary conditions
- URL: http://arxiv.org/abs/2007.05384v2
- Date: Mon, 17 Aug 2020 11:53:24 GMT
- Title: Deep neural network approximation for high-dimensional elliptic PDEs
with boundary conditions
- Authors: Philipp Grohs and Lukas Herrmann
- Abstract summary: Deep neural networks are capable of approximating solutions to a class of parabolic partial differential equations without incurring the curse of dimension.
The present paper considers an important such model problem, namely the Poisson equation on a domain $D\subset \mathbb{R}^d$ subject to Dirichlet boundary conditions.
It is shown that deep neural networks are capable of representing solutions of that problem without incurring the curse of dimension.
- Score: 6.079011829257036
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent work it has been established that deep neural networks are capable
of approximating solutions to a large class of parabolic partial differential
equations without incurring the curse of dimension. However, all this work has
been restricted to problems formulated on the whole Euclidean domain. On the
other hand, most problems in engineering and the sciences are formulated on
finite domains and subjected to boundary conditions. The present paper
considers an important such model problem, namely the Poisson equation on a
domain $D\subset \mathbb{R}^d$ subject to Dirichlet boundary conditions. It is
shown that deep neural networks are capable of representing solutions of that
problem without incurring the curse of dimension. The proofs are based on a
probabilistic representation of the solution to the Poisson equation as well as
a suitable sampling method.
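The probabilistic representation underlying such proofs expresses the solution at a point as an expectation over Brownian paths stopped at the boundary. As an illustrative sketch only (the paper's actual construction differs, and all names below are chosen for this example), the classical walk-on-spheres method estimates the solution of the Laplace equation with Dirichlet data by repeatedly jumping to a uniform point on the largest sphere inscribed in the domain:

```python
import numpy as np

def walk_on_spheres(x0, g, dist, n_walks=5000, eps=1e-3, seed=0):
    """Monte Carlo estimate of u(x0) for Delta u = 0 on D, u = g on the
    boundary. dist(x) returns the distance from x to the boundary of D."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    total = 0.0
    for _ in range(n_walks):
        x = np.array(x0, dtype=float)
        r = dist(x)
        while r > eps:
            # Brownian motion exits a centered ball uniformly, so jump to a
            # uniformly random point on the sphere of radius r around x.
            v = rng.standard_normal(d)
            x = x + r * v / np.linalg.norm(v)
            r = dist(x)
        total += g(x)  # x is now within eps of the boundary
    return total / n_walks

# Unit ball in R^10; g(x) = x_0 is itself harmonic, so u(x) = x_0 exactly.
d = 10
x0 = np.zeros(d)
x0[0] = 0.3
est = walk_on_spheres(
    x0,
    g=lambda x: x[0],
    dist=lambda x: 1.0 - np.linalg.norm(x),
)
```

The per-sample cost grows only polynomially in $d$, which is the informal reason Monte Carlo representations of this kind can sidestep the curse of dimension that grid-based solvers suffer from.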
Related papers
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
arXiv Detail & Related papers (2024-06-05T17:59:22Z) - Optimizing Solution-Samplers for Combinatorial Problems: The Landscape
of Policy-Gradient Methods [52.0617030129699]
We introduce a novel theoretical framework for analyzing the effectiveness of DeepMatching Networks and Reinforcement Learning methods.
Our main contribution holds for a broad class of problems including Max- and Min-Cut, Max-$k$-Bipartite-Bi, Maximum-Weight-Bipartite-Bi, and the Traveling Salesman Problem.
As a byproduct of our analysis we introduce a novel regularization process over vanilla descent and provide theoretical and experimental evidence that it helps address vanishing-gradient issues and escape bad stationary points.
arXiv Detail & Related papers (2023-10-08T23:39:38Z) - A Stable and Scalable Method for Solving Initial Value PDEs with Neural
Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z) - Neural PDE Solvers for Irregular Domains [25.673617202478606]
We present a framework to neurally solve partial differential equations over domains with irregularly shaped geometric boundaries.
Our network takes in the shape of the domain as an input and is able to generalize to novel (unseen) irregular domains.
arXiv Detail & Related papers (2022-11-07T00:00:30Z) - Deep NURBS -- Admissible Physics-informed Neural Networks [0.0]
We propose a new numerical scheme for physics-informed neural networks (PINNs) that enables precise and inexpensive solutions of partial differential equations (PDEs).
The proposed approach combines admissible NURBS parametrizations required to define the physical domain and the Dirichlet boundary conditions with a PINN solver.
arXiv Detail & Related papers (2022-10-25T10:35:45Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Solving PDEs on Unknown Manifolds with Machine Learning [8.220217498103315]
This paper presents a mesh-free computational framework and machine learning theory for solving elliptic PDEs on unknown manifolds.
We show that the proposed NN solver can robustly generalize the PDE solution to new data points, with errors comparable to those on the training points.
arXiv Detail & Related papers (2021-06-12T03:55:15Z) - Exact imposition of boundary conditions with distance functions in
physics-informed deep neural networks [0.5804039129951741]
We introduce geometry-aware trial functions in artificial neural networks to improve training in deep learning for partial differential equations.
To exactly impose homogeneous Dirichlet boundary conditions, the trial function is taken as $\phi$ multiplied by the PINN approximation, where $\phi$ is an approximate distance function to the boundary.
We present numerical solutions for linear and nonlinear boundary-value problems over domains with affine and curved boundaries.
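The $\phi$-multiplied trial function described in this summary can be sketched in a few lines; the one-dimensional domain, the tiny network, and its random weights below are hypothetical stand-ins chosen purely for illustration:

```python
import numpy as np

def phi(x):
    # Approximate distance function to the boundary of the interval [0, 1];
    # phi vanishes exactly at x = 0 and x = 1.
    return x * (1.0 - x)

def tiny_net(x, params):
    # Stand-in for a PINN: a one-hidden-layer tanh network with made-up weights.
    w1, b1, w2, b2 = params
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def trial(x, params):
    # Trial function phi * N: it satisfies u(0) = u(1) = 0 for ANY parameters,
    # so the homogeneous Dirichlet condition is imposed exactly, not penalized.
    return phi(x) * tiny_net(x, params)

rng = np.random.default_rng(0)
params = (rng.normal(size=8), rng.normal(size=8), rng.normal(size=8), rng.normal())
xs = np.array([0.0, 0.5, 1.0])
vals = trial(xs, params)
```

Because the boundary condition holds by construction, training can optimize the PDE residual alone instead of balancing it against a boundary-penalty term.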
arXiv Detail & Related papers (2021-04-17T03:02:52Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z) - Parametric Complexity Bounds for Approximating PDEs with Neural Networks [41.46028070204925]
We prove that when a PDE's coefficients are representable by small neural networks, the parameters required to approximate its solution scale polynomially with the input dimension $d$ and are proportional to the parameter counts of the neural networks.
Our proof is based on constructing a neural network which simulates gradient descent in an appropriate space which converges to the solution of the PDE.
arXiv Detail & Related papers (2021-03-03T02:42:57Z) - Convex Geometry and Duality of Over-parameterized Neural Networks [70.15611146583068]
We develop a convex analytic approach to analyze finite width two-layer ReLU networks.
We show that an optimal solution to the regularized training problem can be characterized as extreme points of a convex set.
In higher dimensions, we show that the training problem can be cast as a finite dimensional convex problem with infinitely many constraints.
arXiv Detail & Related papers (2020-02-25T23:05:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.