A deep branching solver for fully nonlinear partial differential
equations
- URL: http://arxiv.org/abs/2203.03234v2
- Date: Sat, 9 Sep 2023 02:12:19 GMT
- Title: A deep branching solver for fully nonlinear partial differential
equations
- Authors: Jiang Yu Nguwi, Guillaume Penent, and Nicolas Privault
- Abstract summary: We present a multidimensional deep learning implementation of a branching algorithm for the numerical solution of fully nonlinear PDEs.
This approach is designed to tackle functional nonlinearities involving gradient terms of any order.
- Score: 0.1474723404975345
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a multidimensional deep learning implementation of a stochastic
branching algorithm for the numerical solution of fully nonlinear PDEs. This
approach is designed to tackle functional nonlinearities involving gradient
terms of any order, by combining the use of neural networks with a Monte Carlo
branching algorithm. In comparison with other deep learning PDE solvers, it
also allows us to check the consistency of the learned neural network function.
The numerical experiments presented show that this algorithm can outperform deep
learning approaches based on backward stochastic differential equations or the
Galerkin method, and provide solution estimates that are not obtained by those
methods in fully nonlinear examples.
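As a hedged illustration of the branching-plus-regression idea (not the authors' implementation), the sketch below applies the classical McKean branching representation to the toy PDE du/dt + 0.5*Laplacian(u) + u^2 - u = 0 with terminal condition g, and fits a neural network to the Monte Carlo branching estimates. All constants, the terminal condition, and the network are assumptions; the paper's solver additionally handles gradient nonlinearities of any order, which this sketch does not.

```python
# Hedged sketch: McKean branching Monte Carlo + neural-network regression for the
# toy PDE  du/dt + 0.5*Laplacian(u) + u^2 - u = 0,  u(T, x) = g(x).
# The paper's solver handles gradient nonlinearities of any order; this toy does not.
import numpy as np
import torch

d, T = 2, 0.5                                  # dimension and horizon (illustrative)
g = lambda x: np.exp(-np.sum(x ** 2))          # terminal condition (assumed for the example)

def branching_estimate(t, x, rng):
    """One unbiased Monte Carlo sample of u(t, x) via binary branching Brownian motion."""
    tau = rng.exponential(1.0)                 # Exp(1) lifetime of the particle
    if t + tau >= T:                           # particle survives to the horizon
        return g(x + rng.normal(size=d) * np.sqrt(T - t))
    y = x + rng.normal(size=d) * np.sqrt(tau)  # position at the branching time
    return (branching_estimate(t + tau, y, rng)
            * branching_estimate(t + tau, y, rng))   # product over the two offspring

# Branching estimates at random space-time points serve as regression labels.
rng = np.random.default_rng(0)
ts = rng.uniform(0.0, T, size=2048)
xs = rng.normal(size=(2048, d))
ys = np.array([branching_estimate(t, x, rng) for t, x in zip(ts, xs)])

# Fit u_theta(t, x) to the Monte Carlo labels by least squares.
net = torch.nn.Sequential(torch.nn.Linear(d + 1, 64), torch.nn.Tanh(),
                          torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
inp = torch.tensor(np.column_stack([ts, xs]), dtype=torch.float32)
lab = torch.tensor(ys, dtype=torch.float32).unsqueeze(1)
for _ in range(500):
    opt.zero_grad()
    loss = torch.mean((net(inp) - lab) ** 2)
    loss.backward()
    opt.step()
```

Because the branching samples are unbiased, the fitted network can be compared against fresh Monte Carlo estimates at held-out points, in the spirit of the consistency check mentioned in the abstract.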
Related papers
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
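For context on the deep-ensembling baseline mentioned above, here is a minimal, hypothetical sketch (toy data and architecture, not the paper's surrogate models): several independently initialized networks are trained on the same data and the spread of their predictions serves as an uncertainty estimate.

```python
# Hedged sketch of deep ensembling: several independently initialized surrogates are
# trained on the same data and the spread of their predictions is used as an
# uncertainty estimate.  Data and architecture are toy placeholders.
import torch

def train_member(x, y, seed, steps=500):
    torch.manual_seed(seed)                    # independent initialization per member
    net = torch.nn.Sequential(torch.nn.Linear(x.shape[1], 64), torch.nn.Tanh(),
                              torch.nn.Linear(64, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.mean((net(x) - y) ** 2)
        loss.backward()
        opt.step()
    return net

x = torch.rand(256, 2)
y = torch.sin(3 * x[:, :1]) * x[:, 1:]         # toy "simulation" data
ensemble = [train_member(x, y, seed=s) for s in range(5)]
with torch.no_grad():
    preds = torch.stack([member(x) for member in ensemble])   # (members, n, 1)
mean, std = preds.mean(dim=0), preds.std(dim=0)               # prediction and spread
```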
arXiv Detail & Related papers (2024-08-20T19:06:02Z) - A forward differential deep learning-based algorithm for solving high-dimensional nonlinear backward stochastic differential equations [0.6040014326756179]
We present a novel forward differential deep learning-based algorithm for solving high-dimensional nonlinear backward stochastic differential equations (BSDEs).
Motivated by the fact that differential deep learning can efficiently approximate the labels and their derivatives with respect to inputs, we transform the BSDE problem into a differential deep learning problem.
The main idea of our algorithm is to discretize the integrals using the Euler-Maruyama method and approximate the unknown discrete solution triple using three deep neural networks.
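The following is a hedged structural sketch of that idea, not the paper's scheme: an Euler-Maruyama time grid, three networks for the discrete triple (Y, Z, Gamma), a one-step residual for Y, and a consistency term tying Gamma to the Jacobian of Z (meaningful here because the toy forward diffusion has identity volatility). The driver, terminal condition, and loss details are illustrative assumptions.

```python
# Hedged structural sketch (toy driver/terminal condition, not the paper's scheme):
# Euler-Maruyama time grid, three networks for the discrete triple (Y, Z, Gamma),
# a one-step residual for Y, and a consistency term tying Gamma to the Jacobian of Z.
import torch

d, N, T = 2, 10, 1.0
dt = T / N
g = lambda x: torch.sum(x ** 2, dim=1, keepdim=True)            # terminal condition (toy)
f = lambda y, z: -0.5 * y + torch.sum(z, dim=1, keepdim=True)   # driver f(y, z) (toy)

def mlp(out_dim):
    return torch.nn.Sequential(torch.nn.Linear(d + 1, 64), torch.nn.Tanh(),
                               torch.nn.Linear(64, out_dim))

y_net, z_net, gamma_net = mlp(1), mlp(d), mlp(d * d)
params = [p for m in (y_net, z_net, gamma_net) for p in m.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)

def residual_loss(batch=256):
    x = torch.randn(batch, d)                                   # X_0 samples
    loss = 0.0
    for n in range(N):
        t = torch.full((batch, 1), n * dt)
        tx = torch.cat([t, x], dim=1)
        y, z = y_net(tx), z_net(tx)
        dw = dt ** 0.5 * torch.randn(batch, d)
        x_next = x + dw                                         # Euler-Maruyama step (zero drift, sigma = I)
        y_step = y - f(y, z) * dt + torch.sum(z * dw, dim=1, keepdim=True)
        y_next = g(x_next) if n == N - 1 else y_net(torch.cat([t + dt, x_next], dim=1))
        loss = loss + torch.mean((y_step - y_next) ** 2)        # discrete BSDE residual for Y

        xr = x.detach().requires_grad_(True)                    # Gamma ~ d(Z)/dx since sigma = I
        zr = z_net(torch.cat([t, xr], dim=1))
        jac = torch.stack([torch.autograd.grad(zr[:, i].sum(), xr, create_graph=True)[0]
                           for i in range(d)], dim=1).reshape(batch, -1)
        loss = loss + torch.mean((gamma_net(tx) - jac) ** 2)
        x = x_next
    return loss

for _ in range(100):
    opt.zero_grad()
    residual_loss().backward()
    opt.step()
```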
arXiv Detail & Related papers (2024-08-10T19:34:03Z) - A backward differential deep learning-based algorithm for solving high-dimensional nonlinear backward stochastic differential equations [0.6040014326756179]
We propose a novel backward differential deep learning-based algorithm for solving high-dimensional nonlinear backward stochastic differential equations (BSDEs).
The deep neural network (DNN) models are trained not only on the inputs and labels, but also on the differentials of the corresponding labels.
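A minimal sketch of that differential-training idea, with a toy target and its input derivatives standing in for the BSDE solution and its differentials: the loss penalizes the mismatch of both the labels and the labels' derivatives.

```python
# Minimal sketch of differential training on a toy target: the loss penalizes the
# mismatch of both the labels and the labels' derivatives with respect to the inputs.
import torch

net = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.rand(512, 2, requires_grad=True)
labels = torch.sin(x[:, :1]) * x[:, 1:]                             # toy labels
dlabels = torch.autograd.grad(labels.sum(), x)[0]                   # label differentials
x_in, labels = x.detach().requires_grad_(True), labels.detach()

for _ in range(1000):
    opt.zero_grad()
    pred = net(x_in)
    dpred = torch.autograd.grad(pred.sum(), x_in, create_graph=True)[0]
    loss = torch.mean((pred - labels) ** 2) + torch.mean((dpred - dlabels) ** 2)
    loss.backward()
    opt.step()
```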
arXiv Detail & Related papers (2024-04-12T13:05:35Z) - Optimizing Solution-Samplers for Combinatorial Problems: The Landscape
of Policy-Gradient Methods [52.0617030129699]
We introduce a novel theoretical framework for analyzing the effectiveness of DeepMatching Networks and Reinforcement Learning methods.
Our main contribution holds for a broad class of problems including Max- and Min-Cut, Max-$k$-Bipartite-Bi, Maximum-Weight-Bipartite-Bi, and the Traveling Salesman Problem.
As a byproduct of our analysis we introduce a novel regularization process over vanilla descent and provide theoretical and experimental evidence that it helps address vanishing-gradient issues and escape bad stationary points.
arXiv Detail & Related papers (2023-10-08T23:39:38Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown to be effective in solving forward and inverse differential equation problems.
However, PINNs can be trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs and improve the stability of the training process.
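As a hedged sketch of implicit (proximal-point) updates, assuming a toy PINN residual for u'(x) = cos(x) with u(0) = 0: each outer step approximately solves min_theta L(theta) + ||theta - theta_k||^2/(2*eta) with a few inner gradient steps, which corresponds to theta_{k+1} = theta_k - eta*grad L(theta_{k+1}). The paper's ISGD solver may treat this subproblem differently; all constants and the network are assumptions.

```python
# Hedged sketch of implicit (proximal-point) updates for a toy PINN residual,
# u'(x) = cos(x) with u(0) = 0.  Each outer step approximately solves
#   min_theta  L(theta) + ||theta - theta_k||^2 / (2*eta)
# with a few inner gradient steps, i.e. theta_{k+1} ~ theta_k - eta * grad L(theta_{k+1}).
import torch

net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

def pinn_loss():
    x = torch.rand(128, 1, requires_grad=True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]     # u'(x) by autodiff
    residual = torch.mean((du - torch.cos(x)) ** 2)                # PDE residual
    boundary = net(torch.zeros(1, 1)).pow(2).mean()                # u(0) = 0
    return residual + boundary

eta, outer_steps, inner_steps = 0.1, 50, 10
for _ in range(outer_steps):
    anchor = [p.detach().clone() for p in net.parameters()]        # theta_k, held fixed
    inner_opt = torch.optim.SGD(net.parameters(), lr=1e-2)
    for _ in range(inner_steps):                                   # approximately solve the proximal subproblem
        inner_opt.zero_grad()
        prox = sum(((p - a) ** 2).sum() for p, a in zip(net.parameters(), anchor))
        (pinn_loss() + prox / (2 * eta)).backward()
        inner_opt.step()
```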
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
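To illustrate the probabilistic-representation idea on a case where it is classical (not the paper's exact training objective): for the convection-diffusion equation u_t + c.grad(u) = kappa*Laplacian(u) with initial data u0, one has u(t, x) = E[u0(x - c*t + sqrt(2*kappa)*W_t)], so an ensemble of random particles provides Monte Carlo training targets for a neural solver. The equation, constants, and network below are assumed for the example.

```python
# Hedged sketch (toy equation and constants, not the paper's training objective):
# for  u_t + c.grad(u) = kappa * Laplacian(u),  u(0, .) = u0,  one has
#   u(t, x) = E[ u0(x - c*t + sqrt(2*kappa) * W_t) ],
# so an ensemble of random particles gives Monte Carlo targets for a neural solver.
import torch

d, kappa = 2, 0.1
c = torch.tensor([1.0, 0.5])                          # advection velocity (assumed)
u0 = lambda x: torch.exp(-torch.sum(x ** 2, dim=-1))  # initial condition (assumed)

def mc_label(t, x, n_particles=256):
    """Monte Carlo estimate of u(t, x); t: (batch, 1), x: (batch, d)."""
    xi = torch.randn(n_particles, *x.shape)
    pos = x - c * t + (2.0 * kappa * t) ** 0.5 * xi   # particle positions traced back to time 0
    return u0(pos).mean(dim=0).unsqueeze(-1)

net = torch.nn.Sequential(torch.nn.Linear(d + 1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(500):
    t, x = torch.rand(128, 1), 2 * torch.rand(128, d) - 1
    with torch.no_grad():
        target = mc_label(t, x)                       # particle-ensemble estimate of u(t, x)
    opt.zero_grad()
    loss = torch.mean((net(torch.cat([t, x], dim=1)) - target) ** 2)
    loss.backward()
    opt.step()
```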
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Deep learning numerical methods for high-dimensional fully nonlinear
PIDEs and coupled FBSDEs with jumps [26.28912742740653]
We propose a deep learning algorithm for solving high-dimensional parabolic integro-differential equations (PIDEs).
The jump-diffusion process is driven by a Brownian motion and an independent compensated Poisson random measure.
To derive the error estimates for this deep learning algorithm, the convergence of the Markovian iteration, the error bound of the Euler time discretization, and the simulation error of the deep learning algorithm are investigated.
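A hedged sketch of the driving noise described above: an Euler time-stepping scheme for a scalar jump-diffusion with constant coefficients, Gaussian jump sizes, and the Poisson part compensated by its mean. All coefficients are illustrative choices; the paper treats general PIDEs and couples this with deep learning.

```python
# Hedged sketch of the driving noise: an Euler scheme for a scalar jump-diffusion
#   dX_t = b dt + sigma dW_t + dJ_t - lam * E[jump] dt,
# where J is a compound Poisson process (intensity lam, Gaussian jump sizes) and the
# last term is the compensator.  All coefficients are illustrative constants.
import numpy as np

rng = np.random.default_rng(0)
T, N, n_paths = 1.0, 200, 1000
dt = T / N
b, sigma = 0.05, 0.2                                  # drift and diffusion coefficients
lam, jump_mean, jump_std = 3.0, 0.05, 0.1             # jump intensity and jump-size law

x = np.zeros(n_paths)
for _ in range(N):
    dw = rng.normal(scale=np.sqrt(dt), size=n_paths)              # Brownian increment
    k = rng.poisson(lam * dt, size=n_paths)                       # number of jumps in the step
    jumps = k * jump_mean + np.sqrt(k) * jump_std * rng.normal(size=n_paths)  # compound-Poisson increment
    x = x + b * dt + sigma * dw + jumps - lam * jump_mean * dt    # last term: compensation
```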
arXiv Detail & Related papers (2023-01-30T13:55:42Z) - Inverse Problem of Nonlinear Schrödinger Equation as Learning of
Convolutional Neural Network [5.676923179244324]
It is shown that one can obtain a relatively accurate estimate of the considered parameters using the proposed method.
It provides a natural framework for inverse problems of partial differential equations with deep learning.
arXiv Detail & Related papers (2021-07-19T02:54:37Z) - Learning Fast Approximations of Sparse Nonlinear Regression [50.00693981886832]
In this work, we bridge the gap by introducing the Nonlinear Learned Iterative Shrinkage Thresholding Algorithm (NLISTA).
Experiments on synthetic data corroborate our theoretical results and show our method outperforms state-of-the-art methods.
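For orientation, a minimal LISTA-style unrolled network for sparse linear regression is sketched below; NLISTA extends this template to the nonlinear setting, with details beyond this sketch. Sizes, data, and the training loop are toy assumptions.

```python
# Minimal LISTA-style unrolled network for sparse linear regression y = A x + noise;
# the nonlinear extension (NLISTA) is not reproduced here.  Toy sizes and data.
import torch

m, n, K = 20, 50, 8                                   # measurements, signal size, unrolled iterations

class LISTA(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.We = torch.nn.Linear(m, n, bias=False)               # learned measurement-to-signal map
        self.S = torch.nn.Linear(n, n, bias=False)                # learned recurrence
        self.theta = torch.nn.Parameter(0.1 * torch.ones(K))      # per-iteration soft thresholds

    def forward(self, y):
        x = torch.zeros(y.shape[0], n)
        for k in range(K):
            z = self.We(y) + self.S(x)
            x = torch.sign(z) * torch.relu(z.abs() - self.theta[k])  # soft-thresholding step
        return x

A = torch.randn(m, n) / m ** 0.5
x_true = torch.randn(256, n) * (torch.rand(256, n) < 0.1)         # sparse ground truth
y = x_true @ A.T + 0.01 * torch.randn(256, m)
model = LISTA()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss = torch.mean((model(y) - x_true) ** 2)
    loss.backward()
    opt.step()
```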
arXiv Detail & Related papers (2020-10-26T11:31:08Z) - Deep neural network for solving differential equations motivated by
Legendre-Galerkin approximation [16.64525769134209]
We explore the performance and accuracy of various neural architectures on both linear and nonlinear differential equations.
We implement a novel Legendre-Galerkin Deep Neural Network (LGNet) algorithm to predict solutions to differential equations.
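A hedged sketch of the Legendre-Galerkin flavor, with a toy parametric target in place of a differential equation: a network maps a problem parameter to the coefficients of a truncated Legendre expansion, and the solution is reconstructed from the basis. The parameter, target family, and network are assumptions, not the LGNet architecture.

```python
# Hedged sketch: a network maps a problem parameter to coefficients of a truncated
# Legendre expansion, and the solution is reconstructed from the basis (toy target).
import numpy as np
import torch

M = 8                                                            # number of Legendre modes
xgrid = np.linspace(-1.0, 1.0, 101)
basis = np.stack([np.polynomial.legendre.Legendre.basis(k)(xgrid) for k in range(M)])
basis_t = torch.tensor(basis, dtype=torch.float32)               # (M, grid points)

coeff_net = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, M))

def predicted_solution(param):
    """param: (batch, 1) -> grid values of  sum_k c_k(param) * P_k(x)."""
    return coeff_net(param) @ basis_t

# Toy supervision: fit the parametric family u(x; a) = a * sin(pi * x).
a = torch.rand(64, 1)
target = a * torch.sin(torch.tensor(np.pi * xgrid, dtype=torch.float32))
opt = torch.optim.Adam(coeff_net.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss = torch.mean((predicted_solution(a) - target) ** 2)
    loss.backward()
    opt.step()
```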
arXiv Detail & Related papers (2020-10-24T20:25:09Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
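As a simplified analogue of the multi-level idea (on a regular 1-D grid rather than a graph, with plain convolutions standing in for learned graph kernels): short-range interactions are applied at fine resolution and longer-range ones at coarser resolutions, then summed back on the fine grid, so the total cost stays linear in the grid size.

```python
# Simplified grid analogue of multi-level aggregation: each level applies a local
# kernel at a coarser resolution, and the outputs are summed on the fine grid.
import torch
import torch.nn.functional as F

def multilevel_apply(v, convs):
    """v: (batch, channels, n); convs: one small Conv1d per level, fine to coarse."""
    out, cur = None, v
    for level, conv in enumerate(convs):
        y = conv(cur)                                  # local interaction at this resolution
        if level > 0:                                  # map coarse output back to the fine grid
            y = F.interpolate(y, size=v.shape[-1], mode="linear", align_corners=False)
        out = y if out is None else out + y
        cur = F.avg_pool1d(cur, kernel_size=2)         # coarsen features for the next level
    return out

channels, n = 8, 64
convs = [torch.nn.Conv1d(channels, channels, kernel_size=3, padding=1) for _ in range(3)]
features = torch.randn(4, channels, n)
out = multilevel_apply(features, convs)                # (4, 8, 64)
```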
arXiv Detail & Related papers (2020-06-16T21:56:22Z)