Solving high-dimensional Hamilton-Jacobi-Bellman PDEs using neural
networks: perspectives from the theory of controlled diffusions and measures
on path space
- URL: http://arxiv.org/abs/2005.05409v1
- Date: Mon, 11 May 2020 20:14:02 GMT
- Title: Solving high-dimensional Hamilton-Jacobi-Bellman PDEs using neural
networks: perspectives from the theory of controlled diffusions and measures
on path space
- Authors: Nikolas Nüsken, Lorenz Richter
- Abstract summary: Building on recent machine learning inspired approaches towards high-dimensional PDEs, we investigate the potential of iterative diffusion optimisation techniques.
We develop a principled framework based on divergences between path measures, encompassing various existing methods.
The promise of the developed approach is exemplified by a range of high-dimensional and metastable numerical examples.
- Score: 3.1219977244201056
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimal control of diffusion processes is intimately connected to the problem
of solving certain Hamilton-Jacobi-Bellman equations. Building on recent
machine learning inspired approaches towards high-dimensional PDEs, we
investigate the potential of iterative diffusion optimisation techniques, in
particular considering applications in importance sampling and rare event
simulation. The choice of an appropriate loss function being a central element
in the algorithmic design, we develop a principled framework based on
divergences between path measures, encompassing various existing methods.
Motivated by connections to forward-backward SDEs, we propose and study the
novel log-variance divergence, showing favourable properties of corresponding
Monte Carlo estimators. The promise of the developed approach is exemplified by
a range of high-dimensional and metastable numerical examples.
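As a reading aid, the block below sketches the standard form of the control problem behind this abstract and the kind of path-space divergence it refers to. It is a paraphrase in common notation, not an excerpt from the paper; signs, cost scalings, and the choice of reference measure may differ from the authors' conventions.

```latex
% Controlled diffusion, quadratic control cost, HJB equation and optimal control
% (standard formulation; the paper's conventions may differ).
\[
\mathrm{d}X^{u}_{s} = \big(b(X^{u}_{s}, s) + \sigma(X^{u}_{s}, s)\, u(X^{u}_{s}, s)\big)\,\mathrm{d}s
                      + \sigma(X^{u}_{s}, s)\,\mathrm{d}W_{s},
\qquad
J(u) = \mathbb{E}\!\left[\int_{t}^{T} \Big(f(X^{u}_{s}, s) + \tfrac{1}{2}\,\lvert u(X^{u}_{s}, s)\rvert^{2}\Big)\,\mathrm{d}s
       + g(X^{u}_{T})\right].
\]
\[
\partial_{t} V + b \cdot \nabla V
  + \tfrac{1}{2}\operatorname{Tr}\!\big(\sigma\sigma^{\top}\nabla^{2} V\big)
  - \tfrac{1}{2}\,\lvert \sigma^{\top}\nabla V\rvert^{2} + f = 0,
\qquad V(\cdot, T) = g,
\qquad u^{*} = -\sigma^{\top}\nabla V.
\]
% Losses are divergences between path measures; a log-variance-type divergence
% with respect to a reference path measure \widetilde{\mathbb{P}} reads
\[
D^{\mathrm{LV}}_{\widetilde{\mathbb{P}}}\big(\mathbb{P}\,\Vert\,\mathbb{Q}\big)
  = \operatorname{Var}_{\widetilde{\mathbb{P}}}\!\left[\log\frac{\mathrm{d}\mathbb{P}}{\mathrm{d}\mathbb{Q}}\right].
\]
```

The value function V of the control problem solves the nonlinear HJB equation above and the optimal control is a gradient of V, which is why learning V or u with a neural network against a path-space loss addresses the PDE and the importance-sampling problem at the same time.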
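For concreteness, here is a minimal PyTorch sketch of how a Monte Carlo estimator of a log-variance-type loss can be set up for a toy control problem. It is an illustrative reconstruction under stated assumptions (zero uncontrolled drift, identity diffusion, arbitrary costs f and g, a small MLP control), not the authors' implementation.

```python
import torch

d, batch, n_steps, T = 10, 256, 100, 1.0   # illustrative problem sizes
dt = T / n_steps

# Running and terminal costs of the control problem (illustrative choices).
f = lambda x, t: 0.5 * (x ** 2).sum(dim=1)   # running cost f(x, t)
g = lambda x: (x ** 2).sum(dim=1)            # terminal cost g(x)

# Neural network parametrising the control u(x, t); a small MLP stand-in.
control = torch.nn.Sequential(
    torch.nn.Linear(d + 1, 64), torch.nn.Tanh(), torch.nn.Linear(64, d)
)

def log_variance_loss():
    """Simulate controlled paths with Euler-Maruyama and return the sample
    variance of the log Radon-Nikodym derivative between the target and the
    controlled path measure; the unknown normalising constant is a constant
    shift and drops out of the variance."""
    x = torch.zeros(batch, d)      # initial condition X_0 = 0
    log_w = torch.zeros(batch)     # accumulates -int f dt - int u.dW - 1/2 int |u|^2 dt
    for k in range(n_steps):
        t = torch.full((batch, 1), k * dt)
        u = control(torch.cat([x, t], dim=1))
        dW = torch.randn(batch, d) * dt ** 0.5
        log_w = log_w - f(x, t) * dt - (u * dW).sum(dim=1) - 0.5 * (u ** 2).sum(dim=1) * dt
        x = x + u * dt + dW        # uncontrolled drift b = 0, sigma = identity
    log_w = log_w - g(x)           # terminal cost enters the work functional
    return log_w.var()             # log-variance loss over the batch of paths

optimiser = torch.optim.Adam(control.parameters(), lr=1e-3)
for step in range(10):             # a few illustrative gradient steps
    optimiser.zero_grad()
    loss = log_variance_loss()
    loss.backward()
    optimiser.step()
```

The abstract's claim of favourable Monte Carlo estimator properties refers to losses of this variance type; the sketch only illustrates the mechanics of estimating such a loss from simulated paths.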
Related papers
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Embedding Trajectory for Out-of-Distribution Detection in Mathematical Reasoning [50.84938730450622]
We propose a trajectory-based method TV score, which uses trajectory volatility for OOD detection in mathematical reasoning.
Our method outperforms all traditional algorithms on GLMs under mathematical reasoning scenarios.
Our method can be extended to more applications with high-density features in output spaces, such as multiple-choice questions.
arXiv Detail & Related papers (2024-05-22T22:22:25Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Approximation of Solution Operators for High-dimensional PDEs [2.3076986663832044]
We propose a finite-dimensional control-based method to approximate solution operators for evolutional partial differential equations.
Results are presented for several high-dimensional PDEs, including real-world applications to solving Hamilton-Jacobi-Bellman equations.
arXiv Detail & Related papers (2024-01-18T21:45:09Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- An optimal control perspective on diffusion-based generative modeling [9.806130366152194]
We establish a connection between optimal control and generative models based on stochastic differential equations (SDEs).
In particular, we derive a Hamilton-Jacobi-Bellman equation that governs the evolution of the log-densities of the underlying SDE marginals.
We develop a novel diffusion-based method for sampling from unnormalized densities.
arXiv Detail & Related papers (2022-11-02T17:59:09Z)
- Deep Learning Aided Laplace Based Bayesian Inference for Epidemiological Systems [2.596903831934905]
We propose a hybrid approach where Laplace-based Bayesian inference is combined with an ANN architecture for obtaining approximations to the ODE trajectories.
The effectiveness of our proposed methods is demonstrated using an epidemiological system with non-analytical solutions, the Susceptible-Infectious-Removed (SIR) model for infectious diseases.
arXiv Detail & Related papers (2022-10-17T09:02:41Z)
- A Deep Learning approach to Reduced Order Modelling of Parameter Dependent Partial Differential Equations [0.2148535041822524]
We develop a constructive approach based on Deep Neural Networks for the efficient approximation of the parameter-to-solution map.
In particular, we consider parametrized advection-diffusion PDEs, and we test the methodology in the presence of strong transport fields.
arXiv Detail & Related papers (2021-03-10T17:01:42Z)
- Model Reduction and Neural Networks for Parametric PDEs [9.405458160620533]
We develop a framework for data-driven approximation of input-output maps between infinite-dimensional spaces.
The proposed approach is motivated by the recent successes of neural networks and deep learning.
For a class of input-output maps, and suitably chosen probability measures on the inputs, we prove convergence of the proposed approximation methodology.
arXiv Detail & Related papers (2020-05-07T00:09:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.