Uncertainty quantification for deep learning-based schemes for solving high-dimensional backward stochastic differential equations
- URL: http://arxiv.org/abs/2310.03393v1
- Date: Thu, 5 Oct 2023 09:00:48 GMT
- Authors: Lorenc Kapllani, Long Teng and Matthias Rottmann
- Abstract summary: We study uncertainty quantification (UQ) for a class of deep learning-based BSDE schemes.
We develop a UQ model that efficiently estimates the STD of the approximate solution using only a single run of the algorithm.
Our numerical experiments show that the UQ model produces reliable estimates of the mean and STD of the approximate solution.
- Score: 5.883258964010963
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning-based numerical schemes for solving high-dimensional backward
stochastic differential equations (BSDEs) have recently raised plenty of
scientific interest. While they enable numerical methods to approximate very
high-dimensional BSDEs, their reliability has not been studied and is thus not
understood. In this work, we study uncertainty quantification (UQ) for a class
of deep learning-based BSDE schemes. More precisely, we review the sources of
uncertainty involved in the schemes and numerically study the impact of
different sources. Usually, the standard deviation (STD) of the approximate
solutions obtained from multiple runs of the algorithm with different datasets
is calculated to address the uncertainty. This approach is computationally
quite expensive, especially for high-dimensional problems. Hence, we develop a
UQ model that efficiently estimates the STD of the approximate solution using
only a single run of the algorithm. The model also estimates the mean of the
approximate solution, which can be leveraged to initialize the algorithm and
improve the optimization process. Our numerical experiments show that the UQ
model produces reliable estimates of the mean and STD of the approximate
solution for the considered class of deep learning-based BSDE schemes. The
estimated STD captures multiple sources of uncertainty, demonstrating its
effectiveness in quantifying the uncertainty. Additionally, the model
illustrates the improved performance when comparing different schemes based on
the estimated STD values. Furthermore, it can identify hyperparameter values
for which the scheme achieves good approximations.
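The multi-run baseline that the abstract contrasts against can be sketched as follows. Here `train_bsde_solver` is a hypothetical stand-in for one full training run of a deep BSDE scheme (not the authors' implementation); the point is that every ensemble member costs a complete training run, which is what the proposed single-run UQ model avoids:

```python
import numpy as np

# Hypothetical stand-in for one training run of a deep BSDE solver:
# returns a noisy approximation of the solution Y_0 around an assumed
# "true" value. All names and values here are illustrative only.
def train_bsde_solver(seed, true_y0=1.0, noise=0.05):
    rng = np.random.default_rng(seed)
    return true_y0 + noise * rng.standard_normal()

# Baseline UQ: repeat the whole algorithm with different seeds/datasets
# and take the empirical mean and STD of the results. Reliable, but each
# ensemble member is a full (expensive) training of a deep network.
estimates = np.array([train_bsde_solver(seed) for seed in range(30)])
mean_y0 = estimates.mean()
std_y0 = estimates.std(ddof=1)
```

In this toy setting the empirical STD simply recovers the injected noise level; in the paper's setting, each sample would instead require retraining the network on a fresh dataset, which is why a single-run estimator is attractive.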
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled stochastic differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z)
- A Training-Free Conditional Diffusion Model for Learning Stochastic Dynamical Systems [10.820654486318336]
This study introduces a training-free conditional diffusion model for learning unknown stochastic differential equations (SDEs) from data.
The proposed approach addresses key challenges in computational efficiency and accuracy for modeling SDEs.
The learned models exhibit significant improvements in predicting both short-term and long-term behaviors of unknown systems.
arXiv Detail & Related papers (2024-10-04T03:07:36Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Uncertainty Quantification for Forward and Inverse Problems of PDEs via Latent Global Evolution [110.99891169486366]
We propose a method that integrates efficient and precise uncertainty quantification into a deep learning-based surrogate model.
Our method endows deep learning-based surrogate models with robust and efficient uncertainty quantification capabilities for both forward and inverse problems.
Our method excels at propagating uncertainty over extended auto-regressive rollouts, making it suitable for scenarios involving long-term predictions.
arXiv Detail & Related papers (2024-02-13T11:22:59Z)
- Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels [57.46832672991433]
We propose a novel equation discovery method based on Kernel learning and BAyesian Spike-and-Slab priors (KBASS).
We use kernel regression to estimate the target function, which is flexible, expressive, and more robust to data sparsity and noise.
We develop an expectation-propagation expectation-maximization algorithm for efficient posterior inference and function estimation.
arXiv Detail & Related papers (2023-10-09T03:55:09Z)
- Optimal Learning via Moderate Deviations Theory [4.6930976245638245]
We develop a systematic construction of highly accurate confidence intervals by using a moderate deviation principle-based approach.
It is shown that the proposed confidence intervals are statistically optimal in the sense that they satisfy criteria regarding exponential accuracy, minimality, consistency, mischaracterization probability, and eventual uniformly most accurate (UMA) property.
arXiv Detail & Related papers (2023-05-23T19:57:57Z)
- Validation Diagnostics for SBI algorithms based on Normalizing Flows [55.41644538483948]
This work proposes easy to interpret validation diagnostics for multi-dimensional conditional (posterior) density estimators based on NF.
It also offers theoretical guarantees based on results of local consistency.
This work should help the design of better specified models or drive the development of novel SBI-algorithms.
arXiv Detail & Related papers (2022-11-17T15:48:06Z)
- Distributional Gradient Matching for Learning Uncertain Neural Dynamics Models [38.17499046781131]
We propose a novel approach towards estimating uncertain neural ODEs, avoiding the numerical integration bottleneck.
Our algorithm - distributional gradient matching (DGM) - jointly trains a smoother and a dynamics model and matches their gradients via minimizing a Wasserstein loss.
Our experiments show that, compared to traditional approximate inference methods based on numerical integration, our approach is faster to train, faster at predicting previously unseen trajectories, and in the context of neural ODEs, significantly more accurate.
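As a rough illustration of the gradient-matching idea (not the DGM method itself), the sketch below fits a toy dynamics model by matching its output to smoother-derived derivatives, substituting a plain mean-squared distance for the Wasserstein loss; all names and the example system are illustrative assumptions:

```python
import numpy as np

# A "smoother" fits the observed trajectory and supplies time
# derivatives; the candidate dynamics model f(x; theta) should
# reproduce those derivatives, avoiding numerical integration of
# the ODE during training.
t = np.linspace(0.0, 2.0, 50)
x = np.exp(-t)                      # observed trajectory of x' = -x

smoothed_grad = np.gradient(x, t)   # derivative estimate from the smoother

def dynamics(x, theta):
    return theta * x                # candidate model x' = theta * x

def matching_loss(theta):
    # Simplified stand-in for DGM's Wasserstein loss:
    # mean-squared mismatch between model output and smoothed gradients.
    return np.mean((dynamics(x, theta) - smoothed_grad) ** 2)
```

The loss is (near-)zero at the true parameter theta = -1, so minimizing it over theta recovers the dynamics without ever solving the ODE, which is the bottleneck the summary refers to.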
arXiv Detail & Related papers (2021-06-22T08:40:51Z)
- NP-ODE: Neural Process Aided Ordinary Differential Equations for Uncertainty Quantification of Finite Element Analysis [2.9210447295585724]
A physics-informed data-driven surrogate model, named Neural Process Aided Ordinary Differential Equation (NP-ODE), is proposed to model the FEA simulations.
The results show that the proposed NP-ODE outperforms benchmark methods.
arXiv Detail & Related papers (2020-12-12T22:38:16Z)
- The Seven-League Scheme: Deep learning for large time step Monte Carlo simulations of stochastic differential equations [0.0]
We propose an accurate data-driven numerical scheme to solve stochastic differential equations (SDEs).
The SDE discretization is built up by means of a chaos expansion method on the basis of accurately determined stochastic collocation (SC) points.
With a method called the compression-decompression and collocation technique, we can drastically reduce the number of neural network functions that have to be learned.
arXiv Detail & Related papers (2020-09-07T16:06:20Z)
- Instability, Computational Efficiency and Statistical Accuracy [101.32305022521024]
We develop a framework that yields statistical accuracy guarantees based on the interplay between the deterministic convergence rate of the algorithm at the population level and its degree of (in)stability when applied to an empirical object based on $n$ samples.
We provide applications of our general results to several concrete classes of models, including Gaussian mixture estimation, non-linear regression models, and informative non-response models.
arXiv Detail & Related papers (2020-05-22T22:30:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.