History-Based, Bayesian, Closure for Stochastic Parameterization:
Application to Lorenz '96
- URL: http://arxiv.org/abs/2210.14488v1
- Date: Wed, 26 Oct 2022 05:22:50 GMT
- Title: History-Based, Bayesian, Closure for Stochastic Parameterization:
Application to Lorenz '96
- Authors: Mohamed Aziz Bhouri and Pierre Gentine
- Abstract summary: We develop a new type of parameterization based on a Bayesian formalism for neural networks to provide uncertainty quantification.
We apply the proposed Bayesian history-based parameterization to the Lorenz '96 model in the presence of noisy and sparse data.
This approach paves the way for the use of Bayesian approaches for closure problems.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physical parameterizations are used to represent unresolved subgrid
processes within weather and global climate models or coarse-scale turbulent
models, whose resolutions are too coarse to capture small-scale processes.
These parameterizations are typically grounded in physically based, yet
empirical, representations of the underlying small-scale processes. Machine
learning-based parameterizations have recently been proposed as an alternative
and have shown great promise in reducing uncertainties associated with
small-scale processes. Yet, those approaches still show some important
mismatches that are often attributed to stochasticity in the considered
process. This stochasticity can be due to noisy data, unresolved variables or
simply to the inherent chaotic nature of the process. To address these issues,
we develop a new type of parameterization (closure) based on a Bayesian
formalism for neural networks, which provides uncertainty quantification and
includes memory to capture the non-instantaneous response of the closure. To
overcome the curse of dimensionality that Bayesian techniques face in
high-dimensional parameter spaces, the Bayesian strategy relies on a
Hamiltonian Monte Carlo (HMC) Markov chain sampling strategy that exploits the
gradients of the likelihood function and of the kinetic energy with respect to
the parameters to accelerate the sampling process. We apply the proposed Bayesian
history-based parameterization to the Lorenz '96 model in the presence of noisy
and sparse data, similar to satellite observations, and show its capacity to
produce skillful forecasts of the resolved variables while returning
trustworthy uncertainty quantification for different sources of error. This
approach paves the way for the use of Bayesian approaches for closure problems.
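To make the method concrete, the sketch below (hypothetical, not the authors' code) assembles the three ingredients the abstract describes: the resolved Lorenz '96 tendency, a small neural-network closure that sees a short history of the resolved state, and one Hamiltonian Monte Carlo update of the closure weights driven by the gradients of the log-posterior and of the kinetic energy. The network architecture, history length `HIST`, noise scale, Gaussian prior, step size, and the externally supplied `grad_fn` (e.g., finite differences or an autodiff library) are illustrative assumptions, not choices taken from the paper.

```python
# Minimal sketch of a history-based Bayesian closure for Lorenz '96,
# with one HMC update of the closure weights. All hyperparameters are
# illustrative assumptions, not values from the paper.
import numpy as np

K, F = 8, 20.0   # number of resolved variables and forcing (assumed)
HIST = 3         # history length fed to the closure (assumed)

def l96_resolved_tendency(x):
    """Resolved part of dX_k/dt = X_{k-1}(X_{k+1} - X_{k-2}) - X_k + F."""
    return np.roll(x, 1) * (np.roll(x, -1) - np.roll(x, 2)) - x + F

def closure(theta, hist):
    """One-hidden-layer NN mapping the last HIST states, shape (K, HIST),
    to a per-variable closure tendency of shape (K,)."""
    W1, b1, W2, b2 = theta
    h = np.tanh(hist @ W1 + b1)
    return (h @ W2 + b2).ravel()

def log_posterior(theta, hist_batch, target_tend):
    """Gaussian likelihood on tendency residuals plus a Gaussian prior."""
    resid = np.stack([closure(theta, h) for h in hist_batch]) - target_tend
    log_lik = -0.5 * np.sum(resid ** 2) / 0.1 ** 2  # noise scale assumed
    log_prior = -0.5 * sum(np.sum(p ** 2) for p in theta)
    return log_lik + log_prior

def hmc_step(theta, logp_fn, grad_fn, eps=1e-3, n_leap=10, rng=np.random):
    """One HMC update: leapfrog integration driven by the gradients of the
    log-posterior and of the quadratic kinetic energy, followed by a
    Metropolis accept/reject on the Hamiltonian."""
    p0 = [rng.standard_normal(t.shape) for t in theta]
    q, p = [t.copy() for t in theta], [m.copy() for m in p0]
    g = grad_fn(q)
    for _ in range(n_leap):
        p = [pi + 0.5 * eps * gi for pi, gi in zip(p, g)]  # half kick
        q = [qi + eps * pi for qi, pi in zip(q, p)]        # drift (unit mass)
        g = grad_fn(q)
        p = [pi + 0.5 * eps * gi for pi, gi in zip(p, g)]  # half kick
    h_old = -logp_fn(theta) + 0.5 * sum(np.sum(m ** 2) for m in p0)
    h_new = -logp_fn(q) + 0.5 * sum(np.sum(m ** 2) for m in p)
    return q if np.log(rng.uniform()) < h_old - h_new else theta
```

Drawing many such updates after a burn-in yields an ensemble of closure weights; propagating the coarse model under each sample then gives both a forecast of the resolved variables and an uncertainty estimate, in the spirit of the approach described above.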
Related papers
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Estimation of spatio-temporal extremes via generative neural networks
We provide a unified approach for analyzing spatial extremes with little available data.
By employing recent developments in generative neural networks, we predict a full sample-based distribution.
We validate our method by fitting several simulated max-stable processes, showing a high accuracy of the approach.
arXiv Detail & Related papers (2024-07-11T16:57:17Z)
- von Mises Quasi-Processes for Bayesian Circular Regression
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Stochastic full waveform inversion with deep generative prior for uncertainty quantification
Full Waveform Inversion (FWI) involves solving a nonlinear and often non-unique inverse problem.
FWI presents challenges such as local minima trapping and inadequate handling of inherent uncertainties.
We propose leveraging deep generative models as the prior distribution of geophysical parameters for Bayesian inversion.
arXiv Detail & Related papers (2024-06-07T11:44:50Z)
- Learning minimal representations of stochastic processes with variational autoencoders
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of the unknown parameters that describe such processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z)
- Scalable and adaptive variational Bayes methods for Hawkes processes
We propose a novel sparsity-inducing procedure, and derive an adaptive mean-field variational algorithm for the popular sigmoid Hawkes processes.
Our algorithm is parallelisable and therefore computationally efficient in high-dimensional settings.
arXiv Detail & Related papers (2022-12-01T05:35:32Z)
- On the Effectiveness of Parameter-Efficient Fine-Tuning
Many recent works propose to fine-tune only a small portion of the parameters while keeping most of the parameters shared across different tasks.
We show that all of the methods are actually sparse fine-tuned models and conduct a novel theoretical analysis of them.
Despite the effectiveness of sparsity grounded in our theory, how to choose the tunable parameters remains an open problem.
arXiv Detail & Related papers (2022-11-28T17:41:48Z)
- Evaluating Sensitivity to the Stick-Breaking Prior in Bayesian Nonparametrics
We show that variational Bayesian methods can yield sensitivities with respect to parametric and nonparametric aspects of Bayesian models.
We provide both theoretical and empirical support for our variational approach to Bayesian sensitivity analysis.
arXiv Detail & Related papers (2021-07-08T03:40:18Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Understanding Variational Inference in Function-Space
We highlight some advantages and limitations of employing the Kullback-Leibler divergence in this setting.
We propose (featurized) Bayesian linear regression as a benchmark for 'function-space' inference methods that directly measures approximation quality.
arXiv Detail & Related papers (2020-11-18T17:42:01Z)
- Resampling with neural networks for stochastic parameterization in multiscale systems
We present a machine-learning method for the conditional resampling of observations or reference data from a fully resolved simulation.
It is based on the probabilistic classification of subsets of reference data, conditioned on macroscopic variables.
We validate our approach on the Lorenz 96 system, using two different parameter settings.
arXiv Detail & Related papers (2020-04-03T10:09:18Z)