Automatic tempered posterior distributions for Bayesian inversion problems
- URL: http://arxiv.org/abs/2107.11614v1
- Date: Sat, 24 Jul 2021 14:06:00 GMT
- Title: Automatic tempered posterior distributions for Bayesian inversion problems
- Authors: L. Martino, F. Llorente, E. Curbelo, J. Lopez-Santiago, J. Miguez
- Abstract summary: The technique is implemented by means of an iterative procedure, alternating sampling and optimization steps.
The noise power is also used as a tempering parameter for the posterior distribution of the variables of interest.
A complete Bayesian study over the model parameters and the scale parameter can also be performed.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel adaptive importance sampling scheme for Bayesian inversion
problems where the inference of the variables of interest and the power of the
data noise is split. More specifically, we consider a Bayesian analysis for the
variables of interest (i.e., the parameters of the model to invert), whereas we
employ a maximum likelihood approach for the estimation of the noise power. The
whole technique is implemented by means of an iterative procedure, alternating
sampling and optimization steps. Moreover, the noise power is also used as a
tempering parameter for the posterior distribution of the variables of
interest. Therefore, a sequence of tempered posterior densities is generated,
where the tempering parameter is automatically selected according to the
current estimate of the noise power. A complete Bayesian study over the model
parameters and the scale parameter can also be performed. Numerical experiments
show the benefits of the proposed approach.
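A minimal sketch of the alternating scheme the abstract describes, with a toy scalar forward model and an importance-sampling proposal equal to the prior. All names, the forward model f, and the update details here are illustrative assumptions; the paper's actual algorithm may differ.

```python
import numpy as np

# Hypothetical toy setup: y = f(theta) + e, with e ~ N(0, sigma2 * I).
# The tempered posterior is proportional to
# exp(-||y - f(theta)||^2 / (2 * sigma2)) * p(theta), so the current
# noise-power estimate sigma2 plays the role of the temperature.
rng = np.random.default_rng(0)

def f(theta):                                   # assumed forward model
    return np.sin(theta)

y = f(1.3) + 0.1 * rng.standard_normal(50)      # synthetic observations
sigma2 = 1.0                                    # pessimistic initial noise power
N, T = 2000, 10                                 # samples per iteration, iterations

for t in range(T):
    # Sampling step: importance sampling with the prior as proposal,
    # so the prior terms cancel and the weights reduce to the tempered likelihood.
    theta = rng.normal(0.0, 2.0, size=N)        # proposal = prior (assumption)
    sq_err = np.array([np.sum((y - f(th)) ** 2) for th in theta])
    logw = -sq_err / (2.0 * sigma2)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Optimization step: maximum-likelihood update of the noise power, averaging
    # the closed-form estimate ||y - f(theta)||^2 / n over the weighted sample.
    sigma2 = float(np.sum(w * sq_err)) / y.size
    print(f"iteration {t}: estimated noise power = {sigma2:.4f}")
```

As the noise-power estimate shrinks toward its true value, the tempered posterior sharpens automatically, which is the behaviour the abstract refers to.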
Related papers
- Diffusion Models for Solving Inverse Problems via Posterior Sampling with Piecewise Guidance [52.705112811734566]
A novel diffusion-based framework is introduced for solving inverse problems using a piecewise guidance scheme. The proposed method is problem-agnostic and readily adaptable to a variety of inverse problems. The framework achieves a reduction in inference time of 25% for inpainting with both random and center masks, and of 23% and 24% for 4× and 8× super-resolution tasks, respectively.
arXiv Detail & Related papers (2025-07-22T19:35:14Z)
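As a purely illustrative reading of "piecewise guidance", the sketch below turns the measurement-consistency guidance on only inside chosen timestep intervals of the reverse diffusion. The intervals, names, and rule are assumptions, not the paper's actual scheme.

```python
# Assumed timestep ranges in which guidance is active (illustrative only).
GUIDED_INTERVALS = [(800, 1000), (0, 200)]

def guidance_scale(t: int, base_scale: float = 1.0) -> float:
    """Guidance weight at reverse-diffusion timestep t (zero outside the intervals)."""
    inside = any(lo <= t < hi for lo, hi in GUIDED_INTERVALS)
    return base_scale if inside else 0.0
```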
- Adaptive Resampling with Bootstrap for Noisy Multi-Objective Optimization Problems [0.0]
This paper presents a resampling decision function that incorporates the nature of the optimization problem by using bootstrapping and the probability of dominance.
The efficiency of this resampling approach is demonstrated by applying it in the NSGA-II algorithm with a sequential resampling procedure under multiple noise variations.
arXiv Detail & Related papers (2025-03-27T13:32:42Z)
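A hedged sketch of the "probability of dominance via bootstrapping" ingredient: resample the stored noisy evaluations of two solutions and count how often the bootstrapped mean objectives of one Pareto-dominate the other. The paper's actual decision function and thresholds may differ.

```python
import numpy as np

def prob_dominates(samples_a, samples_b, n_boot=1000, rng=None):
    """Bootstrap estimate of P(A dominates B), minimization convention.

    samples_a, samples_b: arrays of shape (n_evals, n_objectives) holding
    noisy objective evaluations of the two solutions.
    """
    rng = rng or np.random.default_rng()
    n_a, n_b = len(samples_a), len(samples_b)
    wins = 0
    for _ in range(n_boot):
        mean_a = samples_a[rng.integers(n_a, size=n_a)].mean(axis=0)
        mean_b = samples_b[rng.integers(n_b, size=n_b)].mean(axis=0)
        if np.all(mean_a <= mean_b) and np.any(mean_a < mean_b):
            wins += 1
    return wins / n_boot
```

A resampling decision function could then request additional evaluations only while this probability stays in an uncertain band, e.g. between 0.1 and 0.9 (assumed thresholds).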
- Adaptive posterior distributions for uncertainty analysis of covariance matrices in Bayesian inversion problems for multioutput signals [0.0]
We address the problem of performing Bayesian inference for the parameters of a nonlinear multi-output model.
The variables of interest are split into two blocks, and the inference takes advantage of known analytical optimization formulas.
arXiv Detail & Related papers (2025-01-02T09:01:09Z)
- Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
arXiv Detail & Related papers (2024-11-21T10:26:17Z)
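The summary names a gradient-free (zero-order) method. Below is the standard two-point randomized gradient estimator that such methods build on; this is a generic sketch, not the paper's accelerated scheme, and the smoothing radius and sphere-sampling choice are assumptions.

```python
import numpy as np

def zo_gradient(f, x, tau=1e-4, rng=None):
    """Two-point randomized estimate of grad f at x with smoothing radius tau."""
    rng = rng or np.random.default_rng()
    e = rng.standard_normal(x.shape)
    e /= np.linalg.norm(e)                        # random direction on the unit sphere
    # The dimension factor x.size makes the estimator unbiased for the
    # smoothed objective under uniform sphere sampling.
    return x.size * (f(x + tau * e) - f(x - tau * e)) / (2.0 * tau) * e

# A plain zero-order SGD step (no acceleration): x -= lr * zo_gradient(f, x)
```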
- Adaptive Online Bayesian Estimation of Frequency Distributions with Local Differential Privacy [0.4604003661048266]
We propose a novel approach for the adaptive and online estimation of the frequency distribution of a finite number of categories under the local differential privacy (LDP) framework.
The proposed algorithm performs Bayesian parameter estimation via posterior sampling and adapts the randomization mechanism for LDP based on the obtained posterior samples.
We provide a theoretical analysis showing that (i) the posterior distribution targeted by the algorithm converges to the true parameter even for approximate posterior sampling, and (ii) the algorithm selects the optimal subset with high probability if posterior sampling is performed exactly.
arXiv Detail & Related papers (2024-05-11T13:59:52Z)
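To make the setting concrete, here is the standard k-ary randomized response mechanism together with the likelihood of the privatized reports that posterior sampling would target. The adaptive mechanism selection described in the paper is not reproduced, and all names are illustrative.

```python
import numpy as np

def krr_report(x, k, eps, rng):
    """k-ary randomized response: keep x w.p. p, else report a uniform other category."""
    p = np.exp(eps) / (np.exp(eps) + k - 1)
    if rng.random() < p:
        return x
    other = rng.integers(k - 1)
    return other + (other >= x)                   # maps uniformly onto the k-1 others

def reports_loglik(theta, reports, k, eps):
    """Log-likelihood of privatized reports given category frequencies theta."""
    p = np.exp(eps) / (np.exp(eps) + k - 1)
    q = (1.0 - p) / (k - 1)
    Q = np.full((k, k), q) + (p - q) * np.eye(k)  # Q[i, j] = P(report j | true i)
    probs = theta @ Q                             # marginal report distribution
    return float(np.sum(np.log(probs[reports])))
```

An MCMC or importance sampler targeting `reports_loglik` plus a Dirichlet log-prior would yield the posterior samples that drive the adaptation.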
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z)
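For context, here is textbook AIS with a fixed geometric bridging path and one Metropolis step per level; the paper's contribution is precisely to replace the fixed path with parametric, optimized bridging distributions, which this sketch does not do. The names and the 1-D random-walk kernel are assumptions.

```python
import numpy as np

def ais_log_ml(log_prior, log_lik, sample_prior, betas, n_chains=100, step=0.2, rng=None):
    """AIS estimate of the log marginal likelihood along a fixed geometric path."""
    rng = rng or np.random.default_rng()
    x = sample_prior(n_chains, rng)                        # shape (n_chains,)
    logw = np.zeros(n_chains)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * log_lik(x)                  # incremental importance weight
        prop = x + step * rng.standard_normal(n_chains)    # random-walk Metropolis at level b
        log_acc = (log_prior(prop) + b * log_lik(prop)) - (log_prior(x) + b * log_lik(x))
        x = np.where(np.log(rng.random(n_chains)) < log_acc, prop, x)
    return np.logaddexp.reduce(logw) - np.log(n_chains)    # log-mean-exp of the weights

# A typical fixed schedule would be betas = np.linspace(0.0, 1.0, 50).
```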
- Noise Estimation in Gaussian Process Regression [1.5002438468152661]
The presented method can be used to estimate the variance of the correlated error and the variance of the noise by maximizing a marginal likelihood function.
We demonstrate the computational advantages and robustness of the presented approach compared to traditional parameter optimization.
arXiv Detail & Related papers (2022-06-20T19:36:03Z)
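A compact sketch of the standard approach the summary refers to: choose kernel hyperparameters and the noise variance by maximizing the GP marginal likelihood. The RBF kernel, 1-D inputs, and log-parameterization are assumptions; the paper's treatment of correlated error is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal(log_params, X, y):
    """Negative GP log marginal likelihood; params = log(lengthscale, signal var, noise var)."""
    ls, sf2, sn2 = np.exp(log_params)
    K = sf2 * np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / ls**2) + sn2 * np.eye(len(X))
    L = np.linalg.cholesky(K)                     # K is PD since sn2 > 0
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

rng = np.random.default_rng(1)
X = np.linspace(0.0, 5.0, 40)
y = np.sin(X) + 0.1 * rng.standard_normal(40)     # true noise std = 0.1
res = minimize(neg_log_marginal, np.zeros(3), args=(X, y), method="Nelder-Mead")
print("estimated noise std:", np.sqrt(np.exp(res.x[2])))
```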
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
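The generic VI-then-IS pipeline the summary describes fits in a few lines: fit an approximation q, then use it as an importance proposal and reweight toward the exact posterior. This self-normalized IS helper is a standard construction, not the paper's forward-KL refinement itself.

```python
import numpy as np

def snis_expectation(h, log_target, log_q, sample_q, n=5000, rng=None):
    """Self-normalized importance sampling estimate of E_pi[h(x)] with proposal q."""
    rng = rng or np.random.default_rng()
    x = sample_q(n, rng)
    logw = log_target(x) - log_q(x)               # log importance weights
    w = np.exp(logw - logw.max())                 # stabilized exponentiation
    w /= w.sum()
    return np.sum(w * h(x))
```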
- Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
arXiv Detail & Related papers (2021-03-15T16:16:18Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Variable selection for Gaussian process regression through a sparse projection [0.802904964931021]
This paper presents a new variable selection approach integrated with Gaussian process (GP) regression.
The choice of tuning parameters and the accuracy of the estimation are evaluated in simulations against some chosen benchmark approaches.
arXiv Detail & Related papers (2020-08-25T01:06:10Z)
- Automated data-driven selection of the hyperparameters for Total-Variation based texture segmentation [12.093824308505216]
The Generalized Stein Unbiased Risk Estimator is revisited to handle correlated Gaussian noise.
The problem formulation naturally entails inter-scale and spatially correlated noise.
arXiv Detail & Related papers (2020-04-20T16:43:09Z)
- SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models [80.22609163316459]
We introduce an unbiased estimator of the log marginal likelihood and its gradients for latent variable models based on randomized truncation of infinite series.
We show that models trained using our estimator give better test-set likelihoods than a standard importance-sampling based approach for the same average computational cost.
arXiv Detail & Related papers (2020-04-01T11:49:30Z)
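The randomized-truncation ("Russian roulette") idea behind such estimators: truncate an infinite series at a random index K and reweight each kept term by its survival probability P(K >= k), which leaves the expectation equal to the full sum. This is a generic sketch with a geometric stopping rule; SUMO's specific series terms (differences of importance-weighted bounds) are not implemented here.

```python
import numpy as np

def roulette_estimate(delta, rng, p_stop=0.3):
    """Unbiased estimate of sum_{k>=0} delta(k) via randomized truncation."""
    total, k, p_survive = 0.0, 0, 1.0
    while True:
        total += delta(k) / p_survive             # reweight term k by P(K >= k)
        if rng.random() < p_stop:                 # stop with fixed probability
            return total
        p_survive *= 1.0 - p_stop
        k += 1

rng = np.random.default_rng(0)
ests = [roulette_estimate(lambda k: 0.5 ** (k + 1), rng) for _ in range(10_000)]
print(np.mean(ests))    # close to 1.0, the true value of the geometric series
```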
- Generalized Gumbel-Softmax Gradient Estimator for Various Discrete Random Variables [16.643346012854156]
Estimating the gradients of nodes is one of the crucial research questions in the deep generative modeling community.
This paper proposes a general version of the Gumbel-Softmax estimator with continuous relaxation.
arXiv Detail & Related papers (2020-03-04T01:13:15Z)
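For reference, the base Gumbel-Softmax construction that the paper generalizes: perturb the logits with Gumbel noise and pass them through a temperature-controlled softmax, giving a soft, reparameterizable categorical sample. A NumPy sketch; tau is the relaxation temperature.

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    """Soft, differentiable-in-logits sample from Categorical(softmax(logits))."""
    u = rng.random(logits.shape)
    g = -np.log(-np.log(u))                       # Gumbel(0, 1) noise
    z = (logits + g) / tau                        # temperature tau > 0
    e = np.exp(z - z.max())
    return e / e.sum()                            # soft one-hot; hardens as tau -> 0

rng = np.random.default_rng(0)
print(gumbel_softmax(np.log(np.array([0.2, 0.3, 0.5])), tau=0.5, rng=rng))
```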
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.