Accelerating MCMC algorithms through Bayesian Deep Networks
- URL: http://arxiv.org/abs/2011.14276v1
- Date: Sun, 29 Nov 2020 04:29:00 GMT
- Title: Accelerating MCMC algorithms through Bayesian Deep Networks
- Authors: Hector J. Hortua, Riccardo Volpi, Dimitri Marinelli, Luigi Malago
- Abstract summary: Markov Chain Monte Carlo (MCMC) algorithms are commonly used for their versatility in sampling from complicated probability distributions.
As the dimension of the distribution gets larger, the computational costs for a satisfactory exploration of the sampling space become challenging.
We show an alternative way of performing adaptive MCMC, by using the outcome of Bayesian Neural Networks as the initial proposal for the Markov Chain.
- Score: 7.054093620465401
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Markov Chain Monte Carlo (MCMC) algorithms are commonly used for their
versatility in sampling from complicated probability distributions. However, as
the dimension of the distribution gets larger, the computational costs for a
satisfactory exploration of the sampling space become challenging. Adaptive
MCMC methods, which tune the proposal distribution on the fly, can address this
issue and speed up the convergence. In this paper we show an alternative way of
performing adaptive MCMC, by using the outcome of Bayesian Neural Networks as
the initial proposal for the Markov Chain. This combined approach increases the
acceptance rate in the Metropolis-Hastings algorithm and accelerates the
convergence of the MCMC while reaching the same final accuracy. Finally, we
demonstrate the main advantages of this approach by constraining the
cosmological parameters directly from Cosmic Microwave Background maps.
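As a rough illustration of the idea in the abstract, the sketch below seeds a random-walk Metropolis-Hastings sampler with a Gaussian summary (mean and covariance) assumed to come from a Bayesian Neural Network. The names `log_posterior`, `bnn_mean`, and `bnn_cov` are hypothetical placeholders, not the authors' code, and the toy target is invented for demonstration.

```python
import numpy as np

def metropolis_hastings(log_posterior, bnn_mean, bnn_cov, n_steps=5000, scale=1.0, rng=None):
    """Random-walk Metropolis-Hastings whose starting point and proposal
    covariance are taken from a Bayesian Neural Network's predictive output.

    log_posterior : callable returning log pi(theta) up to a constant
    bnn_mean      : BNN predictive mean for the parameters (1-D array)
    bnn_cov       : BNN predictive covariance (2-D array)
    """
    rng = np.random.default_rng() if rng is None else rng
    # Start the chain at the BNN estimate instead of a random point.
    theta = np.asarray(bnn_mean, dtype=float)
    # Use the (scaled) BNN covariance to shape the symmetric proposal.
    chol = np.linalg.cholesky(scale * np.asarray(bnn_cov))
    logp = log_posterior(theta)
    chain, n_accept = [theta.copy()], 0
    for _ in range(n_steps):
        proposal = theta + chol @ rng.standard_normal(theta.size)
        logp_prop = log_posterior(proposal)
        # Symmetric proposal: accept with probability min(1, pi(prop)/pi(theta)).
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
            n_accept += 1
        chain.append(theta.copy())
    return np.array(chain), n_accept / n_steps

# Toy usage: a 2-D Gaussian posterior and a (pretend) BNN output close to it.
if __name__ == "__main__":
    target_mean = np.array([0.3, 0.9])  # stand-ins for two cosmological parameters
    target_cov = np.array([[0.04, 0.01], [0.01, 0.02]])
    prec = np.linalg.inv(target_cov)
    log_post = lambda t: -0.5 * (t - target_mean) @ prec @ (t - target_mean)
    samples, acc = metropolis_hastings(log_post, bnn_mean=[0.28, 0.95],
                                       bnn_cov=1.5 * target_cov)
    print("acceptance rate:", acc, "posterior mean:", samples[1000:].mean(axis=0))
```

Because the proposal is already centered near the high-density region, the acceptance rate stays high and the burn-in phase shortens, which is the mechanism the abstract describes at a high level.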
Related papers
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Reverse Diffusion Monte Carlo [19.35592726471155]
We propose a novel Monte Carlo sampling algorithm called reverse diffusion Monte Carlo (rdMC)
rdMC is distinct from the Markov chain Monte Carlo (MCMC) methods.
arXiv Detail & Related papers (2023-07-05T05:42:03Z)
- Bayesian Decision Trees Inspired from Evolutionary Algorithms [64.80360020499555]
We propose a replacement of the Markov Chain Monte Carlo (MCMC) with an inherently parallel algorithm, the Sequential Monte Carlo (SMC)
Experiments show that SMC combined with Evolutionary Algorithms (EA) can produce more accurate results than MCMC in 100 times fewer iterations.
arXiv Detail & Related papers (2023-05-30T06:17:35Z)
- Importance is Important: Generalized Markov Chain Importance Sampling Methods [4.611170084430822]
We show that for any multiple-try Metropolis algorithm, one can always accept the proposal and evaluate the importance weight that is needed to correct for the bias without extra computational cost.
We propose an alternative MCMC sampler on discrete spaces that is also outside the Metropolis--Hastings framework.
arXiv Detail & Related papers (2023-04-13T04:04:09Z)
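The summary above is terse; the snippet below is only a generic illustration of the underlying idea of keeping every proposal and correcting the bias with importance weights (plain self-normalized importance sampling from an independent proposal), not the paper's multiple-try Metropolis construction.

```python
import numpy as np

def snis_estimate(f, log_target, log_proposal, sample_proposal, n=10000, rng=None):
    """Self-normalized importance sampling: every draw is kept ("accepted")
    and a weight pi(x)/q(x) corrects for sampling from the wrong distribution."""
    rng = np.random.default_rng() if rng is None else rng
    xs = np.array([sample_proposal(rng) for _ in range(n)])
    log_w = np.array([log_target(x) - log_proposal(x) for x in xs])
    w = np.exp(log_w - log_w.max())   # stabilize before normalizing
    w /= w.sum()
    return np.sum(w * np.array([f(x) for x in xs]))

# Toy usage: estimate E[x] under N(1, 1) using draws from N(0, 2^2).
if __name__ == "__main__":
    log_target = lambda x: -0.5 * (x - 1.0) ** 2
    log_proposal = lambda x: -0.5 * (x / 2.0) ** 2
    draw = lambda rng: 2.0 * rng.standard_normal()
    print(snis_estimate(lambda x: x, log_target, log_proposal, draw))  # ~1.0
```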
- What Are Bayesian Neural Network Posteriors Really Like? [63.950151520585024]
We show that Hamiltonian Monte Carlo can achieve significant performance gains over standard training and deep ensembles.
We also show that deep ensemble predictive distributions are similarly close to HMC as standard SGLD, and closer than standard variational inference.
arXiv Detail & Related papers (2021-04-29T15:38:46Z)
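For context on the HMC baseline referenced above, here is a bare-bones Hamiltonian Monte Carlo step on a toy Gaussian target, not the paper's full Bayesian-neural-network posterior; `log_prob` and `grad_log_prob` are assumed callables supplied by the user.

```python
import numpy as np

def hmc_step(theta, log_prob, grad_log_prob, step_size=0.1, n_leapfrog=20, rng=None):
    """One Hamiltonian Monte Carlo transition with a leapfrog integrator
    and a Metropolis accept/reject correction."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.standard_normal(theta.size)          # resample momentum
    theta_new, p = theta.copy(), p0.copy()
    # Leapfrog integration of the Hamiltonian dynamics.
    p += 0.5 * step_size * grad_log_prob(theta_new)
    for _ in range(n_leapfrog - 1):
        theta_new += step_size * p
        p += step_size * grad_log_prob(theta_new)
    theta_new += step_size * p
    p += 0.5 * step_size * grad_log_prob(theta_new)
    # Accept/reject on the joint (position, momentum) energy.
    current_h = -log_prob(theta) + 0.5 * p0 @ p0
    proposed_h = -log_prob(theta_new) + 0.5 * p @ p
    if np.log(rng.uniform()) < current_h - proposed_h:
        return theta_new
    return theta

# Toy usage: sample a standard 2-D Gaussian.
if __name__ == "__main__":
    log_prob = lambda t: -0.5 * t @ t
    grad_log_prob = lambda t: -t
    theta, draws = np.zeros(2), []
    for _ in range(2000):
        theta = hmc_step(theta, log_prob, grad_log_prob)
        draws.append(theta)
    print("sample covariance:\n", np.cov(np.array(draws).T))
```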
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- MCMC-Interactive Variational Inference [56.58416764959414]
We propose MCMC-interactive variational inference (MIVI) to estimate the posterior in a time-constrained manner.
MIVI takes advantage of the complementary properties of variational inference and MCMC to encourage mutual improvement.
Experiments show that MIVI not only accurately approximates the posteriors but also facilitates designs of gradient MCMC and Gibbs sampling transitions.
arXiv Detail & Related papers (2020-10-02T17:43:20Z)
- Non-convex Learning via Replica Exchange Stochastic Gradient MCMC [25.47669573608621]
We propose an adaptive replica exchange SGMCMC (reSGMCMC) to automatically correct the bias and study the corresponding properties.
Empirically, we test the algorithm through extensive experiments on various setups and report the resulting performance.
arXiv Detail & Related papers (2020-08-12T15:02:59Z)
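As background for the replica-exchange idea above, a minimal parallel-tempering Langevin sketch with two chains and full gradients; the paper's actual contribution, the bias correction needed when the gradients are stochastic, is deliberately omitted here, and the energy function is a made-up toy.

```python
import numpy as np

def replica_exchange_langevin(grad_U, U, x_lo, x_hi, temps=(1.0, 10.0),
                              step=1e-2, n_steps=5000, swap_every=50, rng=None):
    """Two unadjusted Langevin chains at different temperatures that
    occasionally attempt to swap states (parallel tempering)."""
    rng = np.random.default_rng() if rng is None else rng
    t_lo, t_hi = temps
    samples = []
    for i in range(n_steps):
        # Langevin updates at each temperature.
        x_lo = x_lo - step * grad_U(x_lo) + np.sqrt(2 * step * t_lo) * rng.standard_normal(x_lo.size)
        x_hi = x_hi - step * grad_U(x_hi) + np.sqrt(2 * step * t_hi) * rng.standard_normal(x_hi.size)
        # Metropolis swap between the two replicas.
        if i % swap_every == 0:
            log_alpha = (1.0 / t_lo - 1.0 / t_hi) * (U(x_lo) - U(x_hi))
            if np.log(rng.uniform()) < log_alpha:
                x_lo, x_hi = x_hi, x_lo
        samples.append(x_lo.copy())
    return np.array(samples)

# Toy usage: a double-well energy U(x) = (x^2 - 1)^2, where the hot chain
# helps the cold chain hop between the two modes at x = -1 and x = +1.
if __name__ == "__main__":
    U = lambda x: float(np.sum((x ** 2 - 1.0) ** 2))
    grad_U = lambda x: 4.0 * x * (x ** 2 - 1.0)
    draws = replica_exchange_langevin(grad_U, U, np.array([-1.0]), np.array([1.0]))
    print("fraction of samples in the right-hand well:", np.mean(draws > 0))
```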
- Involutive MCMC: a Unifying Framework [64.46316409766764]
We describe a wide range of MCMC algorithms in terms of iMCMC.
We formulate a number of "tricks" which one can use as design principles for developing new MCMC algorithms.
We demonstrate the latter with two examples where we transform known reversible MCMC algorithms into more efficient irreversible ones.
arXiv Detail & Related papers (2020-06-30T10:21:42Z)
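A minimal sketch of the involutive-MCMC recipe above, using the simplest possible involution, f(x, v) = (x + v, -v), which merely recovers random-walk Metropolis; the framework in the paper covers far more general transformations.

```python
import numpy as np

def log_q(v, sigma):
    """Log density of the auxiliary Gaussian (up to a constant)."""
    return -0.5 * np.sum(v ** 2) / sigma ** 2

def involutive_mcmc_step(x, log_p, rng, sigma=0.5):
    """One involutive MCMC transition.

    Auxiliary variable: v ~ N(0, sigma^2).
    Involution:         f(x, v) = (x + v, -v), so f(f(x, v)) = (x, v).
    Its Jacobian has |det| = 1, so the acceptance ratio reduces to
    p(x') q(v') / (p(x) q(v)); with a symmetric q this is exactly
    random-walk Metropolis, the simplest instance of the framework."""
    v = sigma * rng.standard_normal(x.size)
    x_new, v_new = x + v, -v
    log_alpha = (log_p(x_new) + log_q(v_new, sigma)) - (log_p(x) + log_q(v, sigma))
    if np.log(rng.uniform()) < log_alpha:
        return x_new
    return x

# Toy usage: sample a 1-D standard Gaussian.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    log_p = lambda x: -0.5 * np.sum(x ** 2)
    x, chain = np.zeros(1), []
    for _ in range(5000):
        x = involutive_mcmc_step(x, log_p, rng)
        chain.append(x[0])
    print("mean ~ 0, std ~ 1:", np.mean(chain), np.std(chain))
```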
- MetFlow: A New Efficient Method for Bridging the Gap between Markov Chain Monte Carlo and Variational Inference [20.312106392307406]
We propose a new computationally efficient method to combine Variational Inference (VI) with Markov Chain Monte Carlo (MCMC)
This approach can be used with generic MCMC kernels, but is especially well suited to MetFlow, a novel family of MCMC algorithms we introduce.
arXiv Detail & Related papers (2020-02-27T16:50:30Z)