Multilevel Gibbs Sampling for Bayesian Regression
- URL: http://arxiv.org/abs/2009.12132v1
- Date: Fri, 25 Sep 2020 11:18:17 GMT
- Title: Multilevel Gibbs Sampling for Bayesian Regression
- Authors: Joris Tavernier, Jaak Simm, Adam Arany, Karl Meerbergen, Yves Moreau
- Abstract summary: The level hierarchy is created by clustering the features and/or samples of the data matrices.
The use of correlated samples is investigated for variance reduction to improve the convergence of the Markov Chain.
Speed-up is achieved on almost all tested data sets without significant loss in predictive performance.
- Score: 6.2997667081978825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian regression remains a simple but effective tool based on Bayesian
inference techniques. For large-scale applications with complicated posterior
distributions, Markov Chain Monte Carlo methods are applied. To reduce the
well-known computational burden of the Markov Chain Monte Carlo approach for
Bayesian regression, we developed a multilevel Gibbs sampler for Bayesian
regression of linear mixed models. The level hierarchy of data matrices is
created by clustering the features and/or samples of the data matrices.
Additionally, the use of correlated samples is investigated for variance
reduction to improve the convergence of the Markov Chain. In tests on a diverse
set of data sets, speed-up is achieved for almost all of them without
significant loss in predictive performance.
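At its core, such a sampler alternates draws from the full conditionals of the regression coefficients and the noise variance. As a minimal single-level sketch of that kernel (not the authors' multilevel scheme; the conjugate model, hyperparameters, and names below are illustrative assumptions), a blocked Gibbs sweep for Bayesian ridge regression could look like this:

```python
import numpy as np

def gibbs_bayesian_regression(X, y, n_iter=2000, tau2=10.0, a0=2.0, b0=2.0, seed=0):
    """Blocked Gibbs sampler for y = X @ beta + eps with eps ~ N(0, sigma2 * I),
    prior beta ~ N(0, tau2 * I), and prior sigma2 ~ InvGamma(a0, b0).
    All hyperparameter values are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    sigma2 = 1.0
    betas, sigma2s = [], []
    for _ in range(n_iter):
        # beta | sigma2, y ~ N(mu, Prec^-1) with Prec = XtX / sigma2 + I / tau2
        prec = XtX / sigma2 + np.eye(d) / tau2
        L = np.linalg.cholesky(prec)
        mu = np.linalg.solve(prec, Xty / sigma2)
        # draw from N(mu, Prec^-1) using the Cholesky factor of the precision
        beta = mu + np.linalg.solve(L.T, rng.standard_normal(d))
        # sigma2 | beta, y ~ InvGamma(a0 + n/2, b0 + ||y - X beta||^2 / 2)
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + 0.5 * n, 1.0 / (b0 + 0.5 * resid @ resid))
        betas.append(beta)
        sigma2s.append(sigma2)
    return np.array(betas), np.array(sigma2s)
```

The multilevel sampler of the paper would, roughly speaking, run such sweeps on coarsened data matrices obtained by clustering features and/or samples, and couple draws across levels so that correlated samples reduce variance.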
Related papers
- Low-rank Bayesian matrix completion via geodesic Hamiltonian Monte Carlo on Stiefel manifolds [0.18416014644193066]
We present a new sampling-based approach for enabling efficient computation of low-rank Bayesian matrix completion.
We show that our approach resolves the sampling difficulties encountered by standard Gibbs samplers for the common two-matrix factorization used in matrix completion.
Numerical examples demonstrate superior sampling performance, including better mixing and faster convergence to a stationary distribution.
arXiv Detail & Related papers (2024-10-27T03:12:53Z)
- Regression-aware Inference with LLMs [52.764328080398805]
We show that the usual inference strategy can be sub-optimal for common regression and scoring evaluation metrics.
We propose alternate inference strategies that estimate the Bayes-optimal solution for regression and scoring metrics in closed-form from sampled responses.
arXiv Detail & Related papers (2024-03-07T03:24:34Z)
- Gradient-flow adaptive importance sampling for Bayesian leave one out cross-validation with application to sigmoidal classification models [0.9895793818721335]
We introduce gradient-flow-guided adaptive importance sampling (IS) transformations for stabilizing Monte-Carlo approximations of leave-one-out predictions.
We derive closed-form exact formulae for Jacobian determinants with respect to the model Hessian.
arXiv Detail & Related papers (2024-02-13T01:03:39Z)
- Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
arXiv Detail & Related papers (2023-06-05T21:08:34Z)
- Dimension-free mixing times of Gibbs samplers for Bayesian hierarchical models [0.0]
We analyse the behaviour of total variation mixing times of Gibbs samplers targeting hierarchical models.
We obtain convergence results under random data-generating assumptions for a broad class of two-level models.
arXiv Detail & Related papers (2023-04-14T08:30:40Z)
- Langevin Monte Carlo for Contextual Bandits [72.00524614312002]
Langevin Monte Carlo Thompson Sampling (LMC-TS) is proposed to directly sample from the posterior distribution in contextual bandits.
We prove that the proposed algorithm achieves the same sublinear regret bound as the best Thompson sampling algorithms for a special case of contextual bandits (a minimal Langevin update is sketched after this list).
arXiv Detail & Related papers (2022-06-22T17:58:23Z)
- A fast asynchronous MCMC sampler for sparse Bayesian inference [10.535140830570256]
We propose a very fast approximate Markov Chain Monte Carlo (MCMC) sampling framework that is applicable to a large class of sparse Bayesian inference problems.
We show that in high-dimensional linear regression problems, the Markov chain generated by the proposed algorithm admits an invariant distribution that recovers correctly the main signal.
arXiv Detail & Related papers (2021-08-14T02:20:49Z)
- T-LoHo: A Bayesian Regularization Model for Structured Sparsity and Smoothness on Graphs [0.0]
In graph-structured data, structured sparsity and smoothness tend to cluster together.
We propose a new prior for high dimensional parameters with graphical relations.
We use it to detect structured sparsity and smoothness simultaneously.
arXiv Detail & Related papers (2021-07-06T10:10:03Z)
- Solving weakly supervised regression problem using low-rank manifold regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost (a minimal pathwise-sampling sketch appears after this list).
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
- Particle-Gibbs Sampling For Bayesian Feature Allocation Models [77.57285768500225]
Most widely used MCMC strategies rely on an element-wise Gibbs update of the feature allocation matrix.
In principle, a Gibbs sampler can update an entire row of the feature allocation matrix in a single move, but such a sampler is impractical for models with a large number of features because its computational complexity scales exponentially in the number of features.
We develop a Particle Gibbs sampler that targets the same distribution as the row-wise Gibbs updates, but has computational complexity that only grows linearly in the number of features.
arXiv Detail & Related papers (2020-01-25T22:11:51Z)
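For the "Langevin Monte Carlo for Contextual Bandits" entry above, the basic mechanism is an unadjusted Langevin step: drift along the gradient of the log-posterior plus injected Gaussian noise. A minimal sketch under a Gaussian linear-model assumption (step size, priors, and all names are our illustrative choices, not the paper's):

```python
import numpy as np

def langevin_posterior_draw(X, y, n_steps=500, step=1e-3, sigma2=1.0, tau2=10.0, seed=0):
    """Unadjusted Langevin dynamics targeting the posterior of a Bayesian
    linear model with N(0, tau2 * I) prior; illustrative hyperparameters."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(X.shape[1])
    for _ in range(n_steps):
        # gradient of the log-posterior: likelihood term plus prior term
        grad = X.T @ (y - X @ theta) / sigma2 - theta / tau2
        # Langevin update: theta <- theta + step * grad + sqrt(2 * step) * noise
        theta = theta + step * grad + np.sqrt(2.0 * step) * rng.standard_normal(theta.size)
    return theta  # approximate posterior draw, e.g. one per bandit round
```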
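Likewise, for the "Efficiently Sampling Functions from Gaussian Process Posteriors" entry, the decoupled representation can be expressed through Matheron's rule: a posterior sample equals a prior sample plus a data-dependent correction. The sketch below draws the prior part exactly for clarity, whereas the paper's cost savings come from approximating the prior path cheaply instead; the kernel, noise level, and names are illustrative assumptions:

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # squared-exponential kernel matrix between the rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def decoupled_gp_sample(X, y, Xs, noise=1e-2, seed=0):
    """Pathwise posterior sample via Matheron's rule:
    f_post(x*) = f_prior(x*) + K(x*, X) (K(X, X) + noise I)^-1 (y - f_prior(X) - eps)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    Z = np.vstack([X, Xs])
    K = rbf(Z, Z) + 1e-9 * np.eye(len(Z))  # jitter for numerical stability
    f = np.linalg.cholesky(K) @ rng.standard_normal(len(Z))  # joint prior draw
    f_X, f_s = f[:n], f[n:]
    eps = np.sqrt(noise) * rng.standard_normal(n)  # simulated observation noise
    alpha = np.linalg.solve(rbf(X, X) + noise * np.eye(n), y - f_X - eps)
    return f_s + rbf(Xs, X) @ alpha  # posterior sample at the test inputs Xs
```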
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.