Fast sampling and model selection for Bayesian mixture models
- URL: http://arxiv.org/abs/2501.07668v1
- Date: Mon, 13 Jan 2025 19:58:37 GMT
- Title: Fast sampling and model selection for Bayesian mixture models
- Authors: M. E. J. Newman
- Abstract summary: We describe two Monte Carlo algorithms for sampling from the integrated posterior distributions of a range of Bayesian mixture models.
The first algorithm is a traditional collapsed Gibbs sampler, albeit with an unusual move-set.
The second builds on the first, adding rejection-free sampling from the prior over component assignments.
- Abstract: We describe two Monte Carlo algorithms for sampling from the integrated posterior distributions of a range of Bayesian mixture models. Both algorithms allow us to directly sample not only the assignment of observations to components but also the number of components, thereby fitting the model and performing model selection over the number of components in a single computation. The first algorithm is a traditional collapsed Gibbs sampler, albeit with an unusual move-set; the second builds on the first, adding rejection-free sampling from the prior over component assignments, to create an algorithm that has excellent mixing time in typical applications and outperforms current state-of-the-art methods, in some cases by a wide margin. We demonstrate our methods with a selection of applications to latent class analysis.
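As a point of reference for the first algorithm, the sketch below shows a standard collapsed Gibbs sweep for a finite Gaussian mixture with conjugate priors, in which the mixing weights and component means are integrated out so that only the assignments are sampled. It fixes the number of components K and uses ordinary single-site moves, so it omits the paper's unusual move-set and its direct sampling of the number of components; all names are illustrative.

```python
import numpy as np

def collapsed_gibbs_sweep(x, z, K, alpha=1.0, sigma2=1.0, mu0=0.0, tau02=10.0,
                          rng=None):
    """One sweep of collapsed Gibbs sampling for a K-component 1-D Gaussian
    mixture with known variance sigma2, a N(mu0, tau02) prior on each
    component mean, and a symmetric Dirichlet(alpha/K) prior on the mixing
    weights -- all integrated out, so only the assignments z are sampled.
    x is a float array, z an integer array of current assignments."""
    rng = rng or np.random.default_rng()
    for i in range(len(x)):
        z[i] = -1                                     # remove observation i
        logp = np.empty(K)
        for k in range(K):
            members = x[z == k]
            nk = len(members)
            prec = 1.0 / tau02 + nk / sigma2          # posterior precision of mean_k
            mean = (mu0 / tau02 + members.sum() / sigma2) / prec
            var = sigma2 + 1.0 / prec                 # posterior predictive variance
            logp[k] = (np.log(nk + alpha / K)
                       - 0.5 * np.log(2 * np.pi * var)
                       - 0.5 * (x[i] - mean) ** 2 / var)
        p = np.exp(logp - logp.max())                 # stabilized softmax
        z[i] = rng.choice(K, p=p / p.sum())
    return z
```

Because the component parameters are integrated out, each reassignment conditions only on the other assignments, which is what makes the chain "collapsed"; the paper's contribution is a move-set and a rejection-free variant that mix much faster than this baseline.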
Related papers
- Unified Convergence Analysis for Score-Based Diffusion Models with Deterministic Samplers [49.1574468325115]
We introduce a unified convergence analysis framework for deterministic samplers.
Our framework achieves an iteration complexity of $\tilde{O}(d^2/\epsilon)$.
We also provide a detailed analysis of Denoising Diffusion Implicit Model (DDIM)-type samplers.
arXiv Detail & Related papers (2024-10-18T07:37:36Z)
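For context, a deterministic DDIM-type update of the kind analysed in the paper above has roughly the following form. This is a schematic single step under the usual noise-prediction parameterization; `eps_model` and the cumulative schedule `alpha_bar` are placeholders, not any particular implementation.

```python
import numpy as np

def ddim_step(x_t, t, t_prev, eps_model, alpha_bar):
    """One deterministic DDIM update (the eta = 0 case): estimate the clean
    sample from the predicted noise, then move it to the previous noise
    level.  `eps_model(x, t)` stands in for a trained noise-prediction
    network and `alpha_bar` for its cumulative noise schedule."""
    a_t, a_prev = alpha_bar[t], alpha_bar[t_prev]
    eps = eps_model(x_t, t)                                   # predicted noise
    x0_hat = (x_t - np.sqrt(1.0 - a_t) * eps) / np.sqrt(a_t)  # predicted x_0
    return np.sqrt(a_prev) * x0_hat + np.sqrt(1.0 - a_prev) * eps
```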
- You Only Accept Samples Once: Fast, Self-Correcting Stochastic Variational Inference [0.0]
YOASOVI is an algorithm for performing fast, self-correcting stochastic optimization for Variational Inference (VI) on large Bayesian hierarchical models.
To accomplish this, we take advantage of available information on the objective function used for VI at each iteration and replace regular Monte Carlo sampling with acceptance sampling.
arXiv Detail & Related papers (2024-06-05T01:28:53Z)
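A minimal sketch of the acceptance-sampling idea summarized above, assuming hypothetical `elbo_estimate` and `propose` helpers: candidate updates are kept only when they improve a Monte Carlo estimate of the VI objective, which is not the YOASOVI paper's actual interface.

```python
import numpy as np

def accept_once_step(params, elbo_estimate, propose, rng=None, max_tries=10):
    """Sketch of acceptance sampling inside a VI loop: candidate parameter
    updates are drawn and one is accepted only if its Monte Carlo estimate
    of the objective (the ELBO) improves on the current value.
    `elbo_estimate(params, rng)` and `propose(params, rng)` are
    hypothetical user-supplied functions."""
    rng = rng or np.random.default_rng()
    current = elbo_estimate(params, rng)
    for _ in range(max_tries):
        candidate = propose(params, rng)
        if elbo_estimate(candidate, rng) > current:
            return candidate      # accept the first improving sample
    return params                 # otherwise self-correct: keep current params
```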
- Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Since these models are approximations of reality, it is desirable to apply multiple models to the same data and integrate their outcomes.
arXiv Detail & Related papers (2024-03-03T04:21:21Z)
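A minimal sketch of Monte Carlo fusion of multiple predictive distributions, under the simplifying assumption that each model reports a Gaussian predictive; the paper's actual weighting and fusion scheme may differ.

```python
import numpy as np

def fuse_gaussian_predictions(means, variances, weights, n_samples=1000,
                              rng=None):
    """Sketch of combining several (e.g. Gaussian-process) predictive
    distributions by Monte Carlo: sample a model index according to its
    weight, then sample from that model's Gaussian predictive.  `weights`
    must sum to one.  A generic mixture illustration only."""
    rng = rng or np.random.default_rng()
    means, variances = np.asarray(means), np.asarray(variances)
    idx = rng.choice(len(means), size=n_samples, p=weights)
    return rng.normal(means[idx], np.sqrt(variances[idx]))
```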
- Dimension-free mixing times of Gibbs samplers for Bayesian hierarchical models [0.0]
We analyse the behaviour of total variation mixing times of Gibbs samplers targeting hierarchical models.
We obtain convergence results under random data-generating assumptions for a broad class of two-level models.
arXiv Detail & Related papers (2023-04-14T08:30:40Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
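The two-step recipe above can be illustrated generically; in this sketch a kernel density estimate stands in for the paper's PSD model.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Sketch of the generic two-step recipe: (1) fit a density model to
# evaluations of the target, (2) sample from the fitted model.  A kernel
# density estimate stands in here for the paper's PSD model.
rng = np.random.default_rng(0)
evals = rng.normal(0.0, 1.0, size=500)  # stand-in evaluations of the target
model = gaussian_kde(evals)             # step 1: model the distribution
samples = model.resample(1000)          # step 2: sample from the model
```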
- Direct sampling of projected entangled-pair states [0.0]
Variational Monte Carlo studies employing projected entangled-pair states (PEPS) have recently shown that they can provide answers to long-standing questions.
We propose a sampling algorithm that generates independent samples from a PEPS, bypassing all problems related to finite autocorrelation times.
arXiv Detail & Related papers (2021-09-15T15:09:20Z)
- Finding Geometric Models by Clustering in the Consensus Space [61.65661010039768]
We propose a new algorithm for finding an unknown number of geometric models, e.g., homographies.
We present a number of applications where the use of multiple geometric models improves accuracy.
These include pose estimation from multiple generalized homographies and trajectory estimation of fast-moving objects.
arXiv Detail & Related papers (2021-03-25T14:35:07Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
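The decoupled-sampling idea above can be sketched with Matheron's rule, which writes a posterior sample as a prior sample plus a data-driven correction. Here the joint prior sample is drawn exactly for clarity, whereas the paper's point is that it can be drawn cheaply and approximately (e.g. with random features); the `rbf` kernel is a stand-in covariance function.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    """Squared-exponential kernel on 1-D inputs (stand-in covariance)."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def decoupled_posterior_sample(X, y, Xs, noise=1e-2, rng=None):
    """Sketch of pathwise GP posterior sampling via Matheron's rule: draw
    a joint prior sample over training inputs X and test inputs Xs, then
    correct it with the residual of the data y."""
    rng = rng or np.random.default_rng()
    n, m = len(X), len(Xs)
    Z = np.concatenate([X, Xs])
    K = rbf(Z, Z) + 1e-10 * np.eye(n + m)            # joint prior covariance
    f = rng.multivariate_normal(np.zeros(n + m), K)  # joint prior sample
    f_X, f_s = f[:n], f[n:]
    eps = rng.normal(0.0, np.sqrt(noise), size=n)    # noise on the prior draw
    Kxx = rbf(X, X) + noise * np.eye(n)
    # Matheron's rule: posterior sample = prior sample + data correction
    return f_s + rbf(Xs, X) @ np.linalg.solve(Kxx, y - f_X - eps)
```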
- Learning Gaussian Graphical Models via Multiplicative Weights [54.252053139374205]
We adapt an algorithm of Klivans and Meka based on the method of multiplicative weight updates.
The algorithm enjoys a sample complexity bound that is qualitatively similar to others in the literature.
It has a low runtime of $O(mp^2)$ for $m$ samples and $p$ nodes, and can trivially be implemented in an online manner.
arXiv Detail & Related papers (2020-02-20T10:50:58Z)
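For reference, the multiplicative-weights primitive that a Klivans--Meka-style learner builds on looks as follows; this is the textbook Hedge update over experts, not the paper's full graphical-model algorithm.

```python
import numpy as np

def hedge(losses, eta=0.1):
    """Textbook multiplicative-weights (Hedge) update.  `losses` has shape
    (rounds, experts); returns the final weight distribution over experts."""
    w = np.ones(losses.shape[1])
    for loss in losses:
        w *= np.exp(-eta * loss)   # downweight experts with high loss
        w /= w.sum()               # renormalize to a distribution
    return w
```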
This list is automatically generated from the titles and abstracts of the papers on this site. This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.