Sampling via Gaussian Mixture Approximations
- URL: http://arxiv.org/abs/2509.25232v1
- Date: Thu, 25 Sep 2025 04:13:31 GMT
- Title: Sampling via Gaussian Mixture Approximations
- Authors: Yongchao Huang
- Abstract summary: We present a family of GMA samplers for sampling unnormalised target densities. We show that this optimisation-resampling scheme yields consistent approximations under mild conditions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a family of Gaussian Mixture Approximation (GMA) samplers for sampling unnormalised target densities, encompassing weights-only GMA (W-GMA), Laplace Mixture Approximation (LMA), expectation-maximization GMA (EM-GMA), and further variants. GMA adopts a simple two-stage paradigm: (i) initialise a finite set of Gaussian components and draw samples from a proposal mixture; (ii) fit the mixture to the target by optimising either only the component weights, or also the means and variances, via a sample-based KL divergence objective that requires only evaluations of the unnormalised density, followed by stratified resampling. The method is gradient-free and computationally efficient: it leverages the ease of sampling from Gaussians, efficient optimisation methods (projected gradient descent, mirror descent, and EM), and the robustness of stratified resampling to produce samples faithful to the target. We show that this optimisation-resampling scheme yields consistent approximations under mild conditions, and we validate the methodology with empirical results demonstrating accuracy and speed across diverse densities.
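The two-stage paradigm in the abstract can be made concrete with a minimal one-dimensional sketch of the weights-only variant (W-GMA). The toy bimodal target, the component grid, the step size, the iteration counts, and the exact form of the sample-based KL estimator and resampling step below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative unnormalised target (normalising constant unknown to the sampler):
# two modes at x = -2 and x = +2 with sigma = 0.5.
def log_p_tilde(x):
    return np.logaddexp(-(x + 2.0) ** 2 / 0.5, -(x - 2.0) ** 2 / 0.5)

# Stage (i): fix a grid of K Gaussian components and draw N samples from the
# uniform-weight proposal mixture.
K, N = 9, 4000
mus, sigma = np.linspace(-4.0, 4.0, K), 1.0

def comp_dens(x):
    # phi[k, i] = density of component k evaluated at sample x_i
    return np.exp(-(x[None, :] - mus[:, None]) ** 2 / (2 * sigma**2)) / (
        sigma * np.sqrt(2 * np.pi)
    )

x = mus[rng.integers(K, size=N)] + sigma * rng.standard_normal(N)
phi = comp_dens(x)                       # shape (K, N)
q0 = phi.mean(axis=0)                    # proposal density at the samples

# Stage (ii): optimise only the mixture weights by mirror descent on a
# sample-based estimate of KL(q_w || p); only log_p_tilde evaluations are needed.
w = np.full(K, 1.0 / K)
for _ in range(300):
    qw = w @ phi
    grad = (phi / q0 * (np.log(qw) - log_p_tilde(x) + 1.0)).mean(axis=1)
    w = w * np.exp(-0.05 * grad)         # exponentiated-gradient step ...
    w /= w.sum()                         # ... keeps w on the probability simplex

# Draw from the fitted mixture, then stratified-resample with self-normalised
# importance weights so the final samples follow the target more faithfully.
def stratified_resample(weights, rng):
    n = len(weights)
    positions = (rng.random(n) + np.arange(n)) / n
    c = np.cumsum(weights)
    c[-1] = 1.0                          # guard against floating-point round-off
    return np.searchsorted(c, positions)

y = mus[rng.choice(K, size=N, p=w)] + sigma * rng.standard_normal(N)
iw = np.exp(log_p_tilde(y) - np.log(w @ comp_dens(y)))
samples = y[stratified_resample(iw / iw.sum(), rng)]
```

The only interaction with the target is through `log_p_tilde`, so the sketch is gradient-free with respect to the target, and the multiplicative mirror-descent update keeps the weights non-negative and summing to one without an explicit projection.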
Related papers
- Mixtures Closest to a Given Measure: A Semidefinite Programming Approach [1.7969777786551424]
We study the problem of approximating a target measure, available only through finitely many of its moments. Unlike many existing approaches, the parameter set is not assumed to be finite. We present an application to clustering, where our framework serves as a stand-alone method or as a preprocessing step.
arXiv Detail & Related papers (2025-09-26T19:51:21Z)
- Restricted Spectral Gap Decomposition for Simulated Tempering Targeting Mixture Distributions [3.7577421880330535]
We consider simulated tempering combined with an arbitrary local Markov chain Monte Carlo sampler. We present a new decomposition theorem that provides a lower bound on the restricted spectral gap of the algorithm for sampling from mixture distributions.
arXiv Detail & Related papers (2025-05-21T03:28:55Z)
- Scalable Importance Sampling in High Dimensions with Low-Rank Mixture Proposals [37.634056981112444]
Importance sampling is a Monte Carlo technique for efficiently estimating the likelihood of rare events. We propose using mixtures of probabilistic principal component analyzers (MPPCA) as the parametric proposal density for importance sampling methods. We validate our method on three simulated systems, demonstrating consistent gains in sample efficiency and quality of failure distribution characterization.
arXiv Detail & Related papers (2025-05-19T16:44:48Z)
- End-To-End Learning of Gaussian Mixture Priors for Diffusion Sampler [15.372235873766812]
Learnable mixture priors offer improved control over exploration, adaptability to target support, and increased expressiveness to counteract mode collapse. Our experimental results demonstrate significant performance improvements across a diverse range of real-world and synthetic benchmark problems.
arXiv Detail & Related papers (2025-03-01T14:58:14Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
We propose Iterated Denoising Energy Matching (iDEM).
iDEM alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Differentiating Metropolis-Hastings to Optimize Intractable Densities [51.16801956665228]
We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers.
We apply gradient-based optimization to objectives expressed as expectations over intractable target densities.
arXiv Detail & Related papers (2023-06-13T17:56:02Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer steps for sampling.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z)
- Rethinking Collaborative Metric Learning: Toward an Efficient Alternative without Negative Sampling [156.7248383178991]
The Collaborative Metric Learning (CML) paradigm has aroused wide interest in the area of recommendation systems (RS).
We find that negative sampling would lead to a biased estimation of the generalization error.
Motivated by this, we propose an efficient alternative without negative sampling for CML named Sampling-Free Collaborative Metric Learning (SFCML).
arXiv Detail & Related papers (2022-06-23T08:50:22Z) - A Proximal Algorithm for Sampling [14.909442791255042]
We study sampling problems associated with potentials that lack smoothness.
The potentials can be either convex or non-convex.
Our algorithm is based on a special case of rejection sampling.
arXiv Detail & Related papers (2022-02-28T17:26:09Z)
- Gaussian Mixture Estimation from Weighted Samples [9.442139459221785]
We consider estimating the parameters of a Gaussian mixture density with a given number of components best representing a given set of weighted samples.
We adopt a density interpretation of the samples by viewing them as a discrete Dirac mixture density over a continuous domain with weighted components.
An expectation-maximization method is proposed that properly considers not only the sample locations, but also the corresponding weights.
arXiv Detail & Related papers (2021-06-09T14:38:46Z)
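The weighted-sample EM idea in the last entry above can be sketched in a few lines: sample weights enter every M-step average alongside the responsibilities. The two-component model, the synthetic Dirac-mixture "data", and all constants below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Synthetic weighted samples (the Dirac-mixture view of a density): locations on
# a grid, weights proportional to a bimodal density evaluated there.
x = np.linspace(-6.0, 6.0, 400)
v = np.exp(-(x + 2.0) ** 2 / 2) + np.exp(-(x - 2.0) ** 2 / 2)
v /= v.sum()

# Weighted EM for a 2-component 1-D Gaussian mixture.
pi, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(100):
    # E-step: responsibilities r[i, k], as in standard EM.
    dens = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = pi * dens
    r /= r.sum(axis=1, keepdims=True)
    # M-step: each sample contributes v_i * r_ik instead of r_ik alone.
    nk = (v[:, None] * r).sum(axis=0)
    pi = nk                                   # v already sums to 1
    mu = (v[:, None] * r * x[:, None]).sum(axis=0) / nk
    var = (v[:, None] * r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
```

With unit sample weights this reduces to ordinary EM; here the weights steer the fit toward the two modes at $\pm 2$, recovering means near $\pm 2$, variances near $1$, and mixing proportions near $0.5$.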