discretize_distributions: Efficient Quantization of Gaussian Mixtures with Guarantees in Wasserstein Distance
- URL: http://arxiv.org/abs/2511.15854v1
- Date: Wed, 19 Nov 2025 20:23:11 GMT
- Title: discretize_distributions: Efficient Quantization of Gaussian Mixtures with Guarantees in Wasserstein Distance
- Authors: Steven Adams, Elize Alwash, Luca Laurenti
- Abstract summary: discretize_distributions is a Python package that constructs discrete approximations of Gaussian mixture distributions. We show that discretize_distributions produces accurate approximations at low computational cost.
- Score: 5.020978032396304
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present discretize_distributions, a Python package that efficiently constructs discrete approximations of Gaussian mixture distributions and provides guarantees on the approximation error in Wasserstein distance. The package implements state-of-the-art quantization methods for Gaussian mixture models and extends them to improve scalability. It further integrates complementary quantization strategies such as sigma-point methods and provides a modular interface that supports custom schemes and integration into control and verification pipelines for cyber-physical systems. We benchmark the package on various examples, including high-dimensional, large, and degenerate Gaussian mixtures, and demonstrate that discretize_distributions produces accurate approximations at low computational cost.
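The package's actual API is not reproduced here. As a rough, self-contained illustration of the task it automates, the following numpy/scipy sketch quantizes a one-dimensional Gaussian mixture onto a uniform grid and evaluates the resulting error via the one-dimensional identity W1(P, Q) = integral of |F_P(x) - F_Q(x)| dx. The mixture parameters, grid size, and support are arbitrary choices for illustration.

```python
# Illustrative only: quantize a 1D Gaussian mixture on a uniform grid and
# measure the 1-Wasserstein error. discretize_distributions automates this
# (including the multivariate case) with formal bounds; its API is not shown.
import numpy as np
from scipy.stats import norm

# Two-component Gaussian mixture: weights, means, standard deviations.
w, mu, sigma = np.array([0.3, 0.7]), np.array([-2.0, 1.5]), np.array([0.5, 1.0])

def mix_cdf(x):
    return sum(wi * norm.cdf(x, mi, si) for wi, mi, si in zip(w, mu, sigma))

# Atoms on a uniform grid; each atom receives the mixture mass of its
# Voronoi cell (the interval between midpoints of neighboring atoms).
grid = np.linspace(-5.0, 5.0, 40)
edges = np.concatenate(([-np.inf], (grid[:-1] + grid[1:]) / 2, [np.inf]))
probs = np.diff(mix_cdf(edges))

# In 1D, W1(P, Q) is the integral of |F_P - F_Q|; approximate it with a
# Riemann sum on a fine evaluation grid.
x = np.linspace(-8.0, 8.0, 4001)
cum = np.concatenate(([0.0], np.cumsum(probs)))
discrete_cdf = cum[np.searchsorted(grid, x, side="right")]
w1 = np.sum(np.abs(mix_cdf(x) - discrete_cdf)[:-1] * np.diff(x))
print(f"W1 error of the 40-atom quantization: {w1:.4f}")
```

Refining the grid shrinks the reported W1 error, which is the trade-off between support size and accuracy that the package's guarantees quantify.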
Related papers
- Wasserstein Regression as a Variational Approximation of Probabilistic Trajectories through the Bernstein Basis [41.99844472131922]
Existing approaches often ignore the geometry of the probability space or are computationally expensive. A new method is proposed that combines a Bernstein-basis parameterization of probability trajectories with minimization of the Wasserstein distance between distributions. The developed approach combines geometric accuracy, computational practicality, and interpretability. A minimal sketch of the Bernstein-basis idea follows this entry.
arXiv Detail & Related papers (2025-10-30T15:36:39Z)
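The sketch below is a hedged illustration of that idea, not the paper's code: a trajectory of one-dimensional Gaussians N(m(t), s(t)^2) is parameterized by Bernstein polynomials, and because the squared 2-Wasserstein distance between 1D Gaussians has the closed form (m1 - m2)^2 + (s1 - s2)^2, fitting the control points reduces to two linear least-squares problems. The degree, data, and positivity handling are illustrative assumptions.

```python
# Sketch of the Bernstein-basis idea (illustrative, not the paper's code).
# For 1D Gaussians, W2^2(N(m1,s1^2), N(m2,s2^2)) = (m1-m2)^2 + (s1-s2)^2,
# so minimizing summed squared W2 over Bernstein control points splits into
# independent least-squares fits for the mean and std trajectories.
import numpy as np
from scipy.special import comb

def bernstein_matrix(t, n):
    """Row i, column k: Bernstein basis B_{k,n} evaluated at t[i]."""
    t = np.asarray(t)[:, None]
    k = np.arange(n + 1)[None, :]
    return comb(n, k) * t**k * (1 - t)**(n - k)

# Observed Gaussians along the trajectory (times, means, standard deviations).
t_obs = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
m_obs = np.array([0.0, 0.8, 1.0, 0.7, 0.0])
s_obs = np.array([1.0, 0.6, 0.5, 0.6, 1.0])

B = bernstein_matrix(t_obs, n=3)  # cubic Bernstein basis
c_mean, *_ = np.linalg.lstsq(B, m_obs, rcond=None)
c_std, *_ = np.linalg.lstsq(B, s_obs, rcond=None)  # a real method keeps s(t) > 0

B_new = bernstein_matrix([0.6], n=3)
print("interpolated mean, std at t=0.6:",
      (B_new @ c_mean).item(), (B_new @ c_std).item())
```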
- Mixtures Closest to a Given Measure: A Semidefinite Programming Approach [1.7969777786551424]
We study the problem of approximating a target measure, available only through finitely many of its moments. Unlike many existing approaches, the parameter set is not assumed to be finite. We present an application to clustering, where our framework serves as a stand-alone method or as a preprocessing step.
arXiv Detail & Related papers (2025-09-26T19:51:21Z)
- Steering Large Agent Populations using Mean-Field Schrodinger Bridges with Gaussian Mixture Models [13.03355083378673]
The Mean-Field Schrodinger Bridge (MFSB) problem is an optimization problem that seeks the minimum-effort control policy. In the context of multi-agent control, the objective is to control the configuration of a swarm of identical, interacting cooperative agents.
arXiv Detail & Related papers (2025-03-31T04:01:04Z)
- Weighted quantization using MMD: From mean field to mean shift via gradient flows [5.216151302783165]
We show that a Wasserstein-Fisher-Rao gradient flow is well suited for designing quantizations that are optimal under MMD. We derive a new fixed-point algorithm called mean shift interacting particles (MSIP). Our unification of gradient flows, mean shift, and MMD-optimal quantization yields algorithms more robust than state-of-the-art methods. A generic gradient-descent sketch of MMD quantization follows this entry.
arXiv Detail & Related papers (2025-02-14T23:13:20Z)
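As a generic illustration of MMD-based quantization (plain gradient descent, not the paper's MSIP fixed-point update), the sketch below moves a small particle set so that its uniform measure matches a sampled target under MMD with a Gaussian kernel. The kernel bandwidth, step size, and target mixture are arbitrary choices.

```python
# Generic MMD quantization by plain gradient descent (illustrative; the
# paper's MSIP replaces this with a more robust mean-shift-like update).
import numpy as np

def gauss_kernel(A, B, h):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h**2))

def mmd_grad(X, Y, h):
    """Gradient of MMD^2(uniform on X, uniform on Y) w.r.t. the particles X."""
    n, m = len(X), len(Y)
    Kxx, Kxy = gauss_kernel(X, X, h), gauss_kernel(X, Y, h)
    dxx = X[:, None, :] - X[None, :, :]
    dxy = X[:, None, :] - Y[None, :, :]
    # Repulsion between particles, attraction toward the target sample.
    g = -(Kxx[..., None] * dxx).sum(1) / n**2 \
        + (Kxy[..., None] * dxy).sum(1) / (n * m)
    return 2.0 * g / h**2

rng = np.random.default_rng(0)
# Target: sample from a two-component Gaussian mixture in 2D.
pick = rng.random(2000) < 0.5
Y = np.where(pick[:, None], rng.normal([-2.0, 0.0], 0.7, (2000, 2)),
             rng.normal([2.0, 1.0], 1.0, (2000, 2)))
X = rng.normal(size=(30, 2))          # 30 quantization particles
for _ in range(300):
    X -= 5.0 * mmd_grad(X, Y, h=1.0)  # descend on the particle locations
print("particle mean vs target mean:", X.mean(0), Y.mean(0))
```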
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities. A sketch of the classical linear-Gaussian special case, where closed-form filtering is elementary, follows this entry.
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
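The Gaussian PSD model machinery is beyond a short snippet; for intuition about what "closed-form filtering" means in the Gaussian case, the sketch below shows the textbook special case of a scalar linear-Gaussian system with its Kalman filter recursion. All system parameters are arbitrary; this is not the paper's method.

```python
# Classical linear-Gaussian closed-form filtering (Kalman filter), shown
# only for intuition; Gaussian PSD models extend far beyond this setting.
import numpy as np

A, C = 0.9, 1.0          # scalar transition and observation maps
Q, R = 0.1, 0.2          # process and observation noise variances
mean, var = 0.0, 1.0     # Gaussian prior on the initial state

rng = np.random.default_rng(1)
x = 0.0
for _ in range(50):
    # Simulate the true system and a noisy observation.
    x = A * x + rng.normal(scale=np.sqrt(Q))
    y = C * x + rng.normal(scale=np.sqrt(R))
    # Predict: push the Gaussian belief through the linear dynamics.
    mean, var = A * mean, A * A * var + Q
    # Update: condition the Gaussian belief on the observation.
    k = var * C / (C * C * var + R)        # Kalman gain
    mean, var = mean + k * (y - C * mean), (1 - k * C) * var
print(f"final estimate {mean:.3f} vs true state {x:.3f}")
```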
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Variational Gaussian filtering via Wasserstein gradient flows [6.023171219551961]
We present a novel approach to approximate Gaussian and mixture-of-Gaussians filtering.
Our method relies on a variational approximation via a gradient-flow representation.
arXiv Detail & Related papers (2023-03-11T12:22:35Z)
- Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees [57.67528738886731]
We study the numerical stability of scalable sparse approximations based on inducing points.
For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions. A naive greedy version of the separation idea is sketched after this entry.
arXiv Detail & Related papers (2022-10-14T15:20:17Z)
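The sketch below illustrates the minimum-separation constraint with a naive greedy pass rather than the paper's cover-tree construction: a candidate is kept only if it lies at least min_sep away from every point kept so far. The threshold and data are arbitrary.

```python
# Naive greedy selection of well-separated inducing points (illustrative;
# the paper uses a cover tree for efficiency and stability guarantees).
import numpy as np

def greedy_inducing_points(candidates, min_sep):
    kept = [candidates[0]]
    for z in candidates[1:]:
        # Keep z only if it respects the minimum separation from all kept points.
        if min(np.linalg.norm(z - u) for u in kept) >= min_sep:
            kept.append(z)
    return np.array(kept)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))                 # training inputs
Z = greedy_inducing_points(X, min_sep=0.5)    # well-separated inducing set
print(f"{len(Z)} inducing points selected from {len(X)} inputs")
```

Enforcing the separation keeps the kernel matrix on Z well conditioned, which is the numerical-stability property the paper formalizes.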
- A Stochastic Newton Algorithm for Distributed Convex Optimization [62.20732134991661]
We analyze a Newton algorithm for homogeneous distributed convex optimization, where each machine can calculate gradients of the same population objective.
We show that our method can reduce the number and frequency of required communication rounds compared to existing methods without hurting performance.
arXiv Detail & Related papers (2021-10-07T17:51:10Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Large-Scale Wasserstein Gradient Flows [84.73670288608025]
We introduce a scalable scheme to approximate Wasserstein gradient flows.
Our approach relies on input convex neural networks (ICNNs) to discretize the JKO steps.
As a result, we can sample from the measure at each step of the gradient diffusion and compute its density.
arXiv Detail & Related papers (2021-06-01T19:21:48Z)
- Distributed Sketching Methods for Privacy Preserving Regression [54.51566432934556]
We leverage randomized sketches for reducing the problem dimensions as well as preserving privacy and improving straggler resilience in asynchronous distributed systems.
We derive novel approximation guarantees for classical sketching methods and analyze the accuracy of parameter averaging for distributed sketches.
We illustrate the performance of distributed sketches in a serverless computing platform with large-scale experiments. A minimal simulation of the sketch-and-average scheme follows this entry.
arXiv Detail & Related papers (2020-02-16T08:35:48Z)
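Below is a minimal single-machine simulation of the idea, under illustrative assumptions (Gaussian sketches, ordinary least squares, synthetic data; not the paper's code): each of several workers solves a sketched least-squares problem and the server averages the solutions.

```python
# Sketch-and-average for distributed least squares (illustrative assumptions:
# Gaussian sketching matrices, synthetic data, serial simulation of workers).
import numpy as np

rng = np.random.default_rng(0)
n, d, m, workers = 5000, 20, 200, 8
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

estimates = []
for _ in range(workers):
    # Each worker compresses the problem from n rows to m << n rows.
    S = rng.normal(size=(m, n)) / np.sqrt(m)
    x_hat, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    estimates.append(x_hat)
x_avg = np.mean(estimates, axis=0)   # server-side parameter averaging
print("error of averaged estimate:", np.linalg.norm(x_avg - x_true))
```

Averaging reduces the variance introduced by the random sketches, which is the accuracy effect the paper analyzes.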
This list is automatically generated from the titles and abstracts of the papers on this site.