Gaussian Mixture Reduction with Composite Transportation Divergence
- URL: http://arxiv.org/abs/2002.08410v5
- Date: Tue, 17 Oct 2023 01:47:08 GMT
- Title: Gaussian Mixture Reduction with Composite Transportation Divergence
- Authors: Qiong Zhang, Archer Gong Zhang, Jiahua Chen
- Abstract summary: We propose a novel optimization-based GMR method based on composite transportation divergence (CTD).
We develop a majorization-minimization algorithm for computing the reduced mixture and establish its theoretical convergence.
Our unified framework empowers users to select the most appropriate cost function in CTD to achieve superior performance.
- Score: 15.687740538194413
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian mixtures are widely used for approximating density functions in
various applications such as density estimation, belief propagation, and
Bayesian filtering. These applications often utilize Gaussian mixtures as
initial approximations that are updated recursively. A key challenge in these
recursive processes stems from the exponential increase in the mixture's order,
resulting in intractable inference. To overcome the difficulty, the Gaussian
mixture reduction (GMR), which approximates a high order Gaussian mixture by
one with a lower order, can be used. Although existing clustering-based methods
are known for their satisfactory performance and computational efficiency,
their convergence properties and optimal targets remain unknown. In this paper,
we propose a novel optimization-based GMR method based on composite
transportation divergence (CTD). We develop a majorization-minimization
algorithm for computing the reduced mixture and establish its theoretical
convergence under general conditions. Furthermore, we demonstrate that many
existing clustering-based methods are special cases of ours, effectively
bridging the gap between optimization-based and clustering-based techniques.
Our unified framework empowers users to select the most appropriate cost
function in CTD to achieve superior performance in their specific applications.
Through extensive empirical experiments, we demonstrate the efficiency and
effectiveness of our proposed method, showcasing its potential in various
domains.
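To make the MM iteration concrete, the sketch below implements one simple instance of it in Python/NumPy, assuming the CTD cost is the Gaussian KL divergence and the transportation plan is restricted to hard assignments. Under these assumptions the majorization step reduces to assigning each original component to its nearest reduced component, and the minimization step to a moment-matching merge, i.e. the clustering-type special case the abstract alludes to. The function names, initialization, and iteration count are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) between Gaussians."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def reduce_mixture(w, mus, Sigmas, m, n_iter=50, seed=0):
    """Reduce an n-component Gaussian mixture (weights w, means mus,
    covariances Sigmas) to m components by alternating (i) assignment
    of each original component to its closest reduced component under
    the KL cost and (ii) moment-matching merges: one instance of the
    MM iteration for CTD with a KL cost and hard transportation plans."""
    rng = np.random.default_rng(seed)
    n, d = mus.shape
    # Initialize the reduced mixture from m randomly chosen components.
    idx = rng.choice(n, size=m, replace=False)
    r_mu, r_S = mus[idx].copy(), Sigmas[idx].copy()
    r_w = np.full(m, 1.0 / m)
    for _ in range(n_iter):
        # Majorization step: hard transportation plan, i.e. assign each
        # original component to the reduced component with smallest KL cost.
        cost = np.array([[kl_gauss(mus[i], Sigmas[i], r_mu[j], r_S[j])
                          for j in range(m)] for i in range(n)])
        assign = cost.argmin(axis=1)
        # Minimization step: the KL barycenter of the assigned Gaussians
        # is their moment-matched merge.
        r_w = np.zeros(m)
        for j in range(m):
            members = np.where(assign == j)[0]
            if members.size == 0:
                continue  # keep the previous component if nothing is assigned
            wj = w[members]
            r_w[j] = wj.sum()
            p = wj / wj.sum()
            r_mu[j] = p @ mus[members]
            diffs = mus[members] - r_mu[j]
            r_S[j] = (np.einsum('k,kij->ij', p, Sigmas[members])
                      + diffs.T @ (diffs * p[:, None]))
        r_w = r_w / r_w.sum()
    return r_w, r_mu, r_S
```

Swapping a different cost function in place of kl_gauss yields other members of the same framework; the KL cost used here is only one possible choice.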
Related papers
- Stochastic interior-point methods for smooth conic optimization with applications [3.294420397461204]
We introduce a stochastic interior-point method (SIPM) framework for general conic optimization, along with four novel SIPM variants.
Under suitable assumptions, we establish the global convergence rates of the proposed SIPMs.
Experiments on robust linear regression, multi-task relationship learning, and clustering data streams demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2024-12-17T15:06:44Z) - Vector Optimization with Gaussian Process Bandits [7.049738935364297]
Learning problems in which multiple objectives must be considered simultaneously often arise in various fields, including engineering, drug design, and environmental management.
Traditional methods for dealing with multiple black-box objective functions have limitations in incorporating objective preferences and exploring the solution space accordingly.
We propose Vector Optimization with Gaussian Process (VOGP), a probably approximately correct adaptive elimination algorithm that performs black-box vector optimization using Gaussian process bandits.
arXiv Detail & Related papers (2024-12-03T14:47:46Z) - Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z) - A New Probabilistic Distance Metric With Application In Gaussian Mixture
Reduction [25.07521362926412]
This paper presents a new distance metric to compare two continuous probability density functions.
The main advantage of this metric is that it can provide an analytic, closed-form expression for a mixture of Gaussian distributions.
arXiv Detail & Related papers (2023-06-12T17:50:09Z) - Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
arXiv Detail & Related papers (2022-12-02T07:31:39Z) - Tree ensemble kernels for Bayesian optimization with known constraints
over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piece-wise constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
arXiv Detail & Related papers (2022-07-02T16:59:37Z) - Scalable Variational Gaussian Processes via Harmonic Kernel
Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on a few data points.
The approach also leads to a significantly smaller and computationally cheaper sub-solver for lower bounding.
In total, the time to convergence of the proposed method is reduced by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z) - A fast and efficient Modal EM algorithm for Gaussian mixtures [0.0]
In the modal approach to clustering, clusters are defined as the local maxima of the underlying probability density function.
The Modal EM algorithm is an iterative procedure that can identify the local maxima of any density function.
arXiv Detail & Related papers (2020-02-10T08:34:16Z)
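The Modal EM entry above only names the procedure; for concreteness, here is a minimal sketch of the standard Modal EM fixed-point update for a Gaussian mixture, reconstructed from the published description of the algorithm rather than taken from the cited paper (the function name and defaults are assumptions). Starting from any point, the iteration alternates a responsibility ("E") step with a precision-weighted mean ("M") step and converges to a local mode of the density.

```python
import numpy as np
from scipy.stats import multivariate_normal

def modal_em_mode(x0, weights, mus, Sigmas, n_iter=100, tol=1e-8):
    """Ascend from x0 to a local mode of a Gaussian mixture density
    using the Modal EM fixed-point iteration."""
    x = np.asarray(x0, dtype=float)
    Sig_inv = np.array([np.linalg.inv(S) for S in Sigmas])
    for _ in range(n_iter):
        # "E-step": posterior responsibility of each component at x.
        dens = np.array([w * multivariate_normal.pdf(x, m, S)
                         for w, m, S in zip(weights, mus, Sigmas)])
        p = dens / dens.sum()
        # "M-step": maximize sum_k p_k log phi_k(x) over x; for Gaussian
        # components this is a precision-weighted average of the means.
        A = np.einsum('k,kij->ij', p, Sig_inv)
        b = np.einsum('k,kij,kj->i', p, Sig_inv, mus)
        x_new = np.linalg.solve(A, b)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

Ascending from every component mean and grouping the starting points that reach the same mode yields a modal clustering of the mixture.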