Optimal Transport for Kernel Gaussian Mixture Models
- URL: http://arxiv.org/abs/2310.18586v1
- Date: Sat, 28 Oct 2023 04:31:49 GMT
- Title: Optimal Transport for Kernel Gaussian Mixture Models
- Authors: Jung Hun Oh, Rena Elkin, Anish Kumar Simhal, Jiening Zhu, Joseph O. Deasy, Allen Tannenbaum
- Abstract summary: Wasserstein distance from optimal mass transport is a powerful mathematical tool.
We propose a Wasserstein-type metric to compute the distance between two Gaussian mixtures in a kernel Hilbert space.
- Score: 1.631115063641726
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The Wasserstein distance from optimal mass transport (OMT) is a powerful
mathematical tool with numerous applications, providing a natural measure of
the distance between two probability distributions. Several methods to
incorporate OMT into widely used probabilistic models, such as Gaussian or
Gaussian mixture, have been developed to enhance the capability of modeling
complex multimodal densities of real datasets. However, very few studies have
explored the OMT problems in a reproducing kernel Hilbert space (RKHS), wherein
the kernel trick is utilized to avoid the need to explicitly map input data
into a high-dimensional feature space. In the current study, we propose a
Wasserstein-type metric to compute the distance between two Gaussian mixtures
in an RKHS via the kernel trick, i.e., kernel Gaussian mixture models.
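The metric proposed in the paper builds on the classical closed-form 2-Wasserstein distance between two Gaussians, $W_2^2 = \|m_1 - m_2\|^2 + \mathrm{Tr}(\Sigma_1 + \Sigma_2 - 2(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2})^{1/2})$. The sketch below computes that standard input-space formula (not the paper's kernelized version) and is a minimal illustration only:

```python
import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, S1, m2, S2):
    """Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2)."""
    s1_half = sqrtm(S1)
    cross = sqrtm(s1_half @ S2 @ s1_half)
    # Bures term: Tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2});
    # take the real part since sqrtm can return tiny imaginary round-off.
    bures = np.trace(S1 + S2 - 2.0 * np.real(cross))
    return float(np.sqrt(np.sum((m1 - m2) ** 2) + max(bures, 0.0)))

# Identical covariances: the Bures term vanishes and the distance
# reduces to the Euclidean distance between the means, here 3.
m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.array([3.0, 0.0]), np.eye(2)
print(w2_gaussian(m1, S1, m2, S2))  # → 3.0
```

For mixtures, a common construction (used by discrete Wasserstein-type metrics over mixtures) treats each component Gaussian as a point with this pairwise cost and solves a small discrete OT problem over the mixture weights.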
Related papers
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Weighted Riesz Particles [0.0]
We consider the target distribution as a mapping where the infinite-dimensional space of the parameters consists of a number of deterministic submanifolds.
We study the properties of the point, called Riesz, and embed it into sequential MCMC.
We find that there will be higher acceptance rates with fewer evaluations.
arXiv Detail & Related papers (2023-12-01T14:36:46Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers for diffusion models, called Gaussian Mixture Solvers (GMS).
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Nonstationary multi-output Gaussian processes via harmonizable spectral mixtures [0.0]
We develop a nonstationary extension of the Multi-output Spectral Mixture kernel (MOSM) arXiv:1709.01298.
The proposed harmonizable kernels automatically identify possible nonstationary behaviour, meaning that practitioners do not need to choose between stationary and non-stationary kernels.
arXiv Detail & Related papers (2022-02-18T15:00:08Z)
- The Schrödinger Bridge between Gaussian Measures has a Closed Form [101.79851806388699]
We focus on the dynamic formulation of OT, also known as the Schrödinger bridge (SB) problem.
In this paper, we provide closed-form expressions for SBs between Gaussian measures.
arXiv Detail & Related papers (2022-02-11T15:59:01Z) - A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z) - Accurate and efficient Simulation of very high-dimensional Neural Mass
Models with distributed-delay Connectome Tensors [0.23453441553817037]
This paper introduces methods that efficiently integrate high-dimensional Neural Mass Models (NMMs) specified by two essential components.
The first is the set of nonlinear Random Differential Equations of the dynamics of each neural mass.
The second is the highly sparse three-dimensional Connectome (CT) that encodes the strength of the connections and the delays of information transfer along the axons of each connection.
arXiv Detail & Related papers (2020-09-16T05:55:17Z) - Schoenberg-Rao distances: Entropy-based and geometry-aware statistical
Hilbert distances [12.729120803225065]
We study a class of statistical Hilbert distances that we term the Schoenberg-Rao distances.
We derive novel closed-form distances between mixtures of Gaussian distributions.
Our method constitutes a practical alternative to Wasserstein distances and we illustrate its efficiency on a broad range of machine learning tasks.
arXiv Detail & Related papers (2020-02-19T18:48:33Z)
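Both the main paper and the kernel mean embedding entry above rely on the kernel trick to compare distributions in an RKHS without explicit feature maps. The best-known instance of this idea is the maximum mean discrepancy (MMD), the RKHS distance between mean embeddings. The following is a minimal sketch (a biased V-statistic estimate with an RBF kernel, chosen here for illustration and not taken from any of the papers listed):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2), evaluated for all pairs of rows
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def mmd2(X, Y, gamma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples X and Y."""
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(3.0, 1.0, size=(200, 2))
print(mmd2(X, X))  # exactly 0 for identical samples
print(mmd2(X, Y))  # clearly positive for shifted distributions
```

A Wasserstein-type metric in an RKHS, as proposed in the main paper, differs from MMD in that it retains the transport (coupling) structure of OT rather than comparing mean embeddings directly.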
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.