SVGD as a kernelized Wasserstein gradient flow of the chi-squared
divergence
- URL: http://arxiv.org/abs/2006.02509v1
- Date: Wed, 3 Jun 2020 20:20:21 GMT
- Title: SVGD as a kernelized Wasserstein gradient flow of the chi-squared
divergence
- Authors: Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet
- Abstract summary: Stein Variational Gradient Descent (SVGD) is often described as the kernelized gradient flow of the Kullback-Leibler divergence in the geometry of optimal transport.
We instead view SVGD as the kernelized gradient flow of the chi-squared divergence, which we show exhibits a strong form of uniform exponential ergodicity under conditions as weak as a Poincaré inequality.
We propose Laplacian Adjusted Wasserstein Gradient Descent (LAWGD), which can be implemented from the spectral decomposition of the Laplacian operator associated with the target density.
- Score: 16.864125490806387
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is
often described as the kernelized gradient flow for the Kullback-Leibler
divergence in the geometry of optimal transport. We introduce a new perspective
on SVGD that instead views SVGD as the (kernelized) gradient flow of the
chi-squared divergence which, we show, exhibits a strong form of uniform
exponential ergodicity under conditions as weak as a Poincaré inequality.
This perspective leads us to propose an alternative to SVGD, called Laplacian
Adjusted Wasserstein Gradient Descent (LAWGD), that can be implemented from the
spectral decomposition of the Laplacian operator associated with the target
density. We show that LAWGD exhibits strong convergence guarantees and good
practical performance.
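As a concrete illustration of the two algorithms in the abstract, here is a minimal sketch (not the authors' code): the standard SVGD update with an RBF kernel, and a hedged LAWGD-style update for a one-dimensional standard Gaussian target, where the generator L = -d²/dx² + x d/dx has the probabilists' Hermite polynomials He_n as eigenfunctions with eigenvalues n. The truncation level, bandwidth, and step sizes are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def svgd_step(x, step=0.1, h=0.5):
    """One SVGD update with the RBF kernel k(a, b) = exp(-(a-b)^2 / (2h))
    for a 1-D standard Gaussian target, whose score is -x."""
    diff = x[:, None] - x[None, :]            # diff[j, i] = x_j - x_i
    k = np.exp(-diff**2 / (2.0 * h))          # kernel matrix
    grad_k = -diff / h * k                    # d/dx_j k(x_j, x_i)
    phi = (k @ (-x) + grad_k.sum(axis=0)) / len(x)
    return x + step * phi

def lawgd_step(x, step=0.05, n_modes=8):
    """Hedged LAWGD-style update for the same target. Here the spectral
    kernel is k_L(x, y) = sum_{n>=1} He_n(x) He_n(y) / (n * n!), so that
    d/dx k_L(x, y) = sum_{n>=1} He_{n-1}(x) He_n(y) / n!. The truncation
    at n_modes is an illustration choice, not taken from the paper."""
    V = hermevander(x, n_modes)               # V[i, m] = He_m(x_i)
    fact = np.cumprod(np.arange(1, n_modes + 1)).astype(float)   # n!
    grad_k = (V[:, :-1] / fact) @ V[:, 1:].T  # (i, j): d/dx_i k_L(x_i, x_j)
    return x - step * grad_k.mean(axis=1)

rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, size=50)           # initial particles
for _ in range(500):
    x = svgd_step(x)                          # or: x = lawgd_step(x)
print(x.mean(), x.var())                      # roughly 0 and 1
```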
Related papers
- Accelerated Stein Variational Gradient Flow [2.384873896423002]
Stein variational gradient descent (SVGD) is a kernel-based particle method for sampling from a target distribution.
We introduce ASVGD, an accelerated SVGD, based on an accelerated gradient flow in a metric space of probability densities.
We derive a momentum-based discrete-time sampling algorithm, which evolves a set of particles deterministically.
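The summary does not spell out the discrete-time update; below is a minimal sketch of one natural momentum variant for a 1-D standard Gaussian target, where particles carry a velocity that accumulates the SVGD direction. The damping coefficient and step size are illustrative assumptions, not the schedule from the ASVGD paper.

```python
import numpy as np

def svgd_direction(x, h=0.5):
    """SVGD direction with an RBF kernel for a 1-D standard Gaussian target."""
    diff = x[:, None] - x[None, :]
    k = np.exp(-diff**2 / (2.0 * h))
    return (k @ (-x) + (-diff / h * k).sum(axis=0)) / len(x)

def asvgd_run(x, step=0.05, damping=0.9, iters=500):
    """Hedged momentum variant: deterministic particle moves driven by an
    exponentially averaged SVGD direction (heavy-ball style)."""
    v = np.zeros_like(x)
    for _ in range(iters):
        v = damping * v + svgd_direction(x)   # accumulate the direction
        x = x + step * v                      # move particles deterministically
    return x
```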
arXiv Detail & Related papers (2025-03-30T14:37:21Z)
- BG-Triangle: Bézier Gaussian Triangle for 3D Vectorization and Rendering [60.240908644910874]
Differentiable rendering enables efficient optimization by allowing gradients to be computed through the rendering process.
Existing solutions approximate or re-formulate traditional rendering operations using smooth, probabilistic proxies.
We present a novel hybrid representation that combines Bézier triangle-based vector graphics primitives with Gaussian-based probabilistic models.
arXiv Detail & Related papers (2025-03-18T06:53:52Z)
- Hellinger-Kantorovich Gradient Flows: Global Exponential Decay of Entropy Functionals [52.154685604660465]
We investigate a family of gradient flows of positive and probability measures, focusing on the Hellinger-Kantorovich (HK) geometry.
A central contribution is a complete characterization of global exponential decay behaviors of entropy functionals under Otto-Wasserstein and Hellinger-type gradient flows.
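For orientation, the classical Otto-Wasserstein special case of such a decay statement (not the paper's Hellinger-Kantorovich result): along the Wasserstein gradient flow of relative entropy, a log-Sobolev inequality with constant λ gives

```latex
\frac{d}{dt}\,\mathrm{KL}(\mu_t \,\|\, \pi) = -\,I(\mu_t \,\|\, \pi)
\;\le\; -2\lambda\,\mathrm{KL}(\mu_t \,\|\, \pi)
\quad\Longrightarrow\quad
\mathrm{KL}(\mu_t \,\|\, \pi) \;\le\; e^{-2\lambda t}\,\mathrm{KL}(\mu_0 \,\|\, \pi),
```

where I denotes the relative Fisher information.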
arXiv Detail & Related papers (2025-01-28T16:17:09Z)
- Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, a novel approach to denoising 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z)
- Particle-based Variational Inference with Generalized Wasserstein Gradient Flow [32.37056212527921]
We propose a ParVI framework called generalized Wasserstein gradient descent (GWG), which we show enjoys strong convergence guarantees.
We also provide an adaptive version that automatically chooses the Wasserstein metric to accelerate convergence.
arXiv Detail & Related papers (2023-10-25T10:05:42Z)
- Augmented Message Passing Stein Variational Gradient Descent [3.5788754401889014]
We study the isotropy property of finite particles during the convergence process, observing that all particles tend to cluster around the particle center within a certain range.
Our algorithm achieves satisfactory accuracy and overcomes the variance collapse problem in various benchmark problems.
arXiv Detail & Related papers (2023-05-18T01:13:04Z)
- Grassmann Stein Variational Gradient Descent [3.644031721554146]
Stein variational gradient descent (SVGD) is a deterministic particle inference algorithm that provides an efficient alternative to Markov chain Monte Carlo.
Recent developments have advocated projecting both the score function and the data onto real lines to sidestep SVGD's deteriorating performance in high dimensions.
We propose Grassmann Stein variational gradient descent (GSVGD) as an alternative approach, which permits projections onto arbitrary dimensional subspaces.
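A toy sketch of the projected-kernel idea, assuming a fixed projection matrix A with orthonormal columns; the genuine GSVGD also evolves A on the Grassmann manifold, which is omitted here, and the exact coupling of score and kernel may differ.

```python
import numpy as np

def projected_svgd_step(x, A, score, step=0.1, h=1.0):
    """SVGD-style update computed in the subspace spanned by A's columns.
    x: (N, d) particles; A: (d, m) orthonormal columns; score: (N, d) -> (N, d)."""
    z = x @ A                                    # projected particles, (N, m)
    s = score(x) @ A                             # projected score
    diff = z[:, None, :] - z[None, :, :]         # diff[a, b] = z_a - z_b
    k = np.exp(-(diff**2).sum(-1) / (2.0 * h))   # RBF kernel on the subspace
    grad_k = (-diff / h) * k[..., None]          # d/dz_a k(z_a, z_b)
    phi_z = (k @ s + grad_k.sum(axis=0)) / len(x)
    return x + step * phi_z @ A.T                # lift the update back to R^d

# Usage sketch: Gaussian target in d = 10, projecting onto 2 coordinates.
# With A fixed, only directions inside the subspace are ever updated.
d, m = 10, 2
A = np.eye(d)[:, :m]
x = np.random.default_rng(0).standard_normal((100, d))
for _ in range(200):
    x = projected_svgd_step(x, A, score=lambda y: -y)
```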
arXiv Detail & Related papers (2022-02-07T15:36:03Z)
- Large-Scale Wasserstein Gradient Flows [84.73670288608025]
We introduce a scalable scheme to approximate Wasserstein gradient flows.
Our approach relies on input convex neural networks (ICNNs) to discretize the JKO steps.
As a result, we can sample from the measure at each step of the gradient diffusion and compute its density.
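The JKO steps referred to here are the standard minimizing-movement scheme for the Wasserstein gradient flow of a functional F (a textbook formula, quoted for context; the ICNN parameterizes the transport map realizing each step):

```latex
\mu_{k+1} \;=\; \operatorname*{arg\,min}_{\mu}\;\Big\{\, F(\mu) \;+\; \frac{1}{2\tau}\, W_2^2(\mu, \mu_k) \,\Big\}.
```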
arXiv Detail & Related papers (2021-06-01T19:21:48Z)
- Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling [110.88857917726276]
We provide a new convergence analysis of stochastic gradient Langevin dynamics (SGLD) for sampling from a class of distributions that can be non-log-concave.
At the core of our approach is a novel conductance analysis of SGLD using an auxiliary time-reversible Markov Chain.
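For reference, the basic SGLD iteration itself is standard; the sketch below uses the full gradient as a stand-in for a stochastic one and does not reproduce the paper's conductance analysis.

```python
import numpy as np

def sgld(grad_U, x0, step=1e-3, iters=10_000, rng=None):
    """Langevin iteration x <- x - step * grad_U(x) + sqrt(2 * step) * noise,
    targeting the density proportional to exp(-U)."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Example: a non-log-concave double well, U(x) = (x^2 - 1)^2.
sample = sgld(lambda x: 4.0 * x * (x**2 - 1.0), x0=np.zeros(1))
```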
arXiv Detail & Related papers (2020-10-19T15:23:18Z)
- Kernel Stein Generative Modeling [68.03537693810972]
Stochastic Gradient Langevin Dynamics (SGLD) demonstrates impressive results with energy-based models on high-dimensional and complex data distributions.
Stein Variational Gradient Descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate a given distribution.
We propose noise conditional kernel SVGD (NCK-SVGD), that works in tandem with the recently introduced Noise Conditional Score Network estimator.
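A hedged sketch of the annealing idea: run SVGD at a decreasing sequence of noise levels, querying a noise-conditional score estimator at each level. The bandwidth and step schedules here, and the placeholder score_net, are illustrative assumptions rather than the paper's architecture.

```python
import numpy as np

def nck_svgd(x, score_net, sigmas, step=0.1, iters_per_level=100):
    """Anneal SVGD through noise levels (largest to smallest), with a
    noise-conditional score score_net(x, sigma) and a bandwidth tied
    to sigma (an illustrative choice)."""
    for sigma in sigmas:
        h = sigma**2                              # hypothetical bandwidth rule
        for _ in range(iters_per_level):
            diff = x[:, None] - x[None, :]
            k = np.exp(-diff**2 / (2.0 * h))
            phi = (k @ score_net(x, sigma) + (-diff / h * k).sum(axis=0)) / len(x)
            x = x + step * sigma**2 * phi         # level-scaled step size
    return x

# Toy score for data ~ N(0, 1) perturbed by N(0, sigma^2): exact, not learned.
score_net = lambda x, sigma: -x / (1.0 + sigma**2)
x = nck_svgd(np.linspace(-4, 4, 100), score_net, sigmas=[2.0, 1.0, 0.5, 0.1])
```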
arXiv Detail & Related papers (2020-07-06T21:26:04Z)
- A Non-Asymptotic Analysis for Stein Variational Gradient Descent [44.30569261307296]
We provide a novel finite-time analysis for the Stein Variational Gradient Descent algorithm.
We provide a descent lemma establishing that the algorithm decreases the objective at each iteration.
We also provide a convergence result of the finite particle system corresponding to the practical implementation of SVGD to its population version.
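Schematically, such a descent lemma takes the form below, where KSD denotes the kernel Stein discrepancy, γ the step size, and c > 0 a constant depending on the kernel and the target; this is only the shape of the statement, not the paper's exact constants or conditions.

```latex
\mathrm{KL}(\mu_{k+1} \,\|\, \pi) \;\le\; \mathrm{KL}(\mu_k \,\|\, \pi) \;-\; c\,\gamma\,\mathrm{KSD}^2(\mu_k \,\|\, \pi).
```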
arXiv Detail & Related papers (2020-06-17T12:01:33Z)
- Stein Variational Inference for Discrete Distributions [70.19352762933259]
We propose a simple yet general framework that transforms discrete distributions to equivalent piecewise continuous distributions.
Our method outperforms traditional algorithms such as Gibbs sampling and discontinuous Hamiltonian Monte Carlo.
We demonstrate that our method provides a promising tool for learning ensembles of binarized neural networks (BNNs).
In addition, such a transform can be straightforwardly employed in a gradient-free kernelized Stein discrepancy to perform goodness-of-fit (GOF) tests on discrete distributions.
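One simple instance of such a transform, sketched for a distribution p on {-1, +1}: spread each atom over a half-line with a Gaussian base density, giving the piecewise continuous target rho(x) = p(sign(x)) * 2 * N(x; 0, 1), and run a gradient-free SVGD step that uses the smooth surrogate N(0, 1) with importance weights w proportional to rho / surrogate. The construction and weights are an illustrative reading of the framework, not the paper's exact recipe.

```python
import numpy as np

def gf_svgd_step(x, p_plus=0.7, step=0.1, h=0.5):
    """Gradient-free SVGD step targeting rho(x) = p(sign(x)) * 2 * N(x; 0, 1)
    with P[+1] = p_plus, using N(0, 1) as the smooth surrogate (score -x)."""
    w = np.where(x > 0, p_plus, 1.0 - p_plus)       # rho / surrogate, up to 2
    w = w / w.sum()                                 # self-normalized weights
    diff = x[:, None] - x[None, :]                  # diff[j, i] = x_j - x_i
    k = np.exp(-diff**2 / (2.0 * h))
    grad_k = -diff / h * k                          # d/dx_j k(x_j, x_i)
    phi = (w * (-x)) @ k + (w[:, None] * grad_k).sum(axis=0)
    return x + step * phi

x = np.random.default_rng(1).standard_normal(200)
for _ in range(300):
    x = gf_svgd_step(x)
print((x > 0).mean())                               # roughly p_plus = 0.7
```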
arXiv Detail & Related papers (2020-03-01T22:45:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.