SVGD as a kernelized Wasserstein gradient flow of the chi-squared
divergence
- URL: http://arxiv.org/abs/2006.02509v1
- Date: Wed, 3 Jun 2020 20:20:21 GMT
- Title: SVGD as a kernelized Wasserstein gradient flow of the chi-squared
divergence
- Authors: Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet
- Abstract summary: Stein Variational Gradient Descent (SVGD) is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport.
We instead view SVGD as the kernelized gradient flow of the chi-squared divergence, which exhibits a strong form of uniform exponential ergodicity under conditions as weak as a Poincaré inequality.
We propose Laplacian Adjusted Wasserstein Gradient Descent (LAWGD) that can be implemented from the spectral decomposition of the Laplacian operator associated with the target density.
- Score: 16.864125490806387
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is
often described as the kernelized gradient flow for the Kullback-Leibler
divergence in the geometry of optimal transport. We introduce a new perspective
on SVGD that instead views SVGD as the (kernelized) gradient flow of the
chi-squared divergence which, we show, exhibits a strong form of uniform
exponential ergodicity under conditions as weak as a Poincaré inequality.
This perspective leads us to propose an alternative to SVGD, called Laplacian
Adjusted Wasserstein Gradient Descent (LAWGD), that can be implemented from the
spectral decomposition of the Laplacian operator associated with the target
density. We show that LAWGD exhibits strong convergence guarantees and good
practical performance.
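For reference, the standard SVGD update that the paper reinterprets can be sketched in a few lines of NumPy. This is a minimal illustration with a fixed-bandwidth RBF kernel; the bandwidth `h` and step size are illustrative choices, not values from the paper:

```python
import numpy as np

def svgd_step(x, score, step=0.2, h=1.0):
    """One SVGD update with the RBF kernel k(x, y) = exp(-||x - y||^2 / h).

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    The first (drift) term pulls particles toward high-density regions of the
    target; the second (kernel-gradient) term repels particles from each other.
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]           # diff[i, j] = x_i - x_j, shape (n, n, d)
    k = np.exp(-np.sum(diff**2, axis=-1) / h)      # kernel matrix, shape (n, n)
    drift = k @ score(x)                           # sum_j k(x_i, x_j) score(x_j)
    repulsion = (2.0 / h) * np.einsum("ij,ijd->id", k, diff)  # sum_j grad_{x_j} k(x_j, x_i)
    return x + step * (drift + repulsion) / n

# Usage: transport particles toward a standard Gaussian, whose score is -x.
rng = np.random.default_rng(0)
particles = rng.normal(5.0, 0.5, size=(100, 1))
for _ in range(1500):
    particles = svgd_step(particles, lambda t: -t)
```

The chi-squared perspective of the paper concerns this same update; LAWGD replaces the fixed RBF kernel with a kernel built from the spectral decomposition of the Laplacian operator associated with the target, which is not sketched here.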
Related papers
- Particle-based Variational Inference with Generalized Wasserstein
Gradient Flow [32.37056212527921]
We propose a ParVI framework, called generalized Wasserstein gradient descent (GWG).
We show that GWG exhibits strong convergence guarantees.
We also provide an adaptive version that automatically chooses the Wasserstein metric to accelerate convergence.
arXiv Detail & Related papers (2023-10-25T10:05:42Z)
- Augmented Message Passing Stein Variational Gradient Descent [3.5788754401889014]
We study the isotropy property of finite particles during the convergence process.
All particles tend to cluster around the particle center within a certain range.
Our algorithm achieves satisfactory accuracy and overcomes the variance collapse problem in various benchmark problems.
arXiv Detail & Related papers (2023-05-18T01:13:04Z)
- Detecting Rotated Objects as Gaussian Distributions and Its 3-D Generalization [81.29406957201458]
Existing detection methods commonly use a parameterized bounding box (BBox) to model and detect (horizontal) objects.
We argue that such a mechanism has fundamental limitations in building an effective regression loss for rotation detection.
We propose to model the rotated objects as Gaussian distributions.
We extend our approach from 2-D to 3-D with a tailored algorithm design to handle the heading estimation.
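Modeling a rotated box as a Gaussian is commonly done by taking the box center as the mean and building the covariance from the box's rotation and half-extents; the sketch below shows this standard construction (the exact parameterization used in the paper may differ):

```python
import numpy as np

def rbox_to_gaussian(cx, cy, w, h, angle):
    """Convert a rotated 2-D box (center, width, height, angle in radians)
    to a Gaussian with mean at the center and covariance
    cov = R diag((w/2)^2, (h/2)^2) R^T, where R is the rotation matrix."""
    r = np.array([[np.cos(angle), -np.sin(angle)],
                  [np.sin(angle),  np.cos(angle)]])
    s = np.diag([(w / 2) ** 2, (h / 2) ** 2])
    return np.array([cx, cy]), r @ s @ r.T

# An axis-aligned 4x2 box maps to a diagonal covariance diag(4, 1);
# rotating the same box by 90 degrees swaps the axes to diag(1, 4).
mean, cov = rbox_to_gaussian(1.0, 2.0, 4.0, 2.0, 0.0)
```

A distance between two such Gaussians (e.g. a Wasserstein distance) then gives a regression loss that varies smoothly with the angle, which is the motivation stated above.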
arXiv Detail & Related papers (2022-09-22T07:50:48Z)
- Grassmann Stein Variational Gradient Descent [3.644031721554146]
Stein variational gradient descent (SVGD) is a deterministic particle inference algorithm that provides an efficient alternative to Markov chain Monte Carlo.
However, SVGD is known to suffer from the curse of dimensionality; recent developments have advocated projecting both the score function and the data onto real lines to sidestep this issue.
We propose Grassmann Stein variational gradient descent (GSVGD) as an alternative approach, which permits projections onto arbitrary dimensional subspaces.
arXiv Detail & Related papers (2022-02-07T15:36:03Z)
- Large-Scale Wasserstein Gradient Flows [84.73670288608025]
We introduce a scalable scheme to approximate Wasserstein gradient flows.
Our approach relies on input convex neural networks (ICNNs) to discretize the JKO steps.
As a result, we can sample from the measure at each step of the gradient flow and compute its density.
arXiv Detail & Related papers (2021-06-01T19:21:48Z)
- Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling [110.88857917726276]
We provide a new convergence analysis of stochastic gradient Langevin dynamics (SGLD) for sampling from a class of distributions that can be non-log-concave.
At the core of our approach is a novel conductance analysis of SGLD using an auxiliary time-reversible Markov Chain.
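For context, a single (stochastic) gradient Langevin step is simple to state; this minimal sketch uses a full gradient of the log-density in place of SGLD's minibatch estimate:

```python
import numpy as np

def sgld_step(theta, grad_log_p, step, rng):
    """One Langevin step:
    theta <- theta + (step/2) * grad log p(theta) + sqrt(step) * N(0, I).

    With the exact gradient this is the unadjusted Langevin algorithm; SGLD
    substitutes a stochastic minibatch estimate for grad_log_p."""
    return (theta
            + 0.5 * step * grad_log_p(theta)
            + np.sqrt(step) * rng.normal(size=theta.shape))

# Usage: sample from a standard Gaussian (grad log p(x) = -x) and
# collect the chain after a burn-in period.
rng = np.random.default_rng(1)
theta = np.array([5.0])
samples = []
for t in range(60000):
    theta = sgld_step(theta, lambda x: -x, 0.01, rng)
    if t >= 10000:
        samples.append(theta[0])
```

Unlike SVGD's deterministic, interacting particle update, each Langevin chain is driven by injected Gaussian noise, which is why the conductance-style analysis mentioned above applies.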
arXiv Detail & Related papers (2020-10-19T15:23:18Z)
- Kernel Stein Generative Modeling [68.03537693810972]
Stochastic Gradient Langevin Dynamics (SGLD) demonstrates impressive results with energy-based models on high-dimensional and complex data distributions.
Stein Variational Gradient Descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate a given distribution.
We propose noise conditional kernel SVGD (NCK-SVGD), that works in tandem with the recently introduced Noise Conditional Score Network estimator.
arXiv Detail & Related papers (2020-07-06T21:26:04Z)
- A Non-Asymptotic Analysis for Stein Variational Gradient Descent [44.30569261307296]
We provide a novel finite time analysis for the Stein Variational Gradient Descent algorithm.
We provide a descent lemma establishing that the algorithm decreases the objective at each iteration.
We also provide a convergence result of the finite particle system corresponding to the practical implementation of SVGD to its population version.
arXiv Detail & Related papers (2020-06-17T12:01:33Z)
- Stein Variational Inference for Discrete Distributions [70.19352762933259]
We propose a simple yet general framework that transforms discrete distributions to equivalent piecewise continuous distributions.
Our method outperforms traditional algorithms such as Gibbs sampling and discontinuous Hamiltonian Monte Carlo.
We demonstrate that our method provides a promising tool for learning ensembles of binarized neural networks (BNNs).
In addition, such transform can be straightforwardly employed in gradient-free kernelized Stein discrepancy to perform goodness-of-fit (GOF) test on discrete distributions.
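One simple way to realize such a discrete-to-continuous transform is a piecewise-constant density whose unit cells carry the discrete probabilities, so that taking the floor of a continuous sample recovers the discrete variable. This is an illustrative construction only, not necessarily the exact map used in the paper:

```python
import numpy as np

def piecewise_density(probs):
    """Return a piecewise-constant density f on [0, K) with f(x) = probs[floor(x)].

    If X has density f, then floor(X) is distributed according to probs,
    so continuous samplers applied to f yield samples of the discrete variable.
    """
    probs = np.asarray(probs, dtype=float)
    k = len(probs)

    def f(x):
        x = np.asarray(x, dtype=float)
        idx = np.clip(np.floor(x).astype(int), 0, k - 1)
        inside = (x >= 0) & (x < k)
        return np.where(inside, probs[idx], 0.0)

    return f

# Usage: a 3-state distribution becomes a step-function density on [0, 3).
f = piecewise_density([0.2, 0.5, 0.3])
```

Because the density is flat inside each cell, its score is zero almost everywhere, which is why practical constructions (as in the paper) smooth the pieces before running gradient-based samplers.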
arXiv Detail & Related papers (2020-03-01T22:45:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.