Kernel Stein Generative Modeling
- URL: http://arxiv.org/abs/2007.03074v1
- Date: Mon, 6 Jul 2020 21:26:04 GMT
- Title: Kernel Stein Generative Modeling
- Authors: Wei-Cheng Chang, Chun-Liang Li, Youssef Mroueh, Yiming Yang
- Abstract summary: Stochastic Gradient Langevin Dynamics (SGLD) demonstrates impressive results with energy-based models on high-dimensional and complex data distributions.
Stein Variational Gradient Descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate a given distribution.
We propose noise conditional kernel SVGD (NCK-SVGD), that works in tandem with the recently introduced Noise Conditional Score Network estimator.
- Score: 68.03537693810972
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We are interested in gradient-based Explicit Generative Modeling where
samples can be derived from iterative gradient updates based on an estimate of
the score function of the data distribution. Recent advances in Stochastic
Gradient Langevin Dynamics (SGLD) demonstrate impressive results with
energy-based models on high-dimensional and complex data distributions. Stein
Variational Gradient Descent (SVGD) is a deterministic sampling algorithm that
iteratively transports a set of particles to approximate a given distribution,
based on functional gradient descent that decreases the KL divergence. SVGD has
shown promising results on several Bayesian inference applications. However,
applying SVGD to high-dimensional problems remains under-explored. The goal of
this work is to study high-dimensional inference with SVGD. We first identify
key challenges in practical kernel SVGD inference in high dimensions. We propose
noise conditional kernel SVGD (NCK-SVGD), that works in tandem with the
recently introduced Noise Conditional Score Network estimator. NCK is crucial
for successful inference with SVGD in high dimension, as it adapts the kernel
to the noise level of the score estimate. As we anneal the noise, NCK-SVGD
targets the real data distribution. We then extend the annealed SVGD with an
entropic regularization. We show that this offers a flexible control between
sample quality and diversity, and verify it empirically by precision and recall
evaluations. NCK-SVGD produces samples of quality comparable to those from GANs
and annealed SGLD on computer vision benchmarks, including MNIST and CIFAR-10.
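For readers new to the update rule referenced in the abstract, here is a minimal NumPy sketch of one SVGD step with an RBF kernel and the median-heuristic bandwidth (a common default; this is an illustration, not the authors' released code):

```python
import numpy as np

def svgd_step(X, score, stepsize=1e-2):
    """One SVGD update transporting particles X (shape (n, d)) toward the target.

    score(X) must return grad log p(x) for each particle, shape (n, d).
    Uses an RBF kernel with the median-heuristic bandwidth.
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]            # (n, n, d): x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)           # (n, n) squared distances
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8   # median-heuristic bandwidth
    K = np.exp(-sq_dists / h)                        # k(x_i, x_j)
    drift = K @ score(X)                             # attractive: kernel-weighted scores
    # Repulsive term: sum_i grad_{x_i} k(x_i, x_j) = (2/h) * sum_i K_ij (x_j - x_i)
    repulsion = (2.0 / h) * (K.sum(axis=1, keepdims=True) * X - K @ X)
    return X + stepsize / n * (drift + repulsion)
```

For a standard Gaussian target, `score = lambda X: -X`; iterating `svgd_step` spreads the particles to cover N(0, I) rather than collapsing them onto the mode.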
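Building on that step, the annealed, entropy-regularized procedure the abstract describes can be sketched as below. This paraphrases the abstract only: `score_net(X, sigma)` stands in for a pretrained Noise Conditional Score Network, and both the noise-conditional bandwidth and the use of `alpha` to rescale the repulsive term are assumptions about the method's form, not the released implementation:

```python
def annealed_nck_svgd(X, score_net, sigmas, alpha=1.0, steps=100, lr=1e-3):
    """Annealed NCK-SVGD sketch: run SVGD at each noise level (large -> small),
    adapting the kernel to sigma; alpha trades sample quality against diversity."""
    n = X.shape[0]
    for sigma in sigmas:                              # annealing schedule
        for _ in range(steps):
            sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
            # Noise-conditional kernel: bandwidth shrinks with the noise level
            # (assumed form; the paper adapts the kernel to sigma).
            h = sigma ** 2 * np.median(sq) / np.log(n + 1) + 1e-8
            K = np.exp(-sq / h)
            drift = K @ score_net(X, sigma)
            repulsion = (2.0 / h) * (K.sum(axis=1, keepdims=True) * X - K @ X)
            # alpha > 1 strengthens repulsion (more diversity, higher recall);
            # alpha < 1 favors per-sample quality (higher precision).
            X = X + lr / n * (drift + alpha * repulsion)
    return X
```

As sigma is annealed toward zero, the smoothed targets approach the data distribution, mirroring the annealed SGLD samplers the abstract compares against.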
Related papers
- Stein Variational Evolution Strategies [17.315583101484147]
Stein Variational Gradient Descent (SVGD) is a highly efficient method to sample from an unnormalized probability distribution.
Existing gradient-free versions of SVGD make use of simple Monte Carlo approximations or gradients from surrogate distributions, both with limitations.
We combine SVGD steps with evolution strategy (ES) updates to improve gradient-free Stein variational inference.
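The summary above is terse, so here is one plausible gradient-free instantiation, with the caveat that the antithetic ES score estimator below is our assumption rather than the paper's exact algorithm:

```python
import numpy as np

def es_score(log_p, X, sigma=0.1, n_samples=32, rng=None):
    """Evolution-strategies estimate of grad log p at each particle:
    grad log p(x) ~ E[(log p(x + sigma*eps) - log p(x - sigma*eps)) * eps] / (2*sigma).
    Only evaluations of log_p are needed, no gradients."""
    rng = rng if rng is not None else np.random.default_rng(0)
    grad = np.zeros_like(X)
    for _ in range(n_samples):
        eps = rng.standard_normal(X.shape)
        grad += (log_p(X + sigma * eps) - log_p(X - sigma * eps))[:, None] * eps
    return grad / (2.0 * sigma * n_samples)
```

Plugging `lambda X: es_score(log_p, X)` into an SVGD step (see the sketch after the abstract) yields a sampler that never differentiates the target.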
arXiv Detail & Related papers (2024-10-14T11:24:41Z)
- ShapeSplat: A Large-scale Dataset of Gaussian Splats and Their Self-Supervised Pretraining [104.34751911174196]
We build a large-scale dataset of 3D Gaussian Splats (3DGS) using the ShapeNet and ModelNet datasets.
Our dataset ShapeSplat consists of 65K objects from 87 unique categories.
We introduce Gaussian-MAE, which highlights the unique benefits of representation learning from Gaussian parameters.
arXiv Detail & Related papers (2024-08-20T14:49:14Z)
- Long-time asymptotics of noisy SVGD outside the population limit [9.2081159465248]
We study the long-time behavior of a noisy variant of Stein Variational Gradient Descent (SVGD).
In particular, noisy SVGD provably avoids the variance collapse observed for SVGD.
Our approach involves demonstrating that the trajectories of noisy SVGD closely resemble those described by a McKean-Vlasov process.
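As a rough illustration of "noisy SVGD" (our sketch, not the cited paper's construction), one natural variant injects Langevin-style Gaussian noise into each deterministic step:

```python
import numpy as np

def noisy_svgd_step(X, score, stepsize=1e-2, rng=None):
    """One noisy SVGD step: deterministic SVGD transport plus injected noise.
    The sqrt(2 * stepsize) scaling is a Langevin-style assumption."""
    rng = rng if rng is not None else np.random.default_rng(0)
    X = svgd_step(X, score, stepsize)   # deterministic update from the earlier sketch
    return X + np.sqrt(2.0 * stepsize) * rng.standard_normal(X.shape)
```

The injected noise keeps the particle cloud from contracting onto too few modes, which is the variance-collapse behavior the summary refers to.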
arXiv Detail & Related papers (2024-06-17T13:00:51Z)
- Consistent3D: Towards Consistent High-Fidelity Text-to-3D Generation with Deterministic Sampling Prior [87.55592645191122]
Score distillation sampling (SDS) and its variants have greatly boosted the development of text-to-3D generation, but they remain vulnerable to geometry collapse and poor textures.
We propose a novel and effective "Consistent3D" method that explores the ODE deterministic sampling prior for text-to-3D generation.
Experimental results show the efficacy of our Consistent3D in generating high-fidelity and diverse 3D objects and large-scale scenes.
arXiv Detail & Related papers (2024-01-17T08:32:07Z)
- Provably Fast Finite Particle Variants of SVGD via Virtual Particle Stochastic Approximation [9.065034043031668]
Stein Variational Gradient Descent (SVGD) is a popular variational inference algorithm that simulates an interacting particle system to approximately sample from a target distribution.
We introduce the notion of virtual particles and develop novel approximations of population-limit dynamics in the space of probability measures.
We show that the $n$ particles output by VP-SVGD and GB-SVGD, run for $T$ steps with batch-size $K$, are as good as i.i.d. samples from a distribution whose Kernel Stein Discrepancy to the target is at most $O\left(\tfrac{\cdot}{\cdot}\right)$.
arXiv Detail & Related papers (2023-05-27T19:21:28Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- A stochastic Stein Variational Newton method [7.272730677575111]
We show that stochastic Stein variational Newton (sSVN) is a promising approach to accelerating high-precision Bayesian inference tasks.
We demonstrate the effectiveness of our algorithm on a difficult class of test problems -- the Hybrid Rosenbrock density -- and show that sSVN converges using three orders of magnitude fewer evaluations of the log-likelihood.
arXiv Detail & Related papers (2022-04-19T17:57:36Z)
- Grassmann Stein Variational Gradient Descent [3.644031721554146]
Stein variational gradient descent (SVGD) is a deterministic particle inference algorithm that provides an efficient alternative to Markov chain Monte Carlo.
Recent developments have advocated projecting both the score function and the data onto real lines to sidestep SVGD's deteriorating performance in high dimensions.
We propose Grassmann Stein variational gradient descent (GSVGD) as an alternative approach, which permits projections onto arbitrary dimensional subspaces.
arXiv Detail & Related papers (2022-02-07T15:36:03Z)
- Uncertainty Inspired RGB-D Saliency Detection [70.50583438784571]
We propose the first framework to employ uncertainty for RGB-D saliency detection by learning from the data labeling process.
Inspired by the saliency data labeling process, we propose a generative architecture to achieve probabilistic RGB-D saliency detection.
Results on six challenging RGB-D benchmark datasets show our approach's superior performance in learning the distribution of saliency maps.
arXiv Detail & Related papers (2020-09-07T13:01:45Z)
- Stein Variational Inference for Discrete Distributions [70.19352762933259]
We propose a simple yet general framework that transforms discrete distributions to equivalent piecewise continuous distributions.
Our method outperforms traditional algorithms such as Gibbs sampling and discontinuous Hamiltonian Monte Carlo.
We demonstrate that our method provides a promising tool for learning ensembles of binarized neural networks (BNNs).
In addition, such a transform can be straightforwardly employed in gradient-free kernelized Stein discrepancy to perform goodness-of-fit (GOF) tests on discrete distributions.
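To make the transform concrete, here is a hedged sketch of the idea (our illustration; the paper's construction is more general): couple each binary z with a continuous x via z = 1[x > 0] under a symmetric base density, so each orthant of R^d carries mass proportional to p(z), and thresholding a sample x recovers z ~ p:

```python
import numpy as np

def discrete_to_continuous(p_discrete):
    """Return an unnormalized piecewise-continuous density on R^d whose
    pushforward under z = 1[x > 0] is the discrete target p_discrete(z).
    The standard-normal base density is our choice; any symmetric base works."""
    def p_tilde(X):                                  # X: (n, d)
        Z = (X > 0).astype(int)
        log_base = -0.5 * np.sum(X ** 2, axis=-1)    # log N(x; 0, I), up to a constant
        return p_discrete(Z) * np.exp(log_base)
    return p_tilde
```

Inside each orthant the naive score of `p_tilde` ignores `p_discrete`, which is why the transform is paired with gradient-free SVGD (and, as the summary notes, a gradient-free kernelized Stein discrepancy for GOF testing) rather than the standard gradient-based update.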
arXiv Detail & Related papers (2020-03-01T22:45:41Z)