Stein Variational Evolution Strategies
- URL: http://arxiv.org/abs/2410.10390v1
- Date: Mon, 14 Oct 2024 11:24:41 GMT
- Title: Stein Variational Evolution Strategies
- Authors: Cornelius V. Braun, Robert T. Lange, Marc Toussaint
- Abstract summary: Stein Variational Gradient Descent (SVGD) is a highly efficient method to sample from an unnormalized probability distribution.
Existing gradient-free versions of SVGD make use of simple Monte Carlo approximations or gradients from surrogate distributions, both with limitations.
We combine SVGD steps with evolution strategy (ES) updates to improve gradient-free Stein variational inference.
- Score: 17.315583101484147
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Stein Variational Gradient Descent (SVGD) is a highly efficient method to sample from an unnormalized probability distribution. However, the SVGD update relies on gradients of the log-density, which may not always be available. Existing gradient-free versions of SVGD make use of simple Monte Carlo approximations or gradients from surrogate distributions, both with limitations. To improve gradient-free Stein variational inference, we combine SVGD steps with evolution strategy (ES) updates. Our results demonstrate that the resulting algorithm generates high-quality samples from unnormalized target densities without requiring gradient information. Compared to prior gradient-free SVGD methods, we find that the integration of the ES update in SVGD significantly improves the performance on multiple challenging benchmark problems.
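The core idea can be sketched in code: run the standard SVGD particle update, but replace the exact score ∇ log p with an evolution-strategy estimate built only from log-density evaluations. The following is an illustrative reconstruction, not the authors' implementation; the RBF kernel with fixed bandwidth, the antithetic ES estimator, and all hyperparameter values are assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(X, h):
    """Pairwise RBF kernel K_ij = exp(-|x_i - x_j|^2 / (2 h^2))."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * h ** 2))

def es_score(log_p, X, sigma=0.1, pop=64, rng=None):
    """Antithetic evolution-strategy estimate of the score grad log p(x).

    log_p must accept arrays of shape (..., d) and return values of
    shape (...). No gradients of log_p are required.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    eps = rng.standard_normal((pop,) + X.shape)            # (pop, n, d)
    diff = log_p(X[None] + sigma * eps) - log_p(X[None] - sigma * eps)
    return np.einsum('pn,pnd->nd', diff, eps) / (2.0 * sigma * pop)

def svgd_es_step(X, log_p, step=0.1, h=1.0, rng=None):
    """One SVGD step with the score term supplied by the ES estimator."""
    n = X.shape[0]
    K = rbf_kernel(X, h)
    score = es_score(log_p, X, rng=rng)
    attract = K @ score                                    # kernel-weighted scores
    # grad_{x_j} k(x_j, x_i) for the RBF kernel: K_ij (x_i - x_j) / h^2
    repulse = (K.sum(axis=1, keepdims=True) * X - K @ X) / h ** 2
    return X + step * (attract + repulse) / n

# Toy usage: sample from an unnormalized 2-D Gaussian without gradients.
rng = np.random.default_rng(0)
mu = np.array([2.0, -1.0])
log_p = lambda Z: -0.5 * np.sum((Z - mu) ** 2, axis=-1)
X = rng.standard_normal((50, 2))
for _ in range(300):
    X = svgd_es_step(X, log_p, rng=rng)
```

The attractive term drives particles toward high-density regions via the estimated score, while the repulsive kernel-gradient term keeps them spread out, which is what distinguishes SVGD from plain ES-based optimization.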
Related papers
- Mipmap-GS: Let Gaussians Deform with Scale-specific Mipmap for Anti-aliasing Rendering [81.88246351984908]
We propose a unified optimization method to make Gaussians adaptive for arbitrary scales.
Inspired by the mipmap technique, we design pseudo ground-truth for the target scale and propose a scale-consistency guidance loss to inject scale information into 3D Gaussians.
Our method outperforms 3DGS in PSNR by an average of 9.25 dB for zoom-in and 10.40 dB for zoom-out.
arXiv Detail & Related papers (2024-08-12T16:49:22Z)
- VividDreamer: Towards High-Fidelity and Efficient Text-to-3D Generation [69.68568248073747]
We propose Pose-dependent Consistency Distillation Sampling (PCDS), a novel yet efficient objective for diffusion-based 3D generation tasks.
PCDS builds a pose-dependent consistency function within diffusion trajectories, which allows true gradients to be approximated with minimal sampling steps.
For efficient generation, we propose a coarse-to-fine optimization strategy, which first utilizes 1-step PCDS to create the basic structure of 3D objects, and then gradually increases PCDS steps to generate fine-grained details.
arXiv Detail & Related papers (2024-06-21T08:21:52Z)
- Long-time asymptotics of noisy SVGD outside the population limit [9.2081159465248]
We study the long-time behavior of a noisy variant of Stein Variational Gradient Descent (SVGD).
In particular, noisy SVGD provably avoids the variance collapse observed for SVGD.
Our approach involves demonstrating that the trajectories of noisy SVGD closely resemble those described by a McKean-Vlasov process.
arXiv Detail & Related papers (2024-06-17T13:00:51Z)
- SteinDreamer: Variance Reduction for Text-to-3D Score Distillation via Stein Identity [70.32101198891465]
We show that gradient estimation in score distillation inherently suffers from high variance.
We propose a more general solution to reduce variance in score distillation, termed Stein Score Distillation (SSD).
We demonstrate that SteinDreamer achieves faster convergence than existing methods due to more stable gradient updates.
arXiv Detail & Related papers (2023-12-31T23:04:25Z)
- Augmented Message Passing Stein Variational Gradient Descent [3.5788754401889014]
We study the isotropy property of finite particles during the convergence process.
All particles tend to cluster around the particle center within a certain range.
Our algorithm achieves satisfactory accuracy and overcomes the variance collapse problem in various benchmark problems.
arXiv Detail & Related papers (2023-05-18T01:13:04Z)
- A stochastic Stein Variational Newton method [7.272730677575111]
We show that stochastic Stein variational Newton (sSVN) is a promising approach to accelerating high-precision Bayesian inference tasks.
We demonstrate the effectiveness of our algorithm on a difficult class of test problems -- the Hybrid Rosenbrock density -- and show that sSVN converges using three orders of magnitude fewer evaluations of the log-likelihood.
arXiv Detail & Related papers (2022-04-19T17:57:36Z)
- Grassmann Stein Variational Gradient Descent [3.644031721554146]
Stein variational gradient descent (SVGD) is a deterministic particle inference algorithm that provides an efficient alternative to Markov chain Monte Carlo.
However, its performance degrades in high dimensions. Recent developments have advocated projecting both the score function and the data onto real lines to sidestep this issue.
We propose Grassmann Stein variational gradient descent (GSVGD) as an alternative approach, which permits projections onto arbitrary dimensional subspaces.
arXiv Detail & Related papers (2022-02-07T15:36:03Z)
- Exploiting Adam-like Optimization Algorithms to Improve the Performance of Convolutional Neural Networks [82.61182037130405]
Stochastic gradient descent (SGD) is the main approach for training deep networks.
In this work, we compare Adam-based variants that differ in how they use the difference between present and past gradients.
We test ensembles of networks and their fusion with a ResNet50 trained with stochastic gradient descent.
arXiv Detail & Related papers (2021-03-26T18:55:08Z)
- Kernel Stein Generative Modeling [68.03537693810972]
Stochastic Gradient Langevin Dynamics (SGLD) demonstrates impressive results with energy-based models on high-dimensional and complex data distributions.
Stein Variational Gradient Descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate a given distribution.
We propose noise conditional kernel SVGD (NCK-SVGD), that works in tandem with the recently introduced Noise Conditional Score Network estimator.
arXiv Detail & Related papers (2020-07-06T21:26:04Z)
- Stein Variational Inference for Discrete Distributions [70.19352762933259]
We propose a simple yet general framework that transforms discrete distributions to equivalent piecewise continuous distributions.
Our method outperforms traditional algorithms such as Gibbs sampling and discontinuous Hamiltonian Monte Carlo.
We demonstrate that our method provides a promising tool for learning ensembles of binarized neural networks (BNNs).
In addition, such a transform can be employed straightforwardly in gradient-free kernelized Stein discrepancy to perform goodness-of-fit (GOF) tests on discrete distributions.
arXiv Detail & Related papers (2020-03-01T22:45:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.