Particle-based Energetic Variational Inference
- URL: http://arxiv.org/abs/2004.06443v4
- Date: Tue, 23 Mar 2021 18:29:58 GMT
- Title: Particle-based Energetic Variational Inference
- Authors: Yiwei Wang, Jiuhai Chen, Chun Liu, Lulu Kang
- Abstract summary: We introduce a new variational inference (VI) framework, called energetic variational inference (EVI).
We derive many existing Particle-based Variational Inference (ParVI) methods, including the popular Stein Variational Gradient Descent (SVGD) approach.
We propose a new particle-based EVI scheme, which performs the particle-based approximation of the density first and then uses the approximated density in the variational procedure.
- Score: 4.079427359693159
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a new variational inference (VI) framework, called energetic
variational inference (EVI). It minimizes the VI objective function based on a
prescribed energy-dissipation law. Using the EVI framework, we can derive many
existing Particle-based Variational Inference (ParVI) methods, including the
popular Stein Variational Gradient Descent (SVGD) approach. More importantly,
many new ParVI schemes can be created under this framework. For illustration,
we propose a new particle-based EVI scheme, which performs the particle-based
approximation of the density first and then uses the approximated density in
the variational procedure, or "Approximation-then-Variation" for short. Thanks
to this order of approximation and variation, the new scheme can maintain the
variational structure at the particle level, and can significantly decrease the
KL-divergence in each iteration. Numerical experiments show the proposed method
outperforms some existing ParVI methods in terms of fidelity to the target
distribution.
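For orientation, below is a minimal NumPy sketch of the standard SVGD update, the ParVI scheme the abstract highlights as a special case of EVI. The RBF kernel, bandwidth h, step size eps, and all function names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel: K[i, j] = exp(-||x_i - x_j||^2 / h).
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / h)

def svgd_step(X, grad_log_p, h=1.0, eps=0.1):
    # One SVGD update (a sketch, not code from the paper):
    #   x_i <- x_i + (eps/n) * sum_j [ k(x_j, x_i) grad log p(x_j)
    #                                  + grad_{x_j} k(x_j, x_i) ]
    n = X.shape[0]
    K = rbf_kernel(X, h)                  # symmetric (n, n)
    scores = grad_log_p(X)                # (n, d) scores at the particles
    drive = K @ scores                    # kernel-smoothed score term
    # For the RBF kernel, grad_{x_j} k(x_j, x_i) = (2/h)(x_i - x_j) k(x_j, x_i),
    # which sums to the repulsion below.
    repulse = (2.0 / h) * (X * K.sum(axis=1, keepdims=True) - K @ X)
    return X + (eps / n) * (drive + repulse)

# Example: particles drifting toward a standard Gaussian target.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(50, 2))     # poorly initialized particles
for _ in range(1000):
    X = svgd_step(X, grad_log_p=lambda X: -X, eps=1.0)  # grad log N(0, I) = -x
print(X.mean(axis=0))                     # close to the target mean [0, 0]
```

The first term pulls particles toward high-density regions of the target; the kernel-gradient term acts as a repulsion that keeps the particle approximation spread out.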
Related papers
- Particle Semi-Implicit Variational Inference [2.555222031881788]
Semi-implicit variational inference (SIVI) enriches the expressiveness of variational families by utilizing a kernel and a mixing distribution.
Existing SIVI methods parameterize the mixing distribution using implicit distributions, leading to intractable variational densities.
We propose a novel method for SIVI called Particle Variational Inference (PVI), which employs empirical measures to approximate the optimal mixing distributions; a minimal sketch of this idea follows this entry.
arXiv Detail & Related papers (2024-06-30T10:21:41Z)
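To make the empirical-mixing-measure idea above concrete, here is a small hypothetical sketch (the Gaussian kernel, bandwidth sigma, and function name are my assumptions, not the paper's): with particles w_1, ..., w_m standing in for the mixing distribution, the semi-implicit variational density becomes a tractable finite mixture.

```python
import numpy as np

def semi_implicit_density(z, W, sigma=0.5):
    # Semi-implicit variational density with an empirical mixing measure:
    #   q(z) = (1/m) * sum_k N(z; w_k, sigma^2 I),
    # i.e. the kernel q(z | w) mixed over the particles W of shape (m, d).
    m, d = W.shape
    sq_dists = np.sum((z[None, :] - W) ** 2, axis=-1)          # (m,)
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    return np.exp(-sq_dists / (2.0 * sigma ** 2)).sum() / (m * norm)

# Unlike an implicit mixing distribution, this density can be
# evaluated exactly at any point z.
rng = np.random.default_rng(1)
W = rng.normal(size=(100, 2))      # particles approximating the mixing law
print(semi_implicit_density(np.zeros(2), W))
```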
- GAD-PVI: A General Accelerated Dynamic-Weight Particle-Based Variational Inference Framework [11.4522103360875]
We propose the first ParVI framework that possesses both accelerated position update and dynamical weight adjustment simultaneously.
GAD-PVI is compatible with different dissimilarity functionals and associated smoothing approaches.
Experiments on both synthetic and real-world data demonstrate the faster convergence and reduced approximation error of GAD-PVI methods.
arXiv Detail & Related papers (2023-12-27T06:31:06Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods [73.35353358543507]
Stochastic Gradient Descent-Ascent (SGDA) is one of the most prominent algorithms for solving min-max optimization and variational inequality problems (VIPs).
In this paper, we propose a unified convergence analysis that covers a large variety of descent-ascent methods.
We develop several new variants of SGDA, such as a new variance-reduced method (L-SVRGDA), new distributed methods with compression (QSGDA, DIANA-SGDA, VR-DIANA-SGDA), and a new method with coordinate randomization (SEGA-SGDA); a minimal sketch of the basic update follows this entry.
arXiv Detail & Related papers (2022-02-15T09:17:39Z)
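As a reference point for the SGDA entry above, here is a minimal sketch of the basic simultaneous descent-ascent step on min_x max_y f(x, y); the toy saddle objective and step size are illustrative, and none of the paper's variants (variance reduction, compression, coordinate randomization) are implemented here.

```python
def sgda_step(x, y, grad_x, grad_y, lr=0.1):
    # Simultaneous SGDA: descend in x, ascend in y, using (possibly
    # stochastic) gradient estimates evaluated at the current point.
    gx, gy = grad_x(x, y), grad_y(x, y)
    return x - lr * gx, y + lr * gy

# Example: the saddle objective f(x, y) = 0.5*x^2 + x*y - 0.5*y^2,
# whose unique saddle point is (0, 0).
x, y = 3.0, -2.0
for _ in range(500):
    x, y = sgda_step(x, y,
                     grad_x=lambda x, y: x + y,
                     grad_y=lambda x, y: x - y)
print(x, y)  # both close to 0
```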
- Variational Inference with Holder Bounds [68.8008396694788]
We present a careful analysis of the thermodynamic variational objective (TVO).
We reveal how the pathological geometry of thermodynamic curves negatively affects the TVO.
This motivates our new VI objectives, named the Hölder bounds, which flatten the thermodynamic curves and promise to achieve a one-step approximation of the exact marginal log-likelihood; the underlying TVO identity is sketched after this entry.
arXiv Detail & Related papers (2021-11-04T15:35:47Z)
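For the thermodynamic curves mentioned in the entry above, here is the TVO identity as it appears in the thermodynamic variational inference literature (standard background, not a result of this paper): the marginal log-likelihood is an integral along a geometric path from the variational distribution to the posterior, and the TVO is its left Riemann-sum discretization, which lower-bounds the integral because the integrand is nondecreasing in beta.

```latex
% Geometric path between q(z|x) and p(x,z), with beta in [0,1]:
%   pi_beta(z) \propto q(z|x)^{1-beta} p(x,z)^{beta}
\log p(x) = \int_0^1 \mathbb{E}_{\pi_\beta}\!\left[\log \frac{p(x,z)}{q(z \mid x)}\right] d\beta,
\qquad
\mathrm{TVO}(K) = \frac{1}{K} \sum_{k=0}^{K-1} \mathbb{E}_{\pi_{\beta_k}}\!\left[\log \frac{p(x,z)}{q(z \mid x)}\right] \le \log p(x).
```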
- An Introduction to Variational Inference [0.0]
In this paper, we introduce the concept of Variational Inference (VI).
VI is a popular method in machine learning that uses optimization techniques to estimate complex probability densities.
We discuss the applications of VI to variational auto-encoders (VAE) and the VAE-Generative Adversarial Network (VAE-GAN); the evidence lower bound that VI optimizes is sketched after this entry.
arXiv Detail & Related papers (2021-08-30T09:40:04Z)
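As background for the introductory entry above, the optimization objective underlying VI is the evidence lower bound (ELBO); this is standard material rather than anything specific to the papers listed here.

```latex
\log p(x) = \underbrace{\mathbb{E}_{q(z)}\!\left[\log \frac{p(x,z)}{q(z)}\right]}_{\mathrm{ELBO}(q)}
+ \mathrm{KL}\!\left(q(z) \,\|\, p(z \mid x)\right) \ge \mathrm{ELBO}(q).
```

Maximizing the ELBO over a variational family is therefore equivalent to minimizing the KL divergence from q(z) to the posterior p(z | x).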
- Loss function based second-order Jensen inequality and its application to particle variational inference [112.58907653042317]
Particle variational inference (PVI) uses an ensemble of models as an empirical approximation for the posterior distribution.
PVI iteratively updates each model with a repulsion force to ensure the diversity of the optimized models.
We derive a novel generalization error bound and show that it can be reduced by enhancing the diversity of models.
arXiv Detail & Related papers (2021-06-09T12:13:51Z)
- A Discrete Variational Derivation of Accelerated Methods in Optimization [68.8204255655161]
We introduce variational integrators, which allow us to derive different methods for optimization.
We derive two families of optimization methods in one-to-one correspondence.
The preservation of symplecticity of autonomous systems occurs here solely on the fibers.
arXiv Detail & Related papers (2021-06-04T20:21:53Z)
- Variational Rejection Particle Filtering [28.03831528555717]
Variational Rejection Particle Filtering (VRPF) leads to novel variational bounds on the marginal likelihood.
We present theoretical properties of the variational bound and demonstrate experiments on various models of sequential data.
arXiv Detail & Related papers (2021-03-29T05:29:58Z)
- Generative Particle Variational Inference via Estimation of Functional Gradients [15.370890881254066]
This work proposes a new method for learning to approximately sample from the posterior distribution.
Our generative ParVI (GPVI) approach maintains the performance of ParVI methods while offering the flexibility of a generative sampler.
arXiv Detail & Related papers (2021-03-01T20:29:41Z)
- Meta-Learning Divergences of Variational Inference [49.164944557174294]
Variational inference (VI) plays an essential role in approximate Bayesian inference.
We propose a meta-learning algorithm to learn the divergence metric suited for the task of interest.
We demonstrate our approach outperforms standard VI on Gaussian mixture distribution approximation.
arXiv Detail & Related papers (2020-07-06T17:43:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.