Heavy-tailed Sampling via Transformed Unadjusted Langevin Algorithm
- URL: http://arxiv.org/abs/2201.08349v1
- Date: Thu, 20 Jan 2022 18:32:41 GMT
- Title: Heavy-tailed Sampling via Transformed Unadjusted Langevin Algorithm
- Authors: Ye He and Krishnakumar Balasubramanian and Murat A. Erdogdu
- Abstract summary: We analyze the oracle complexity of sampling from polynomially decaying heavy-tailed target densities.
The specific class of closed-form transformation maps that we construct are shown to be diffeomorphisms.
- Score: 18.541857410928387
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We analyze the oracle complexity of sampling from polynomially decaying
heavy-tailed target densities based on running the Unadjusted Langevin
Algorithm on certain transformed versions of the target density. The specific
class of closed-form transformation maps that we construct are shown to be
diffeomorphisms, and are particularly suited for developing efficient
diffusion-based samplers. We characterize the precise class of heavy-tailed
densities for which polynomial-order oracle complexities (in dimension and
inverse target accuracy) could be obtained, and provide illustrative examples.
We highlight the relationship between our assumptions and functional
inequalities (super and weak Poincaré inequalities) based on non-local
Dirichlet forms defined via fractional Laplacian operators, used to
characterize the heavy-tailed equilibrium densities of certain stable-driven
stochastic differential equations.
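The transformation idea can be illustrated with a minimal one-dimensional sketch. The transform below (x = sinh(y)) and the target are illustrative assumptions, not the paper's construction: pulling the heavy-tailed target pi(x) ∝ (1 + x^2)^(-2) back through the diffeomorphism yields the light-tailed density cosh(y)^(-3), whose log-gradient is bounded and Lipschitz, so plain ULA applies; samples are then mapped back through sinh.

```python
import numpy as np

rng = np.random.default_rng(0)

# Heavy-tailed target: pi(x) proportional to (1 + x^2)^(-2).
# Illustrative diffeomorphism (not the paper's map): x = h(y) = sinh(y).
# Pulled-back density: rho(y) = pi(sinh y) * cosh(y) = cosh(y)^(-3),
# whose log-gradient, -3*tanh(y), is bounded and Lipschitz -- ULA-friendly.

def grad_log_rho(y: float) -> float:
    return -3.0 * np.tanh(y)

def transformed_ula(n_steps: int = 200_000, step: float = 0.05) -> np.ndarray:
    y = 0.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        # Unadjusted Langevin step on the transformed (light-tailed) density.
        y += step * grad_log_rho(y) + np.sqrt(2.0 * step) * rng.standard_normal()
        xs[k] = np.sinh(y)  # map the iterate back through the diffeomorphism
    return xs

samples = transformed_ula()
```

For this target, P(|X| > 1) = (pi/8 - 1/4)/(pi/2), roughly 0.18, so close to a fifth of the mapped samples should fall outside [-1, 1], showing that the heavy tail is recovered even though ULA only ever sees the light-tailed transformed density.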
Related papers
- Weak Generative Sampler to Efficiently Sample Invariant Distribution of Stochastic Differential Equation [8.67581853745823]
Current deep-learning-based methods solve the stationary Fokker--Planck equation to determine the invariant probability density function in the form of deep neural networks.
We introduce a framework that employs a weak generative sampler (WGS) to directly generate independent and identically distributed (iid) samples.
Our proposed loss function is based on the weak form of the Fokker--Planck equation, integrating normalizing flows to characterize the invariant distribution.
arXiv Detail & Related papers (2024-05-29T16:41:42Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Mean-Square Analysis of Discretized Itô Diffusions for Heavy-tailed
Sampling [17.415391025051434]
We analyze the complexity of sampling from a class of heavy-tailed distributions by discretizing a natural class of Itô diffusions associated with weighted Poincaré inequalities.
Based on a mean-square analysis, we establish the iteration complexity for obtaining a sample whose distribution is $\epsilon$-close to the target distribution in the Wasserstein-2 metric.
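The discretized-diffusion approach of the entry above can be sketched in one dimension under purely illustrative choices (the SDE and target below are assumptions, not the paper's construction): the Itô diffusion dX = -X dt + sqrt(1 + X^2) dW has invariant density proportional to (1 + x^2)^(-2), since b*pi = (1/2) d/dx (sigma^2 * pi) holds with b(x) = -x and sigma^2(x) = 1 + x^2, and its Euler-Maruyama discretization gives a heavy-tailed sampler of this kind.

```python
import numpy as np

rng = np.random.default_rng(1)

# Itô diffusion with multiplicative noise whose invariant density is the
# heavy-tailed pi(x) proportional to (1 + x^2)^(-2):
#     dX = -X dt + sqrt(1 + X^2) dW.
# Unlike Langevin dynamics with constant noise, the state-dependent
# diffusion coefficient grows with |x|, letting the chain reach the tails.

def euler_maruyama(n_steps: int = 200_000, step: float = 0.01) -> np.ndarray:
    x = 0.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        # One Euler-Maruyama step: drift term plus state-scaled Gaussian noise.
        x = x - step * x + np.sqrt(step * (1.0 + x * x)) * rng.standard_normal()
        xs[k] = x
    return xs

samples = euler_maruyama()
```

The step size here is a fixed illustrative value; the entry's mean-square analysis is precisely about how this discretization step must scale to reach a prescribed Wasserstein-2 accuracy.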
arXiv Detail & Related papers (2023-03-01T15:16:03Z) - Variational Monte Carlo Approach to Partial Differential Equations with
Neural Networks [0.0]
We develop a variational approach for solving partial differential equations governing the evolution of high dimensional probability distributions.
Our approach naturally works on the unbounded continuous domain and encodes the full probability density function through its variational parameters.
For the considered benchmark cases we observe excellent agreement with numerical solutions as well as analytical solutions in regimes inaccessible to traditional computational approaches.
arXiv Detail & Related papers (2022-06-04T07:36:35Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms using both maximization of the log-likelihood and optimization of the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z) - Learning Equivariant Energy Based Models with Equivariant Stein
Variational Gradient Descent [80.73580820014242]
We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.
We first introduce Equivariant Stein Variational Gradient Descent algorithm -- an equivariant sampling method based on Stein's identity for sampling from densities with symmetries.
We propose new ways of improving and scaling up training of energy based models.
arXiv Detail & Related papers (2021-06-15T01:35:17Z) - Local and Global Uniform Convexity Conditions [88.3673525964507]
We review various characterizations of uniform convexity and smoothness on norm balls in finite-dimensional spaces.
We establish local versions of these conditions to provide sharper insights on a recent body of complexity results in learning theory, online learning, and offline optimization.
We conclude with practical examples in optimization and machine learning where leveraging these conditions and localized assumptions leads to new complexity results.
arXiv Detail & Related papers (2021-02-09T21:09:53Z) - Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.