The reproducing Stein kernel approach for post-hoc corrected sampling
- URL: http://arxiv.org/abs/2001.09266v2
- Date: Mon, 13 Sep 2021 07:07:16 GMT
- Title: The reproducing Stein kernel approach for post-hoc corrected sampling
- Authors: Liam Hodgkinson, Robert Salomone, Fred Roosta
- Abstract summary: We prove that Stein importance sampling yields consistent estimators for quantities related to a target distribution of interest.
A universal theory of reproducing Stein kernels is established, which enables the construction of kernelized Stein discrepancy on general Polish spaces.
- Score: 11.967340182951464
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stein importance sampling is a widely applicable technique based on
kernelized Stein discrepancy, which corrects the output of approximate sampling
algorithms by reweighting the empirical distribution of the samples. A general
analysis of this technique is conducted for the previously unconsidered setting
where samples are obtained via the simulation of a Markov chain, and applies to
an arbitrary underlying Polish space. We prove that Stein importance sampling
yields consistent estimators for quantities related to a target distribution of
interest by using samples obtained from a geometrically ergodic Markov chain
with a possibly unknown invariant measure that differs from the desired target.
The approach is shown to be valid under conditions that are satisfied for a
large number of unadjusted samplers, and is capable of retaining consistency
when data subsampling is used. Along the way, a universal theory of reproducing
Stein kernels is established, which enables the construction of kernelized
Stein discrepancy on general Polish spaces, and provides sufficient conditions
for kernels to be convergence-determining on such spaces. These results are of
independent interest for the development of future methodology based on
kernelized Stein discrepancies.
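As a concrete illustration of the reweighting step described above, the sketch below draws a short chain from an unadjusted (and hence biased) Langevin sampler targeting a standard Gaussian, builds a Stein kernel matrix from an IMQ base kernel, and computes importance weights by minimizing the kernelized Stein discrepancy over the probability simplex. The Gaussian target, IMQ kernel, step size, and mirror-descent solver are illustrative assumptions and do not reproduce the paper's general Polish-space setting.

```python
# Illustrative sketch only: Stein importance sampling for a standard Gaussian
# target using samples from a deliberately biased unadjusted Langevin chain.
# The IMQ kernel, step size, and mirror-descent solver are arbitrary choices.
import numpy as np


def score(x):
    # Score (gradient of the log density) of the standard Gaussian target N(0, I).
    return -x


def stein_kernel_matrix(X, beta=-0.5, c=1.0):
    """Langevin Stein kernel k_p(x, y) built from the IMQ base kernel
    k(x, y) = (c**2 + ||x - y||**2) ** beta."""
    n, d = X.shape
    S = score(X)                                    # (n, d) scores at the samples
    diff = X[:, None, :] - X[None, :, :]            # (n, n, d) pairwise differences
    sq = np.sum(diff ** 2, axis=-1)                 # (n, n) squared distances
    base = (c ** 2 + sq) ** beta                    # k(x, y)
    grad_x = 2.0 * beta * (c ** 2 + sq)[..., None] ** (beta - 1) * diff   # dk/dx
    grad_y = -grad_x                                                      # dk/dy
    # Trace of the mixed second derivative d^2 k / dx dy for the IMQ kernel.
    trace_term = (-2.0 * beta * d * (c ** 2 + sq) ** (beta - 1)
                  - 4.0 * beta * (beta - 1) * sq * (c ** 2 + sq) ** (beta - 2))
    return (np.einsum('id,jd->ij', S, S) * base
            + np.einsum('id,ijd->ij', S, grad_y)
            + np.einsum('jd,ijd->ij', S, grad_x)
            + trace_term)


def stein_weights(K, n_iter=2000, lr=0.1):
    """Minimize w^T K w over the probability simplex with exponentiated-gradient
    (mirror descent) steps; any small quadratic-programming solver would also do."""
    w = np.full(K.shape[0], 1.0 / K.shape[0])
    for _ in range(n_iter):
        g = 2.0 * K @ w
        w = w * np.exp(-lr * (g - g.min()))   # subtract the min for numerical stability
        w /= w.sum()
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n, step = 2, 300, 0.8                  # coarse step => visibly biased chain
    x, chain = np.zeros(d), []
    for _ in range(n):                        # unadjusted Langevin (no accept/reject)
        x = x + step * score(x) + np.sqrt(2.0 * step) * rng.standard_normal(d)
        chain.append(x.copy())
    X = np.asarray(chain)

    w = stein_weights(stein_kernel_matrix(X))
    f = np.sum(X ** 2, axis=1)                # estimate E[||x||^2]; true value is d = 2
    print("unweighted estimate:      ", f.mean())
    print("Stein-reweighted estimate:", w @ f)
```

Under the conditions established in the paper, the reweighted estimator remains consistent for target expectations even though the chain's invariant measure differs from the target; in this toy run the reweighted estimate of E[||x||^2] should typically lie closer to the true value of 2 than the unweighted chain average.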
Related papers
- From Denoising Score Matching to Langevin Sampling: A Fine-Grained Error Analysis in the Gaussian Setting [25.21429354164613]
We analyze the sampling process in a simple yet representative setting using a Langevin diffusion sampler.
We show that the Wasserstein sampling error can be expressed as a kernel-type norm of the data power spectrum.
arXiv Detail & Related papers (2025-03-14T17:35:00Z)
- Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis [56.442307356162864]
We study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework.
We introduce a discrete-time sampling algorithm in the general state space $[S]^d$ that utilizes score estimators at predefined time points.
Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function.
arXiv Detail & Related papers (2024-10-03T09:07:13Z)
- Stochastic Localization via Iterative Posterior Sampling [2.1383136715042417]
We consider a general localization framework and introduce an explicit class of observation processes, associated with flexible denoising schedules.
We provide a complete methodology, Stochastic Localization via Iterative Posterior Sampling (SLIPS), to obtain approximate samples of this dynamics and, as a byproduct, samples from the target distribution.
We illustrate the benefits and applicability of SLIPS on several benchmarks of multi-modal distributions, including mixtures in increasing dimensions, logistic regression, and a high-dimensional field system from statistical mechanics.
arXiv Detail & Related papers (2024-02-16T15:28:41Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Mean-Square Analysis of Discretized Itô Diffusions for Heavy-tailed Sampling [17.415391025051434]
We analyze the complexity of sampling from a class of heavy-tailed distributions by discretizing a natural class of Itô diffusions associated with weighted Poincaré inequalities.
Based on a mean-square analysis, we establish the iteration complexity for obtaining a sample whose distribution is $\epsilon$-close to the target distribution in the Wasserstein-2 metric.
arXiv Detail & Related papers (2023-03-01T15:16:03Z)
- Unsupervised Learning of Sampling Distributions for Particle Filters [80.6716888175925]
We put forward four methods for learning sampling distributions from observed measurements.
Experiments demonstrate that learned sampling distributions exhibit better performance than designed, minimum-degeneracy sampling distributions.
arXiv Detail & Related papers (2023-02-02T15:50:21Z)
- Variational Autoencoder Kernel Interpretation and Selection for Classification [59.30734371401315]
This work proposed kernel selection approaches for probabilistic classifiers based on features produced by the convolutional encoder of a variational autoencoder.
In the proposed implementation, each latent variable was sampled from the distribution associated with a single kernel of the encoder's last convolutional layer, as an individual distribution was created for each kernel.
Choosing relevant features of the sampled latent variables makes it possible to perform kernel selection, filtering out uninformative features and kernels.
arXiv Detail & Related papers (2022-09-10T17:22:53Z)
- Flow-based sampling in the lattice Schwinger model at criticality [54.48885403692739]
Flow-based algorithms may provide efficient sampling of field distributions for lattice field theory applications.
We provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.
arXiv Detail & Related papers (2022-02-23T19:00:00Z)
- Generalized Kernel Ridge Regression for Causal Inference with Missing-at-Random Sample Selection [3.398662563413433]
I propose kernel ridge regression estimators for nonparametric dose response curves and semiparametric treatment effects.
For the discrete treatment case, I prove root-n consistency, Gaussian approximation, and semiparametric efficiency.
arXiv Detail & Related papers (2021-11-09T17:10:49Z)
- Nested sampling with any prior you like [0.0]
Bijectors trained on samples from a desired prior density provide a general-purpose method for constructing transformations.
We demonstrate the use of trained bijectors in conjunction with nested sampling on a number of examples from cosmology.
arXiv Detail & Related papers (2021-02-24T18:45:13Z)
- Annealed Stein Variational Gradient Descent [4.020523898765405]
Stein variational gradient descent has gained attention in the approximate inference literature for its flexibility and accuracy.
We empirically explore the ability of this method to sample from multi-modal distributions and focus on two important issues: (i) the inability of the particles to escape from local modes and (ii) the inefficacy in reproducing the density of the different regions (a toy sketch of an annealed SVGD update appears after this list).
arXiv Detail & Related papers (2021-01-24T22:18:30Z)
- Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
arXiv Detail & Related papers (2020-05-20T15:01:03Z)
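The annealed Stein variational gradient descent entry above mentions the difficulty of covering multiple modes; the toy sketch below shows a plain SVGD update with a linearly annealed score term on a two-component 1-D Gaussian mixture. The mixture target, RBF kernel with the median-bandwidth heuristic, annealing schedule, and step size are all assumptions made for illustration and are not the authors' exact algorithm.

```python
# Toy sketch of SVGD with a linearly annealed score term on a 1-D Gaussian
# mixture; all settings here are illustrative assumptions, not the paper's method.
import numpy as np


def mixture_score(x, mus=(-4.0, 4.0), sigma=1.0):
    """Score of an equal-weight two-component Gaussian mixture (vectorized)."""
    comps = np.stack([np.exp(-0.5 * ((x - m) / sigma) ** 2) for m in mus])  # (2, n)
    resp = comps / comps.sum(axis=0)                                        # responsibilities
    return sum(r * (m - x) / sigma ** 2 for r, m in zip(resp, mus))


def rbf_kernel(x):
    """RBF kernel matrix, pairwise differences, and median-heuristic bandwidth."""
    diff = x[:, None] - x[None, :]
    h = np.median(np.abs(diff)) ** 2 / np.log(len(x) + 1.0) + 1e-8
    return np.exp(-diff ** 2 / h), diff, h


def annealed_svgd(n_particles=100, n_steps=500, step=0.2, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)             # particles start between the modes
    for t in range(n_steps):
        beta = min(1.0, (t + 1) / (0.5 * n_steps))    # anneal the score weight up to 1
        K, diff, h = rbf_kernel(x)
        drift = K @ (beta * mixture_score(x))         # attractive (score) term
        repulse = (2.0 / h) * (K * diff).sum(axis=1)  # repulsive (kernel-gradient) term
        x = x + step * (drift + repulse) / n_particles
    return x


if __name__ == "__main__":
    particles = annealed_svgd()
    print("fraction of particles on each side:",
          float(np.mean(particles < 0.0)), float(np.mean(particles > 0.0)))
```

Annealing simply downweights the attractive score term early on so that the kernel repulsion can spread the particles before they commit to a mode; this is one simple way to mitigate issue (i) above, not a reconstruction of the paper's method.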
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.