Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy
- URL: http://arxiv.org/abs/2304.14762v3
- Date: Sun, 4 Jun 2023 18:37:26 GMT
- Title: Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy
- Authors: Xing Liu, Andrew B. Duncan, Axel Gandy
- Abstract summary: Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used in goodness-of-fit tests.
We show theoretically and empirically that the KSD test can suffer from low power when the target and the alternative distributions have the same well-separated modes but differ in mixing proportions.
- Score: 3.78967502155084
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used
in goodness-of-fit tests. It can be applied even when the target distribution
has an unknown normalising factor, such as in Bayesian analysis. We show
theoretically and empirically that the KSD test can suffer from low power when
the target and the alternative distributions have the same well-separated modes
but differ in mixing proportions. We propose to perturb the observed sample via
Markov transition kernels, with respect to which the target distribution is
invariant. This allows us to then employ the KSD test on the perturbed sample.
We provide numerical evidence that with suitably chosen transition kernels the
proposed approach can lead to substantially higher power than the KSD test.
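To make the proposal concrete, below is a minimal, self-contained sketch of the perturbed KSD test: compute the Langevin-Stein KSD statistic with an IMQ kernel, perturb the observed sample with a few Metropolis-adjusted Langevin (MALA) transitions, which leave the target invariant, and calibrate with a wild bootstrap. The 1-D Gaussian-mixture target, finite-difference scores, kernel parameters, and the choice of MALA are illustrative assumptions, not the authors' implementation; the paper stresses that the power gain depends on choosing suitable transition kernels.

```python
# Minimal sketch of the perturbed KSD test (illustrative choices throughout,
# not the authors' code). Target: equal-weight Gaussian mixture at -4 and +4,
# known up to a constant. Kernel: IMQ. Perturbation: MALA, which leaves the
# target invariant. Calibration: wild bootstrap with Rademacher weights.
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):            # unnormalised log-density of the target
    return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

def score(x, eps=1e-5):  # d/dx log p via central differences (keeps sketch short)
    return (log_p(x + eps) - log_p(x - eps)) / (2.0 * eps)

def ksd_gram(x, c=1.0, beta=0.5):
    """Gram matrix of the Langevin-Stein kernel with an IMQ base kernel."""
    s = score(x)
    d = x[:, None] - x[None, :]
    base = c ** 2 + d ** 2
    k = base ** -beta
    gx = -2.0 * beta * d * base ** -(beta + 1)    # d/dx k(x, y)
    gy = -gx                                      # d/dy k(x, y)
    tr = 2.0 * beta * base ** -(beta + 1) \
        - 4.0 * beta * (beta + 1) * d ** 2 * base ** -(beta + 2)
    return s[:, None] * s[None, :] * k + s[:, None] * gy + s[None, :] * gx + tr

def mala_step(x, step=0.5):
    """One MALA transition per point; the target stays invariant."""
    sx = score(x)
    prop = x + step * sx + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    log_q_fwd = -((prop - x - step * sx) ** 2) / (4.0 * step)
    log_q_bwd = -((x - prop - step * score(prop)) ** 2) / (4.0 * step)
    log_acc = log_p(prop) + log_q_bwd - log_p(x) - log_q_fwd
    return np.where(np.log(rng.uniform(size=x.shape)) < log_acc, prop, x)

def perturbed_ksd_test(x, n_steps=20, n_boot=500):
    for _ in range(n_steps):        # perturb the sample before testing
        x = mala_step(x)
    h = ksd_gram(x)
    stat = h.mean()                 # KSD V-statistic
    n = len(x)
    boots = np.array([(w[:, None] * w[None, :] * h).mean()
                      for w in rng.choice([-1.0, 1.0], size=(n_boot, n))])
    return stat, float((boots >= stat).mean())   # statistic, bootstrap p-value

# Alternative with the right modes but wrong mixing proportions (0.9 / 0.1):
comp = rng.uniform(size=300) < 0.9
sample = np.where(comp, -4.0, 4.0) + rng.standard_normal(300)
print(perturbed_ksd_test(sample))
```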
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- NETS: A Non-Equilibrium Transport Sampler [15.58993313831079]
We propose an algorithm, termed the Non-Equilibrium Transport Sampler (NETS).
NETS can be viewed as a variant of annealed importance sampling (AIS) based on Jarzynski's equality, in which the annealed Langevin dynamics is augmented with a learned drift term.
We show that this drift is the minimizer of a variety of objective functions, which can all be estimated in an unbiased fashion.
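As a rough illustration of the Jarzynski/AIS mechanism referenced here (a sketch under assumed densities and settings, and deliberately omitting NETS's learned drift), the snippet below anneals a standard normal into an unnormalised bimodal target with unadjusted Langevin steps while accumulating Jarzynski log-weights; the resulting self-normalised weights give consistent expectations under the target.

```python
# Toy Jarzynski / annealed-importance-sampling sketch (not NETS itself, which
# augments the dynamics with a learned drift). Interpolation:
# log p_t = (1 - t) log p0 + t log p1, with t running from 0 to 1.
import numpy as np

rng = np.random.default_rng(1)

def log_p0(x):   # base: standard normal, up to a constant
    return -0.5 * x ** 2

def log_p1(x):   # target: unnormalised bimodal density
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def log_pt(x, t):
    return (1.0 - t) * log_p0(x) + t * log_p1(x)

def grad_log_pt(x, t, eps=1e-5):
    return (log_pt(x + eps, t) - log_pt(x - eps, t)) / (2.0 * eps)

n, steps, dt = 5000, 200, 0.05
x = rng.standard_normal(n)   # exact draws from p0
log_w = np.zeros(n)          # Jarzynski log-weights

for k in range(steps):
    t0, t1 = k / steps, (k + 1) / steps
    # "work" accumulated when the interpolation parameter jumps t0 -> t1
    log_w += log_pt(x, t1) - log_pt(x, t0)
    # unadjusted Langevin step: only approximately p_{t1}-invariant, fine here
    x += dt * grad_log_pt(x, t1) + np.sqrt(2.0 * dt) * rng.standard_normal(n)

w = np.exp(log_w - log_w.max())  # self-normalised importance weights
print("E[x] under p1 (symmetric target, so near 0):", np.sum(w * x) / np.sum(w))
```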
arXiv Detail & Related papers (2024-10-03T17:35:38Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Controlling Moments with Kernel Stein Discrepancies [74.82363458321939]
Kernel Stein discrepancies (KSDs) measure the quality of a distributional approximation.
We first show that standard KSDs used for weak convergence control fail to control moment convergence.
We then provide sufficient conditions under which alternative diffusion KSDs control both moment and weak convergence.
arXiv Detail & Related papers (2022-11-10T08:24:52Z)
- DensePure: Understanding Diffusion Models towards Adversarial Robustness [110.84015494617528]
We analyze the properties of diffusion models and establish the conditions under which they can enhance certified robustness.
We propose a new method, DensePure, designed to improve the certified robustness of a pretrained model (i.e., a classifier).
We show that this robust region is a union of multiple convex sets, and is potentially much larger than the robust regions identified in previous works.
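The summary above names the method but not its mechanism; a common reading of the DensePure recipe (hedged: reconstructed here, not quoted from this abstract) is to denoise a noisy input several times with a diffusion model's stochastic reverse process, classify each denoised copy, and majority-vote. The reverse_diffusion and classifier callables below are hypothetical stand-ins, not a real pretrained model's API.

```python
# Schematic DensePure-style prediction (stand-in components, hypothetical API):
# noise the input, denoise it several times with a stochastic reverse process,
# classify each denoised copy, and return the majority label.
from collections import Counter
import numpy as np

rng = np.random.default_rng(2)

def reverse_diffusion(x_noisy):
    """Stand-in for a diffusion model's stochastic reverse (denoising) process."""
    return x_noisy + 0.1 * rng.standard_normal(x_noisy.shape)

def classifier(x):
    """Stand-in for a pretrained classifier: label by the sign of the mean."""
    return int(x.mean() > 0)

def densepure_predict(x, sigma=0.25, n_runs=11):
    x_noisy = x + sigma * rng.standard_normal(x.shape)  # smoothing noise
    votes = [classifier(reverse_diffusion(x_noisy)) for _ in range(n_runs)]
    return Counter(votes).most_common(1)[0][0]          # majority vote

print(densepure_predict(rng.standard_normal(16) + 0.5))
```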
arXiv Detail & Related papers (2022-11-01T08:18:07Z)
- A kernel Stein test of goodness of fit for sequential models [19.8408003104988]
We propose a goodness-of-fit measure for sequential models; it is an instance of the kernel Stein discrepancy (KSD), which has been used to construct goodness-of-fit tests for unnormalized densities.
We extend the KSD to the variable-dimension setting by identifying appropriate Stein operators, and propose a novel KSD goodness-of-fit test.
Our test is shown to perform well in practice on discrete sequential data benchmarks.
arXiv Detail & Related papers (2022-10-19T17:30:15Z)
- Targeted Separation and Convergence with Kernel Discrepancies [61.973643031360254]
Kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or (ii) control weak convergence to P.
In this article, we derive new sufficient and necessary conditions to ensure (i) and (ii).
For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels.
arXiv Detail & Related papers (2022-09-26T16:41:16Z)
- A Fourier representation of kernel Stein discrepancy with application to Goodness-of-Fit tests for measures on infinite dimensional Hilbert spaces [6.437931786032493]
Kernel Stein discrepancy (KSD) is a kernel-based measure of discrepancy between probability measures.
We provide the first analysis of KSD in the generality of data lying in a separable Hilbert space.
This allows us to prove that KSD can separate measures and thus is valid to use in practice.
arXiv Detail & Related papers (2022-06-09T15:04:18Z)
- KSD Aggregated Goodness-of-fit Test [38.45086141837479]
We introduce a strategy to construct a test, called KSDAgg, which aggregates multiple tests with different kernels.
We provide non-asymptotic guarantees on the power of KSDAgg.
We find that KSDAgg outperforms other state-of-the-art adaptive KSD-based goodness-of-fit testing procedures.
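A simplified sketch of the aggregation idea (assumptions: standard normal target, Gaussian kernels, plain equal-weight Bonferroni correction; KSDAgg's actual correction is a more refined weighted scheme): run a wild-bootstrap KSD test per bandwidth and reject if any single test rejects at the corrected level.

```python
# Simplified KSDAgg-style aggregation over kernel bandwidths (illustrative:
# N(0,1) target, Gaussian kernels, Bonferroni instead of KSDAgg's weighted
# correction).
import numpy as np

rng = np.random.default_rng(3)

def stein_gram(x, bw):
    """Langevin-Stein kernel matrix for an N(0,1) target, Gaussian base kernel."""
    s = -x                                   # score of the standard normal
    d = x[:, None] - x[None, :]
    k = np.exp(-d ** 2 / (2.0 * bw ** 2))
    gx = -d / bw ** 2 * k                    # d/dx k(x, y)
    gy = -gx                                 # d/dy k(x, y)
    tr = (1.0 / bw ** 2 - d ** 2 / bw ** 4) * k
    return s[:, None] * s[None, :] * k + s[:, None] * gy + s[None, :] * gx + tr

def wild_bootstrap_pvalue(h, n_boot=300):
    stat = h.mean()
    n = h.shape[0]
    boots = np.array([(w[:, None] * w[None, :] * h).mean()
                      for w in rng.choice([-1.0, 1.0], size=(n_boot, n))])
    return float((boots >= stat).mean())

def ksd_agg_reject(x, bandwidths=(0.25, 0.5, 1.0, 2.0), level=0.05):
    pvals = [wild_bootstrap_pvalue(stein_gram(x, bw)) for bw in bandwidths]
    return min(pvals) <= level / len(bandwidths)   # equal-weight Bonferroni

sample = 1.4 * rng.standard_normal(250)            # wrong variance under H0
print("reject H0:", ksd_agg_reject(sample))
```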
arXiv Detail & Related papers (2022-02-02T00:33:09Z)
- Generalised Kernel Stein Discrepancy (GKSD): A Unifying Approach for Non-parametric Goodness-of-fit Testing [5.885020100736158]
Non-parametric goodness-of-fit testing procedures based on kernel Stein discrepancies (KSD) are promising approaches to validate general unnormalised distributions.
We propose a unifying framework, the generalised kernel Stein discrepancy (GKSD), to theoretically compare and interpret different Stein operators in performing the KSD-based goodness-of-fit tests.
arXiv Detail & Related papers (2021-06-23T00:44:31Z)
- Stein Variational Inference for Discrete Distributions [70.19352762933259]
We propose a simple yet general framework that transforms discrete distributions to equivalent piecewise continuous distributions.
Our method outperforms traditional algorithms such as Gibbs sampling and discontinuous Hamiltonian Monte Carlo.
We demonstrate that our method provides a promising tool for learning ensembles of binarized neural networks (BNNs).
In addition, such a transform can be straightforwardly employed within a gradient-free kernelized Stein discrepancy to perform goodness-of-fit (GOF) tests on discrete distributions.
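One concrete instance of such a transform (an assumed construction for illustration, not necessarily the paper's exact map): represent a distribution p on {-1, +1} by the piecewise continuous density rho(y) proportional to p(sign(y)) * phi(y) on the real line, where phi is the standard normal density. Pushing y through sign() recovers p exactly, so continuous tools such as SVGD or a gradient-free KSD can operate on rho.

```python
# Represent a binary distribution by a piecewise continuous surrogate and
# check that mapping samples through sign() recovers it (assumed construction).
import numpy as np

rng = np.random.default_rng(4)
p = {-1: 0.3, +1: 0.7}              # target distribution on {-1, +1}

def log_rho(y):                     # rho(y) proportional to p(sign(y)) * exp(-y^2 / 2)
    return np.log(p[+1] if y >= 0 else p[-1]) - 0.5 * y ** 2

# random-walk Metropolis on rho, then push samples through sign()
y, chain = 0.0, []
for _ in range(20000):
    prop = y + rng.normal(scale=1.5)
    if np.log(rng.uniform()) < log_rho(prop) - log_rho(y):
        y = prop
    chain.append(y)

ys = np.array(chain[2000:])         # discard burn-in
print("P(x = +1) approx:", np.mean(ys >= 0))   # should be close to 0.7
```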
arXiv Detail & Related papers (2020-03-01T22:45:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.