A kernel Stein test of goodness of fit for sequential models
- URL: http://arxiv.org/abs/2210.10741v3
- Date: Thu, 13 Jul 2023 16:09:53 GMT
- Title: A kernel Stein test of goodness of fit for sequential models
- Authors: Jerome Baum and Heishiro Kanagawa and Arthur Gretton
- Abstract summary: The proposed measure is an instance of the kernel Stein discrepancy (KSD), which has been used to construct goodness-of-fit tests for unnormalized densities.
We extend the KSD to the variable-dimension setting by identifying appropriate Stein operators, and propose a novel KSD goodness-of-fit test.
Our test is shown to perform well in practice on discrete sequential data benchmarks.
- Score: 19.8408003104988
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a goodness-of-fit measure for probability densities modeling
observations with varying dimensionality, such as text documents of differing
lengths or variable-length sequences. The proposed measure is an instance of
the kernel Stein discrepancy (KSD), which has been used to construct
goodness-of-fit tests for unnormalized densities. The KSD is defined by its
Stein operator: current operators used in testing apply to fixed-dimensional
spaces. As our main contribution, we extend the KSD to the variable-dimension
setting by identifying appropriate Stein operators, and propose a novel KSD
goodness-of-fit test. As with the previous variants, the proposed KSD does not
require the density to be normalized, allowing the evaluation of a large class
of models. Our test is shown to perform well in practice on discrete sequential
data benchmarks.
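For orientation, the fixed-dimensional KSD that the paper extends has a standard closed form: given the score s_p(x) = ∇x log p(x), which is computable even when p is unnormalized because the normalizing constant drops out of the gradient, the squared discrepancy is estimated by a U-statistic over sample pairs. The sketch below implements this standard fixed-dimensional estimator with an RBF kernel; it is illustrative only (helper names are ours) and does not reproduce the paper's variable-dimension Stein operators.

```python
import numpy as np

def ksd_ustat(X, score_fn, h=1.0):
    """U-statistic estimate of KSD^2 between samples X ~ q and a model p,
    using the Langevin Stein operator and an RBF kernel with bandwidth h.
    Only score_fn(x) = grad_x log p(x) is needed, so p may be unnormalized."""
    n, d = X.shape
    S = np.apply_along_axis(score_fn, 1, X)           # (n, d) scores at each sample
    diff = X[:, None, :] - X[None, :, :]              # (n, n, d) pairwise x - y
    sqdist = np.sum(diff ** 2, axis=-1)               # (n, n) squared distances
    K = np.exp(-sqdist / (2 * h ** 2))                # RBF kernel matrix
    grad_y = diff / h ** 2                            # grad_y k(x, y) divided by k(x, y)
    term1 = (S @ S.T) * K                             # s_p(x)^T s_p(y) k(x, y)
    term2 = np.einsum('id,ijd->ij', S, grad_y) * K    # s_p(x)^T grad_y k(x, y)
    term3 = np.einsum('jd,ijd->ij', S, -grad_y) * K   # s_p(y)^T grad_x k(x, y)
    term4 = (d / h ** 2 - sqdist / h ** 4) * K        # trace(grad_x grad_y k(x, y))
    U = term1 + term2 + term3 + term4                 # Stein kernel u_p(x_i, x_j)
    np.fill_diagonal(U, 0.0)                          # U-statistic drops i == j terms
    return U.sum() / (n * (n - 1))

# Example: samples from N(0, I) tested against a standard normal model.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
print(ksd_ustat(X, score_fn=lambda x: -x))            # concentrates near 0 under the null
```

In a test, the threshold for this statistic is typically calibrated with a bootstrap; the paper's contribution is to identify Stein operators that make the same recipe work when observations vary in dimension.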
Related papers
- Minimax Optimal Goodness-of-Fit Testing with Kernel Stein Discrepancy [13.429541377715298]
We explore the minimax optimality of goodness-of-fit tests on general domains using the kernelized Stein discrepancy (KSD).
The KSD framework offers a flexible approach for goodness-of-fit testing, avoiding strong distributional assumptions.
We introduce an adaptive test capable of achieving minimax optimality up to a logarithmic factor by adapting to unknown parameters.
arXiv Detail & Related papers (2024-04-12T07:06:12Z)
- Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy [3.78967502155084]
Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used in goodness-of-fit tests.
We show theoretically and empirically that the KSD test can suffer from low power when the target and the alternative distributions have the same well-separated modes but differ in mixing proportions.
arXiv Detail & Related papers (2023-04-28T11:13:18Z)
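The low-power failure mode above has a quick numerical illustration: for a well-separated Gaussian mixture, the score ∇x log p near either mode is essentially independent of the mixing weights, so any purely score-based statistic such as the KSD can barely tell apart mixtures that differ only in proportions. A minimal check, on a toy setup of our own rather than anything from the paper:

```python
import numpy as np

def mixture_score(x, w, mu1=-10.0, mu2=10.0):
    """d/dx log p for a two-component unit-variance Gaussian mixture
    with weights (w, 1 - w) and well-separated means mu1 and mu2."""
    p1 = w * np.exp(-0.5 * (x - mu1) ** 2)            # unnormalized component masses
    p2 = (1 - w) * np.exp(-0.5 * (x - mu2) ** 2)
    return (-(x - mu1) * p1 - (x - mu2) * p2) / (p1 + p2)

z = np.linspace(-3.0, 3.0, 13)
x = np.concatenate([z - 10.0, z + 10.0])              # points around each mode
# An equal mixture and a 0.9/0.1 mixture have nearly identical scores
# everywhere samples actually land:
print(np.max(np.abs(mixture_score(x, 0.5) - mixture_score(x, 0.9))))
```

Samples concentrate around the modes, where the two scores coincide, which is why a KSD test comparing such mixtures has little power without the perturbation trick.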
- AdaNPC: Exploring Non-Parametric Classifier for Test-Time Adaptation [64.9230895853942]
Domain generalization can be arbitrarily hard without exploiting target domain information.
Test-time adaptive (TTA) methods are proposed to address this issue.
In this work, we adopt a Non-Parametric Classifier to perform test-time Adaptation (AdaNPC).
arXiv Detail & Related papers (2023-04-25T04:23:13Z)
- Controlling Moments with Kernel Stein Discrepancies [74.82363458321939]
Kernel Stein discrepancies (KSDs) measure the quality of a distributional approximation.
We first show that standard KSDs used for weak convergence control fail to control moment convergence.
We then provide sufficient conditions under which alternative diffusion KSDs control both moment and weak convergence.
arXiv Detail & Related papers (2022-11-10T08:24:52Z)
- Concrete Score Matching: Generalized Score Matching for Discrete Data [109.12439278055213]
"Concrete score" is a generalization of the (Stein) score for discrete settings.
"Concrete Score Matching" is a framework to learn such scores from samples.
arXiv Detail & Related papers (2022-11-02T00:41:37Z)
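As a rough illustration of the idea above: in discrete spaces, the gradient in the score is replaced by finite differences of probabilities toward neighbouring states. The sketch below assumes the neighbourhood-ratio form c_p(x)[i] = (p(n_i(x)) - p(x)) / p(x); the paper's exact conventions may differ, and all names here are ours.

```python
import numpy as np

def concrete_score(p, x, neighbors):
    """Finite-difference analogue of grad log p at a discrete state x:
    one entry (p(n) - p(x)) / p(x) per neighbor n of x. The ratio form
    survives even when p is known only up to a normalizing constant."""
    return np.array([(p[n] - p[x]) / p[x] for n in neighbors(x)])

# Toy categorical distribution over {0, ..., 9} with cyclic +/-1 neighborhoods.
logits = np.linspace(0.0, 2.0, 10)
p = np.exp(logits) / np.exp(logits).sum()
nbrs = lambda x: [(x - 1) % 10, (x + 1) % 10]
print(concrete_score(p, 5, nbrs))
```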
- A Fourier representation of kernel Stein discrepancy with application to Goodness-of-Fit tests for measures on infinite dimensional Hilbert spaces [6.437931786032493]
Kernel Stein discrepancy (KSD) is a kernel-based measure of discrepancy between probability measures.
We provide the first analysis of KSD in the generality of data lying in a separable Hilbert space.
This allows us to prove that KSD can separate measures and thus is valid to use in practice.
arXiv Detail & Related papers (2022-06-09T15:04:18Z)
- Generalised Kernel Stein Discrepancy (GKSD): A Unifying Approach for Non-parametric Goodness-of-fit Testing [5.885020100736158]
Non-parametric goodness-of-fit testing procedures based on kernel Stein discrepancies (KSD) are promising approaches to validate general unnormalised distributions.
We propose a unifying framework, the generalised kernel Stein discrepancy (GKSD), to theoretically compare and interpret different Stein operators used in KSD-based goodness-of-fit tests.
arXiv Detail & Related papers (2021-06-23T00:44:31Z)
- Uncertainty Inspired RGB-D Saliency Detection [70.50583438784571]
We propose the first framework to employ uncertainty for RGB-D saliency detection by learning from the data labeling process.
Inspired by the saliency data labeling process, we propose a generative architecture to achieve probabilistic RGB-D saliency detection.
Results on six challenging RGB-D benchmark datasets show our approach's superior performance in learning the distribution of saliency maps.
arXiv Detail & Related papers (2020-09-07T13:01:45Z)
- Sliced Kernelized Stein Discrepancy [17.159499204595527]
Kernelized Stein discrepancy (KSD) is extensively used in goodness-of-fit tests and model learning.
We propose the sliced Stein discrepancy and its scalable and kernelized variants, which employ kernel-based test functions defined on the optimal one-dimensional projections.
For model learning, we show its advantages over existing Stein discrepancy baselines by training independent component analysis models with different discrepancies.
arXiv Detail & Related papers (2020-06-30T04:58:55Z)
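A sketch of a single slice of such a discrepancy: the score is projected onto a direction r and the test-function input onto a direction g, giving the Stein operator A_p f(x) = r^T s_p(x) f(x^T g) + (r^T g) f'(x^T g), which has mean zero under p by the usual integration-by-parts argument; kernelizing yields a closed-form U-statistic. Fixed directions stand in below for the optimal learned projections of the paper, and the helper names are ours.

```python
import numpy as np

def sliced_ksd_ustat(X, score_fn, r, g, h=1.0):
    """U-statistic for one (r, g) slice of a kernelized Stein discrepancy:
    scores projected on r, inputs projected on g, 1-D RBF kernel."""
    n = X.shape[0]
    S = np.apply_along_axis(score_fn, 1, X)       # (n, d) scores at each sample
    sr = S @ r                                    # projected scores r^T s_p(x)
    xb = X @ g                                    # projected inputs x^T g
    rg = float(r @ g)
    d = xb[:, None] - xb[None, :]                 # pairwise projection differences
    K = np.exp(-d ** 2 / (2 * h ** 2))            # 1-D RBF kernel matrix
    dK_y = d / h ** 2 * K                         # d k / d(y^T g)
    dK_x = -dK_y                                  # d k / d(x^T g)
    dK_xy = (1 / h ** 2 - d ** 2 / h ** 4) * K    # second mixed derivative of k
    U = (np.outer(sr, sr) * K + rg * sr[:, None] * dK_y
         + rg * sr[None, :] * dK_x + rg ** 2 * dK_xy)
    np.fill_diagonal(U, 0.0)                      # U-statistic drops i == j terms
    return U.sum() / (n * (n - 1))

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
r = g = np.array([1.0, 0.0, 0.0])
print(sliced_ksd_ustat(X, lambda x: -x, r, g))    # near 0 under the null
```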
- Density of States Estimation for Out-of-Distribution Detection [69.90130863160384]
DoSE, the density of states estimator, is an unsupervised approach to out-of-distribution (OOD) detection.
We demonstrate DoSE's state-of-the-art performance against other unsupervised OOD detectors.
arXiv Detail & Related papers (2020-06-16T16:06:25Z)
- A Kernel Stein Test for Comparing Latent Variable Models [48.32146056855925]
We propose a kernel-based nonparametric test of relative goodness of fit, where the goal is to compare two models, both of which may have unobserved latent variables.
We show that our test significantly outperforms the relative Maximum Mean Discrepancy test, which is based on samples from the models and does not exploit the latent structure.
arXiv Detail & Related papers (2019-07-01T07:46:16Z)
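For context, a minimal sketch of the sample-based relative-MMD baseline mentioned in this last entry: draw samples from each candidate model, estimate MMD^2 against the data for both, and compare. The published test additionally accounts for the variance of the difference between the two estimates; this toy version, with a setup of our own, only compares point estimates.

```python
import numpy as np

def mmd2_ustat(X, Y, h=1.0):
    """Unbiased estimate of MMD^2 between sample sets X and Y (RBF kernel)."""
    def kmat(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2 * h ** 2))
    Kxx, Kyy, Kxy = kmat(X, X), kmat(Y, Y), kmat(X, Y)
    n, m = len(X), len(Y)
    np.fill_diagonal(Kxx, 0.0)                    # drop diagonals for unbiasedness
    np.fill_diagonal(Kyy, 0.0)
    return Kxx.sum() / (n * (n - 1)) + Kyy.sum() / (m * (m - 1)) - 2 * Kxy.mean()

# Relative fit: which model's samples sit closer to the data?
rng = np.random.default_rng(0)
data = rng.standard_normal((300, 2))
model1 = rng.standard_normal((300, 2))            # well-specified model
model2 = rng.standard_normal((300, 2)) + 0.5      # misspecified (shifted) model
print(mmd2_ustat(model1, data) < mmd2_ustat(model2, data))  # True, typically
```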