On RKHS Choices for Assessing Graph Generators via Kernel Stein
Statistics
- URL: http://arxiv.org/abs/2210.05746v1
- Date: Tue, 11 Oct 2022 19:23:33 GMT
- Title: On RKHS Choices for Assessing Graph Generators via Kernel Stein
Statistics
- Authors: Moritz Weckbecker, Wenkai Xu, Gesine Reinert
- Abstract summary: We assess the effect of RKHS choice for KSD tests of random network models.
We investigate the power performance and the computational runtime of the test in different scenarios.
- Score: 8.987015146366216
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Score-based kernelised Stein discrepancy (KSD) tests have emerged as a
powerful tool for goodness-of-fit tests, especially in high dimensions;
however, the test performance may depend on the choice of kernel in the
underlying reproducing kernel Hilbert space (RKHS). Here we assess the effect
of RKHS choice for KSD tests of random network models, developed for
exponential random graph models (ERGMs) in Xu and Reinert (2021) and for
synthetic graph generators in Xu and Reinert (2022). We investigate the power
performance and the computational runtime of the test in different scenarios,
including both dense and sparse graph regimes. Experimental results on kernel
performance for model assessment tasks are presented and discussed for
synthetic and real-world network applications.
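For orientation, the block below recalls the standard score-based form of the squared KSD on R^d, included here only as a minimal sketch of where the RKHS kernel k, and hence the kernel choice studied in this paper, enters the statistic; the graph tests of Xu and Reinert replace the score term with Stein operators tailored to ERGMs and graph generators, which are not reproduced here. A hedged code sketch after the related-papers list below illustrates the same point numerically.

```latex
% Standard score-based KSD on R^d (illustrative, not the graph Stein operator):
% the target density p has score s_p(x) = \nabla_x \log p(x), k is the RKHS
% kernel, and the Stein kernel u_p is
\[
  u_p(x, x') = s_p(x)^\top s_p(x')\, k(x, x')
             + s_p(x)^\top \nabla_{x'} k(x, x')
             + s_p(x')^\top \nabla_{x} k(x, x')
             + \nabla_x \cdot \nabla_{x'} k(x, x') ,
\]
\[
  \mathrm{KSD}^2(q \,\|\, p) = \mathbb{E}_{x, x' \sim q}\big[ u_p(x, x') \big] ,
\]
% so the test statistic, and therefore the power and runtime of the test,
% depend on the choice of k (equivalently, of the RKHS).
```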
Related papers
- Optimal Kernel Choice for Score Function-based Causal Discovery [92.65034439889872]
We propose a kernel selection method within the generalized score function that automatically selects the optimal kernel that best fits the data.
We conduct experiments on both synthetic data and real-world benchmarks, and the results demonstrate that our proposed method outperforms existing kernel selection methods.
arXiv Detail & Related papers (2024-07-14T09:32:20Z) - Minimax Optimal Goodness-of-Fit Testing with Kernel Stein Discrepancy [13.429541377715298]
We explore the minimax optimality of goodness-of-fit tests on general domains using the kernelized Stein discrepancy (KSD).
The KSD framework offers a flexible approach for goodness-of-fit testing, avoiding strong distributional assumptions.
We introduce an adaptive test capable of achieving minimax optimality up to a logarithmic factor by adapting to unknown parameters.
arXiv Detail & Related papers (2024-04-12T07:06:12Z) - FaDIn: Fast Discretized Inference for Hawkes Processes with General
Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point process inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach leads to improved estimation of pattern latency compared with the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z) - A Fourier representation of kernel Stein discrepancy with application to
Goodness-of-Fit tests for measures on infinite dimensional Hilbert spaces [6.437931786032493]
Kernel Stein discrepancy (KSD) is a kernel-based measure of discrepancy between probability measures.
We provide the first analysis of KSD in the generality of data lying in a separable Hilbert space.
This allows us to prove that KSD can separate measures and thus is valid to use in practice.
arXiv Detail & Related papers (2022-06-09T15:04:18Z) - Scaling Structured Inference with Randomization [64.18063627155128]
We propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.
Our method is widely applicable to classical DP-based inference.
It is also compatible with automatic differentiation, so it can be integrated seamlessly with neural networks.
arXiv Detail & Related papers (2021-12-07T11:26:41Z) - Hybrid Random Features [60.116392415715275]
We propose a new class of random feature methods for linearizing softmax and Gaussian kernels, called hybrid random features (HRFs).
HRFs automatically adapt the quality of kernel estimation to provide the most accurate approximation in the defined regions of interest.
arXiv Detail & Related papers (2021-10-08T20:22:59Z) - Generalised Kernel Stein Discrepancy (GKSD): A Unifying Approach for
Non-parametric Goodness-of-fit Testing [5.885020100736158]
Non-parametric goodness-of-fit testing procedures based on kernel Stein discrepancies (KSD) are promising approaches to validate general unnormalised distributions.
We propose a unifying framework, the generalised kernel Stein discrepancy (GKSD), to theoretically compare and interpret different Stein operators in performing the KSD-based goodness-of-fit tests.
arXiv Detail & Related papers (2021-06-23T00:44:31Z) - A Stein Goodness of fit Test for Exponential Random Graph Models [5.885020100736158]
We propose and analyse a novel nonparametric goodness of fit testing procedure for exchangeable exponential random graph models.
The test determines how likely it is that the observation is generated from a target unnormalised ERGM density.
arXiv Detail & Related papers (2021-02-28T18:16:41Z) - Kernel Stein Generative Modeling [68.03537693810972]
Stochastic Gradient Langevin Dynamics (SGLD) demonstrates impressive results with energy-based models on high-dimensional and complex data distributions.
Stein Variational Gradient Descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate a given distribution.
We propose noise conditional kernel SVGD (NCK-SVGD), which works in tandem with the recently introduced Noise Conditional Score Network estimator.
arXiv Detail & Related papers (2020-07-06T21:26:04Z) - Sliced Kernelized Stein Discrepancy [17.159499204595527]
Kernelized Stein discrepancy (KSD) is extensively used in goodness-of-fit tests and model learning.
We propose the sliced Stein discrepancy and its scalable and kernelized variants, which employ kernel-based test functions defined on the optimal one-dimensional projections.
For model learning, we show its advantages over existing Stein discrepancy baselines by training independent component analysis models with different discrepancies.
arXiv Detail & Related papers (2020-06-30T04:58:55Z) - Learning Deep Kernels for Non-Parametric Two-Sample Tests [50.92621794426821]
We propose a class of kernel-based two-sample tests, which aim to determine whether two sets of samples are drawn from the same distribution.
Our tests are constructed from kernels parameterized by deep neural nets, trained to maximize test power.
arXiv Detail & Related papers (2020-02-21T03:54:23Z)
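As noted above, the sketch below is a minimal, illustrative Python implementation of the V-statistic estimator of KSD^2 for a simple Euclidean target with a known score, written so that the base kernel (Gaussian RBF vs. inverse multi-quadric) is swappable. It is not the authors' code and does not implement the graph Stein operators used in the paper; the function names, the standard-normal target, and the kernel parameters are illustrative assumptions, meant only to show how the RKHS choice enters the statistic.

```python
# Minimal sketch (not the paper's implementation): V-statistic estimate of
# KSD^2 on R^d with a swappable base kernel, for a target with known score.
import numpy as np


def rbf_stein_kernel(x, y, score_x, score_y, sigma=1.0):
    """Stein kernel u_p(x, y) built on a Gaussian RBF base kernel."""
    d = x.shape[0]
    r = x - y
    k = np.exp(-(r @ r) / (2.0 * sigma**2))
    grad_x_k = -r / sigma**2 * k      # gradient of k in its first argument
    grad_y_k = r / sigma**2 * k       # gradient of k in its second argument
    trace_term = (d / sigma**2 - (r @ r) / sigma**4) * k
    return (score_x @ score_y * k
            + score_x @ grad_y_k
            + score_y @ grad_x_k
            + trace_term)


def imq_stein_kernel(x, y, score_x, score_y, c=1.0, beta=-0.5):
    """Stein kernel u_p(x, y) built on an inverse multi-quadric base kernel."""
    d = x.shape[0]
    r = x - y
    s = c**2 + r @ r
    k = s**beta
    grad_x_k = 2.0 * beta * r * s**(beta - 1)
    grad_y_k = -grad_x_k
    trace_term = (-2.0 * beta * d * s**(beta - 1)
                  - 4.0 * beta * (beta - 1) * (r @ r) * s**(beta - 2))
    return (score_x @ score_y * k
            + score_x @ grad_y_k
            + score_y @ grad_x_k
            + trace_term)


def ksd_squared(samples, score_fn, stein_kernel, **kernel_args):
    """V-statistic estimate of KSD^2 for the samples under the model score."""
    n = len(samples)
    scores = np.array([score_fn(x) for x in samples])
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += stein_kernel(samples[i], samples[j],
                                  scores[i], scores[j], **kernel_args)
    return total / n**2


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def score(x):
        # Score of a standard normal target: grad log p(x) = -x.
        return -x

    from_model = rng.normal(size=(100, 3))        # samples matching the target
    shifted = rng.normal(loc=1.0, size=(100, 3))  # samples from a shifted model
    for name, kernel in [("RBF", rbf_stein_kernel), ("IMQ", imq_stein_kernel)]:
        print(name,
              round(ksd_squared(from_model, score, kernel), 4),
              round(ksd_squared(shifted, score, kernel), 4))
```

Running this should show noticeably larger KSD^2 estimates for the shifted samples under either kernel; the paper's focus is on how such kernel (RKHS) choices trade off test power and computational runtime for random graph models.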
This list is automatically generated from the titles and abstracts of the papers in this site.