A Stein Goodness of fit Test for Exponential Random Graph Models
- URL: http://arxiv.org/abs/2103.00580v1
- Date: Sun, 28 Feb 2021 18:16:41 GMT
- Title: A Stein Goodness of fit Test for Exponential Random Graph Models
- Authors: Wenkai Xu and Gesine Reinert
- Abstract summary: We propose and analyse a novel nonparametric goodness of fit testing procedure for exchangeable exponential random graph models.
The test determines how likely it is that the observation is generated from a target unnormalised ERGM density.
- Score: 5.885020100736158
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose and analyse a novel nonparametric goodness of fit testing
procedure for exchangeable exponential random graph models (ERGMs) when a
single network realisation is observed. The test determines how likely it is
that the observation is generated from a target unnormalised ERGM density. Our
test statistics are derived from a kernel Stein discrepancy, a divergence
constructed via Stein's method using functions in a reproducing kernel Hilbert
space, combined with a discrete Stein operator for ERGMs. The test is a Monte
Carlo test based on simulated networks from the target ERGM. We show
theoretical properties for the testing procedure for a class of ERGMs.
Simulation studies and real network applications are presented.
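For orientation, below is a minimal sketch of the Monte Carlo testing recipe described in the abstract: compute a kernel Stein discrepancy statistic for the observed network against the target unnormalised ERGM density, recompute it on networks simulated from that ERGM, and compare. The helpers `ksd_statistic` and `sample_from_ergm` are hypothetical placeholders (standing in for the paper's graph kernel Stein statistic and an ERGM sampler such as a Glauber-dynamics chain); this is an illustration of the general procedure, not the authors' implementation.
```python
import numpy as np

def monte_carlo_gof_test(observed_graph, ksd_statistic, sample_from_ergm,
                         n_sim=200, alpha=0.05, rng=None):
    """Monte Carlo goodness-of-fit test based on a Stein discrepancy statistic.

    `ksd_statistic` maps an adjacency matrix to a scalar kernel Stein
    discrepancy value computed against the target unnormalised ERGM density;
    `sample_from_ergm` draws one network from the target ERGM (e.g. via an
    MCMC sampler).  Both are user-supplied placeholders in this sketch.
    """
    rng = np.random.default_rng(rng)

    # Statistic on the observed network.
    t_obs = ksd_statistic(observed_graph)

    # Recompute the statistic on networks simulated from the target ERGM,
    # giving a reference distribution under the null hypothesis.
    t_sim = np.array([ksd_statistic(sample_from_ergm(rng))
                      for _ in range(n_sim)])

    # One-sided Monte Carlo p-value: large discrepancies indicate poor fit.
    p_value = (1 + np.sum(t_sim >= t_obs)) / (n_sim + 1)
    return p_value, p_value <= alpha
```
Under these assumptions, a small p-value indicates that the observed network is unlikely to have been generated from the target ERGM.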
Related papers
- A Kernel-Based Conditional Two-Sample Test Using Nearest Neighbors (with Applications to Calibration, Regression Curves, and Simulation-Based Inference) [3.622435665395788]
We introduce a kernel-based measure for detecting differences between two conditional distributions.
When the two conditional distributions are the same, the estimate has a Gaussian limit and its variance has a simple form that can be easily estimated from the data.
We also provide a resampling based test using our estimate that applies to the conditional goodness-of-fit problem.
arXiv Detail & Related papers (2024-07-23T15:04:38Z)
- SteinGen: Generating Fidelitous and Diverse Graph Samples [11.582357781579997]
We introduce SteinGen to generate graphs from only one observed graph.
We show that SteinGen yields high distributional similarity (high fidelity) to the original data, combined with high sample diversity.
arXiv Detail & Related papers (2024-03-27T13:59:05Z)
- On RKHS Choices for Assessing Graph Generators via Kernel Stein Statistics [8.987015146366216]
We assess the effect of RKHS choice for KSD tests of random network models.
We investigate the power performance and the computational runtime of the test in different scenarios.
arXiv Detail & Related papers (2022-10-11T19:23:33Z)
- A Fourier representation of kernel Stein discrepancy with application to Goodness-of-Fit tests for measures on infinite dimensional Hilbert spaces [6.437931786032493]
Kernel Stein discrepancy (KSD) is a kernel-based measure of discrepancy between probability measures.
We provide the first analysis of KSD in the generality of data lying in a separable Hilbert space.
This allows us to prove that KSD can separate measures and thus is valid to use in practice.
arXiv Detail & Related papers (2022-06-09T15:04:18Z)
- Nonparametric Conditional Local Independence Testing [69.31200003384122]
Conditional local independence is an independence relation among continuous time processes.
No nonparametric test of conditional local independence has been available.
We propose such a nonparametric test based on double machine learning.
arXiv Detail & Related papers (2022-03-25T10:31:02Z)
- Composite Goodness-of-fit Tests with Kernels [19.744607024807188]
We propose kernel-based hypothesis tests for the challenging composite testing problem.
Our tests make use of minimum distance estimators based on the maximum mean discrepancy and the kernel Stein discrepancy.
As our main result, we show that we are able to estimate the parameter and conduct our test on the same data, while maintaining a correct test level.
arXiv Detail & Related papers (2021-11-19T15:25:06Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Deterministic Gibbs Sampling via Ordinary Differential Equations [77.42706423573573]
This paper presents a general construction of deterministic measure-preserving dynamics using autonomous ODEs and tools from differential geometry.
We show how Hybrid Monte Carlo and other deterministic samplers follow as special cases of our theory.
arXiv Detail & Related papers (2021-06-18T15:36:09Z)
- Continual Learning with Fully Probabilistic Models [70.3497683558609]
We present an approach for continual learning based on fully probabilistic (or generative) models of machine learning.
We propose a pseudo-rehearsal approach using a Gaussian Mixture Model (GMM) instance for both generator and classifier functionalities.
We show that GMR achieves state-of-the-art performance on common class-incremental learning problems at very competitive time and memory complexity.
arXiv Detail & Related papers (2021-04-19T12:26:26Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Uncertainty Inspired RGB-D Saliency Detection [70.50583438784571]
We propose the first framework to employ uncertainty for RGB-D saliency detection by learning from the data labeling process.
Inspired by the saliency data labeling process, we propose a generative architecture to achieve probabilistic RGB-D saliency detection.
Results on six challenging RGB-D benchmark datasets show our approach's superior performance in learning the distribution of saliency maps.
arXiv Detail & Related papers (2020-09-07T13:01:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.