A Permutation-free Kernel Two-Sample Test
- URL: http://arxiv.org/abs/2211.14908v1
- Date: Sun, 27 Nov 2022 18:15:52 GMT
- Title: A Permutation-free Kernel Two-Sample Test
- Authors: Shubhanshu Shekhar, Ilmun Kim, Aaditya Ramdas
- Abstract summary: We propose a new quadratic-time MMD test statistic based on sample-splitting and studentization.
For large sample sizes, our new cross-MMD provides a significant speedup over the MMD, for only a slight loss in power.
- Score: 36.50719125230106
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The kernel Maximum Mean Discrepancy (MMD) is a popular multivariate distance
metric between distributions that has found utility in two-sample testing. The
usual kernel-MMD test statistic is a degenerate U-statistic under the null, and
thus it has an intractable limiting distribution. Hence, to design a
level-$\alpha$ test, one usually selects the rejection threshold as the
$(1-\alpha)$-quantile of the permutation distribution. The resulting
nonparametric test has finite-sample validity but suffers from large
computational cost, since every permutation takes quadratic time. We propose
the cross-MMD, a new quadratic-time MMD test statistic based on
sample-splitting and studentization. We prove that under mild assumptions, the
cross-MMD has a limiting standard Gaussian distribution under the null.
Importantly, we also show that the resulting test is consistent against any
fixed alternative, and when using the Gaussian kernel, it has minimax
rate-optimal power against local alternatives. For large sample sizes, our new
cross-MMD provides a significant speedup over the MMD, for only a slight loss
in power.
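As a concrete illustration of the construction described in the abstract, here is a minimal sketch of a sample-split, studentized MMD statistic compared against a standard Gaussian quantile. The Gaussian kernel, the fixed bandwidth, the helper names (`gaussian_kernel`, `cross_mmd_test`), and the exact Welch-style studentization are illustrative assumptions rather than the authors' reference implementation; see the paper for the precise definition.

```python
# A minimal sketch of a sample-split, studentized (cross-style) MMD test.
# Assumptions: Gaussian kernel, equal halves per sample, Welch-style
# studentization; not the authors' reference implementation.
import numpy as np
from scipy.stats import norm


def gaussian_kernel(A, B, bandwidth=1.0):
    """Gaussian kernel matrix k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))."""
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq_dists / (2.0 * bandwidth**2))


def cross_mmd_test(X, Y, alpha=0.05, bandwidth=1.0, seed=0):
    """Return the studentized cross statistic and the level-alpha decision."""
    rng = np.random.default_rng(seed)
    X, Y = rng.permutation(X), rng.permutation(Y)   # random sample split
    n, m = len(X) // 2, len(Y) // 2
    X1, X2 = X[:n], X[n:]
    Y1, Y2 = Y[:m], Y[m:]

    # Cross terms: points in the first halves are compared only against the
    # second halves, which avoids the degenerate-U-statistic behaviour under
    # the null and yields an approximately Gaussian studentized statistic.
    U = gaussian_kernel(X1, X2, bandwidth).mean(1) - gaussian_kernel(X1, Y2, bandwidth).mean(1)
    V = gaussian_kernel(Y1, X2, bandwidth).mean(1) - gaussian_kernel(Y1, Y2, bandwidth).mean(1)

    # Studentized difference of means; the threshold is a fixed N(0,1)
    # quantile, so no permutation distribution has to be simulated.
    stat = (U.mean() - V.mean()) / np.sqrt(U.var(ddof=1) / n + V.var(ddof=1) / m)
    return stat, stat > norm.ppf(1 - alpha)


# Example: two 5-dimensional Gaussian samples whose means differ.
X = np.random.default_rng(1).normal(0.0, 1.0, size=(500, 5))
Y = np.random.default_rng(2).normal(0.5, 1.0, size=(500, 5))
print(cross_mmd_test(X, Y))
```

The computational point from the abstract carries over to this sketch: the statistic still takes quadratic time to evaluate once, but the rejection threshold is a fixed Gaussian quantile rather than the $(1-\alpha)$-quantile of a permutation distribution, so the quadratic cost is paid only once.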
Related papers
- Computational-Statistical Trade-off in Kernel Two-Sample Testing with Random Fourier Features [3.744589644319257]
The Maximum Mean Discrepancy (MMD) test has emerged as an effective tool for handling complex and high-dimensional data.
It has been unclear whether it is possible to attain the same power guarantee as the MMD test at sub-quadratic time cost.
We show that it is possible to attain the same minimax separation rates as the MMD test within sub-quadratic time.
arXiv Detail & Related papers (2024-07-12T04:08:01Z) - Partial identification of kernel based two sample tests with mismeasured
data [5.076419064097733]
Two-sample tests such as the Maximum Mean Discrepancy (MMD) are often used to detect differences between two distributions in machine learning applications.
We study the estimation of the MMD under $\epsilon$-contamination, where a possibly non-random $\epsilon$ proportion of one distribution is erroneously grouped with the other, and derive bounds that partially identify the MMD in this setting.
We propose a method to estimate these bounds, and show that the estimates converge to the sharpest possible bounds on the MMD as the sample size increases.
arXiv Detail & Related papers (2023-08-07T13:21:58Z) - Stochastic Approximation Approaches to Group Distributionally Robust
Optimization [96.26317627118912]
The paper studies group distributionally robust optimization (GDRO).
Online learning techniques reduce the number of samples required in each round from $m$ to $1$ while keeping the same sample complexity.
A novel formulation of weighted GDRO allows distribution-dependent convergence rates to be derived.
arXiv Detail & Related papers (2023-02-18T09:24:15Z) - Spectral Regularized Kernel Two-Sample Tests [7.915420897195129]
We show that the popular MMD (maximum mean discrepancy) two-sample test is not optimal in terms of the separation boundary measured in Hellinger distance.
We propose a modification to the MMD test based on spectral regularization and prove the proposed test to be minimax optimal with a smaller separation boundary than that achieved by the MMD test.
Our results hold for the permutation variant of the test, where the test threshold is chosen through permutation of the samples.
arXiv Detail & Related papers (2022-12-19T00:42:21Z) - Efficient Aggregated Kernel Tests using Incomplete $U$-statistics [22.251118308736327]
The three proposed tests aggregate over several kernel bandwidths to detect departures from the null on various scales.
We show that our proposed linear-time aggregated tests obtain higher power than current state-of-the-art linear-time kernel tests.
arXiv Detail & Related papers (2022-06-18T12:30:06Z) - MMD Aggregated Two-Sample Test [31.116276769013204]
We propose two novel non-parametric two-sample kernel tests based on the Maximum Mean Discrepancy (MMD).
First, for a fixed kernel, we construct an MMD test using either permutations or a wild bootstrap, two popular numerical procedures to determine the test threshold.
We prove that this test controls the level non-asymptotically, and achieves the minimax rate over Sobolev balls, up to an iterated logarithmic term.
arXiv Detail & Related papers (2021-10-28T12:47:49Z) - A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z) - Maximum Mean Discrepancy Test is Aware of Adversarial Attacks [122.51040127438324]
The maximum mean discrepancy (MMD) test could in principle detect any distributional discrepancy between two datasets.
However, prior work had shown the MMD test to be unaware of adversarial attacks; this paper shows that a properly designed MMD test is in fact aware of them.
arXiv Detail & Related papers (2020-10-22T03:42:12Z) - Optimal Testing of Discrete Distributions with High Probability [49.19942805582874]
We study the problem of testing discrete distributions with a focus on the high probability regime.
We provide the first algorithms for closeness and independence testing that are sample-optimal, within constant factors.
arXiv Detail & Related papers (2020-09-14T16:09:17Z) - Computationally efficient sparse clustering [67.95910835079825]
We provide a finite sample analysis of a new clustering algorithm based on PCA.
We show that it achieves the minimax optimal misclustering rate in the regime $\|\theta\| \to \infty$.
arXiv Detail & Related papers (2020-05-21T17:51:30Z)