Kernel Robust Hypothesis Testing
- URL: http://arxiv.org/abs/2203.12777v3
- Date: Sat, 5 Aug 2023 16:17:24 GMT
- Title: Kernel Robust Hypothesis Testing
- Authors: Zhongchang Sun and Shaofeng Zou
- Abstract summary: In this paper, uncertainty sets are constructed in a data-driven manner using the kernel method.
The goal is to design a test that performs well under the worst-case distributions over the uncertainty sets.
For the Neyman-Pearson setting, the goal is to minimize the worst-case probability of miss detection subject to a constraint on the worst-case probability of false alarm.
- Score: 20.78285964841612
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The problem of robust hypothesis testing is studied, where under the null and
the alternative hypotheses, the data-generating distributions are assumed to be
in some uncertainty sets, and the goal is to design a test that performs well
under the worst-case distributions over the uncertainty sets. In this paper,
uncertainty sets are constructed in a data-driven manner using the kernel method,
i.e., they are centered around empirical distributions of training samples from
the null and alternative hypotheses, respectively; and are constrained via the
distance between kernel mean embeddings of distributions in the reproducing
kernel Hilbert space, i.e., maximum mean discrepancy (MMD). The Bayesian
setting and the Neyman-Pearson setting are investigated. For the Bayesian
setting where the goal is to minimize the worst-case error probability, an
optimal test is first obtained when the alphabet is finite. When the alphabet
is infinite, a tractable approximation is proposed to quantify the worst-case
average error probability, and a kernel smoothing method is further applied to
design a test that generalizes to unseen samples. A direct robust kernel test is
also proposed and proved to be exponentially consistent. For the Neyman-Pearson
setting, where the goal is to minimize the worst-case probability of miss
detection subject to a constraint on the worst-case probability of false alarm,
an efficient robust kernel test is proposed and is shown to be asymptotically
optimal. Numerical results are provided to demonstrate the performance of the
proposed robust tests.
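For concreteness, the sketch below (not taken from the paper) illustrates the core quantity behind the data-driven uncertainty sets: an empirical estimate of the maximum mean discrepancy (MMD) between two sample sets, together with a simple nearest-MMD decision rule in the spirit of the direct robust kernel test. The Gaussian kernel, the bandwidth, the function names, and the decision rule are illustrative assumptions; the paper's actual tests, uncertainty-set radii, and thresholds are specified in the paper itself.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def mmd_squared(X, Y, bandwidth=1.0):
    """Biased (V-statistic) estimate of the squared MMD between the
    empirical distributions of the samples in X and Y."""
    k_xx = gaussian_kernel(X, X, bandwidth)
    k_yy = gaussian_kernel(Y, Y, bandwidth)
    k_xy = gaussian_kernel(X, Y, bandwidth)
    return k_xx.mean() + k_yy.mean() - 2.0 * k_xy.mean()

def kernel_decision(test_batch, train_null, train_alt, bandwidth=1.0):
    """Illustrative decision rule (an assumption, not the paper's exact test):
    declare the hypothesis whose training samples are closer to the test
    batch in MMD."""
    d0 = mmd_squared(test_batch, train_null, bandwidth)
    d1 = mmd_squared(test_batch, train_alt, bandwidth)
    return 0 if d0 <= d1 else 1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train_null = rng.normal(0.0, 1.0, size=(200, 1))  # training samples, H0
    train_alt = rng.normal(1.0, 1.0, size=(200, 1))   # training samples, H1
    test_batch = rng.normal(1.0, 1.0, size=(50, 1))   # unlabeled test data
    print("decision:", kernel_decision(test_batch, train_null, train_alt))
```

In this sketch, the MMD estimate plays the role of the distance that constrains the uncertainty sets around the empirical distributions of the training samples; the robust formulations in the paper then optimize over all distributions within a prescribed MMD radius of those centers.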
Related papers
- Robust Kernel Hypothesis Testing under Data Corruption [6.430258446597413]
We propose two general methods for constructing robust permutation tests under data corruption.
We prove their consistency in power under minimal conditions.
This contributes to the practical deployment of hypothesis tests for real-world applications with potential adversarial attacks.
arXiv Detail & Related papers (2024-05-30T10:23:16Z)
- Non-Convex Robust Hypothesis Testing using Sinkhorn Uncertainty Sets [18.46110328123008]
We present a new framework to address the non-convex robust hypothesis testing problem.
The goal is to seek the optimal detector that minimizes the worst-case risk.
arXiv Detail & Related papers (2024-03-21T20:29:43Z)
- Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines allowing reliable black-box posterior inference.
arXiv Detail & Related papers (2023-10-20T10:20:45Z)
- Sequential Predictive Two-Sample and Independence Testing [114.4130718687858]
We study the problems of sequential nonparametric two-sample and independence testing.
We build upon the principle of (nonparametric) testing by betting.
arXiv Detail & Related papers (2023-04-29T01:30:33Z)
- A Data-Driven Approach to Robust Hypothesis Testing Using Sinkhorn Uncertainty Sets [12.061662346636645]
We seek the optimal detector under the worst-case distributions over uncertainty sets centered around the empirical distribution of the samples, constructed using the Sinkhorn distance.
Compared with the Wasserstein robust test, the corresponding least favorable distributions are supported beyond the training samples, which provides a more flexible detector.
arXiv Detail & Related papers (2022-02-09T03:26:15Z)
- Optimal variance-reduced stochastic approximation in Banach spaces [114.8734960258221]
We study the problem of estimating the fixed point of a contractive operator defined on a separable Banach space.
We establish non-asymptotic bounds for both the operator defect and the estimation error.
arXiv Detail & Related papers (2022-01-21T02:46:57Z)
- Optimal policy evaluation using kernel-based temporal difference methods [78.83926562536791]
We use reproducing kernel Hilbert spaces for estimating the value function of an infinite-horizon discounted Markov reward process.
We derive a non-asymptotic upper bound on the error with explicit dependence on the eigenvalues of the associated kernel operator.
We prove minimax lower bounds over sub-classes of MRPs.
arXiv Detail & Related papers (2021-09-24T14:48:20Z)
- Near-optimal inference in adaptive linear regression [60.08422051718195]
Even simple methods like least squares can exhibit non-normal behavior when data is collected in an adaptive manner.
We propose a family of online debiasing estimators to correct these distributional anomalies in least squares estimation.
We demonstrate the usefulness of our theory via applications to multi-armed bandit, autoregressive time series estimation, and active learning with exploration.
arXiv Detail & Related papers (2021-07-05T21:05:11Z)
- Robust Uncertainty Bounds in Reproducing Kernel Hilbert Spaces: A Convex Optimization Approach [9.462535418331615]
It is known that out-of-sample bounds can be established at unseen input locations.
We show how computing tight, finite-sample uncertainty bounds amounts to solving parametrically constrained linear programs.
arXiv Detail & Related papers (2021-04-19T19:27:52Z)
- Certifying Neural Network Robustness to Random Input Noise from Samples [14.191310794366075]
Methods to certify the robustness of neural networks in the presence of input uncertainty are vital in safety-critical settings.
We propose a novel robustness certification method that upper bounds the probability of misclassification when the input noise follows an arbitrary probability distribution.
arXiv Detail & Related papers (2020-10-15T05:27:21Z)
- Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO) for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.