The Complexity of Adversarially Robust Proper Learning of Halfspaces
with Agnostic Noise
- URL: http://arxiv.org/abs/2007.15220v1
- Date: Thu, 30 Jul 2020 04:18:51 GMT
- Authors: Ilias Diakonikolas, Daniel M. Kane, Pasin Manurangsi
- Abstract summary: We study the computational complexity of adversarially robust proper learning of halfspaces in the distribution-independent PAC model.
We give a computationally efficient learning algorithm and a nearly matching computational hardness result for this problem.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the computational complexity of adversarially robust proper learning
of halfspaces in the distribution-independent agnostic PAC model, with a focus
on $L_p$ perturbations. We give a computationally efficient learning algorithm
and a nearly matching computational hardness result for this problem. An
interesting implication of our findings is that the $L_{\infty}$ perturbations
case is provably computationally harder than the case $2 \leq p < \infty$.
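As an illustration (not taken from the paper), the effect of an $L_p$ perturbation on a halfspace has a simple closed form via Hölder duality: the worst-case perturbation of radius $r$ reduces the margin $y\langle w, x\rangle$ by exactly $r\|w\|_q$, where $1/p + 1/q = 1$. A minimal sketch, assuming a homogeneous halfspace $x \mapsto \mathrm{sign}(\langle w, x\rangle)$ and $p \geq 2$:

```python
import numpy as np

def robust_margin(w, x, y, r, p):
    """Margin of the halfspace sign(<w, x>) on (x, y) under the worst-case
    L_p perturbation of radius r. By Holder duality,
    min over ||d||_p <= r of y*<w, x + d> = y*<w, x> - r*||w||_q,
    where 1/p + 1/q = 1 (q = 1 when p = infinity)."""
    q = 1.0 if np.isinf(p) else p / (p - 1.0)
    return y * np.dot(w, x) - r * np.linalg.norm(w, ord=q)

def robustly_correct(w, x, y, r, p):
    # (x, y) survives every L_p attack of radius r iff the robust margin is positive.
    return robust_margin(w, x, y, r, p) > 0

# Example: w = (3, 4), x = (1, 0), y = +1, radius r = 0.5.
# For p = 2   (q = 2): margin = 3 - 0.5 * 5 =  0.5  -> robustly correct.
# For p = inf (q = 1): margin = 3 - 0.5 * 7 = -0.5  -> not robustly correct,
# consistent with L_infinity being the hardest perturbation case.
```

The dual norm $\|w\|_q$ is why the geometry of the $L_p$ ball matters: as $p$ grows, $q$ shrinks toward 1 and the margin penalty $r\|w\|_q$ grows, which is in line with the paper's finding that $L_\infty$ is the computationally hardest case.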
Related papers
- Near-Optimal Bounds for Learning Gaussian Halfspaces with Random
Classification Noise [50.64137465792738]
We show that any efficient SQ algorithm for the problem requires sample complexity at least $\Omega(d^{1/2}/(\max\{p, \epsilon\})^2)$.
Our lower bound suggests that this quadratic dependence on $1/\epsilon$ is inherent for efficient algorithms.
arXiv Detail & Related papers (2023-07-13T18:59:28Z)
- Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise [50.64137465792738]
We study the problem of PAC learning $\gamma$-margin halfspaces with Random Classification Noise.
We establish an information-computation tradeoff suggesting an inherent gap between the sample complexity of the problem and the sample complexity of computationally efficient algorithms.
arXiv Detail & Related papers (2023-06-28T16:33:39Z)
- Sample Complexity Bounds for Robustly Learning Decision Lists against Evasion Attacks [25.832511407411637]
A fundamental problem in adversarial machine learning is to quantify how much training data is needed in the presence of evasion attacks.
We work with probability distributions on the input data that satisfy a Lipschitz condition: nearby points have similar probability.
For every fixed $k$, the class of $k$-decision lists has polynomial sample complexity against a $\log(n)$-bounded adversary.
arXiv Detail & Related papers (2022-05-12T14:40:18Z)
- Sample-Efficient Reinforcement Learning of Undercomplete POMDPs [91.40308354344505]
This work shows that these hardness barriers do not preclude efficient reinforcement learning for rich and interesting subclasses of Partially Observable Markov Decision Processes (POMDPs).
We present a sample-efficient algorithm, OOM-UCB, for episodic finite undercomplete POMDPs, where the number of observations is larger than the number of latent states and where exploration is essential for learning, thus distinguishing our results from prior works.
arXiv Detail & Related papers (2020-06-22T17:58:54Z)
- Learning Halfspaces with Tsybakov Noise [50.659479930171585]
We study the learnability of halfspaces in the presence of Tsybakov noise.
We give an algorithm that achieves misclassification error $\epsilon$ with respect to the true halfspace.
arXiv Detail & Related papers (2020-06-11T14:25:02Z)
- Proper Learning, Helly Number, and an Optimal SVM Bound [29.35254938542589]
We characterize classes for which the optimal sample complexity can be achieved by a proper learning algorithm.
We show that the dual Helly number is bounded if and only if there is a proper learner with an optimal joint dependence on $\epsilon$ and $\delta$.
arXiv Detail & Related papers (2020-05-24T18:11:57Z)
- Efficiently Learning Adversarially Robust Halfspaces with Noise [69.01459748050453]
We study the problem of learning adversarially robust halfspaces in the distribution-independent setting.
In the realizable setting, we provide necessary and sufficient conditions on the adversarial perturbation sets under which halfspaces are efficiently robustly learnable.
arXiv Detail & Related papers (2020-05-15T17:13:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.