Efficiently Learning Adversarially Robust Halfspaces with Noise
- URL: http://arxiv.org/abs/2005.07652v1
- Date: Fri, 15 May 2020 17:13:54 GMT
- Title: Efficiently Learning Adversarially Robust Halfspaces with Noise
- Authors: Omar Montasser, Surbhi Goel, Ilias Diakonikolas, Nathan Srebro
- Abstract summary: We study the problem of learning adversarially robust halfspaces in the distribution-independent setting.
In the realizable setting, we provide necessary and sufficient conditions on the adversarial perturbation sets under which halfspaces are efficiently robustly learnable.
- Score: 69.01459748050453
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the problem of learning adversarially robust halfspaces in the
distribution-independent setting. In the realizable setting, we provide
necessary and sufficient conditions on the adversarial perturbation sets under
which halfspaces are efficiently robustly learnable. In the presence of random
label noise, we give a simple computationally efficient algorithm for this
problem with respect to any $\ell_p$-perturbation.
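The abstract states the result without the algorithmic details, but the structural fact underlying $\ell_p$-robust halfspace learning is standard: an adversary with $\ell_p$ budget $r$ can lower the margin $y\langle w, x\rangle$ by at most $r\|w\|_q$, where $1/p + 1/q = 1$. The minimal sketch below minimizes the resulting convex robust hinge surrogate for $p = 2$ by subgradient descent; all function names are ours, and this illustrates the robust-margin structure rather than the paper's noise-tolerant algorithm.

```python
import numpy as np

def robust_hinge_loss(w, X, y, r):
    """Worst-case hinge loss of sign(<w, x>) under l_2 perturbations of
    radius r: an l_2 adversary can lower the margin y * <w, x> by at most
    r * ||w||_2 (dual-norm fact; for general l_p, use the dual norm l_q)."""
    margins = y * (X @ w) - r * np.linalg.norm(w)
    return np.maximum(0.0, 1.0 - margins).mean()

def learn_robust_halfspace(X, y, r, lr=0.05, steps=2000, seed=0):
    """Subgradient descent on the convex robust hinge surrogate.
    Illustrative sketch only, not the paper's algorithm."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(size=d) / np.sqrt(d)
    for _ in range(steps):
        margins = y * (X @ w) - r * np.linalg.norm(w)
        active = margins < 1.0  # examples with nonzero hinge loss
        if not active.any():
            break  # every example is robustly classified with margin
        # Subgradient of (1/n) * sum_{i active} [1 - y_i <w, x_i> + r ||w||_2]
        grad = (-(y[active, None] * X[active]).sum(axis=0)
                + active.sum() * r * w / (np.linalg.norm(w) + 1e-12)) / n
        w -= lr * grad
    return w
```

Under random label noise the plain hinge surrogate is not in general guaranteed to succeed, which is exactly the gap the paper's noise-tolerant algorithm addresses.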
Related papers
- Efficient PAC Learning of Halfspaces with Constant Malicious Noise Rate [14.461730902968862]
We study the problem of computationally efficient PAC learning of halfspaces in the presence of malicious noise.
We show that it is possible to achieve constant noise tolerance by minimizing a reweighted hinge loss, as sketched below.
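The abstract names the technique (a reweighted hinge loss) but not the weighting rule. The sketch below is a hypothetical version in which examples whose hinge loss sits far above the median are treated as likely corruptions and zeroed out; this rule and all names are our illustration, not the paper's scheme.

```python
import numpy as np

def downweight_outliers(w, X, y, tau=3.0):
    """Hypothetical reweighting rule (not the paper's): zero out examples
    whose hinge loss is far above the median, as likely malicious points."""
    losses = np.maximum(0.0, 1.0 - y * (X @ w))
    cutoff = tau * max(np.median(losses), 1e-12)
    return np.where(losses > cutoff, 0.0, 1.0)

def reweighted_hinge_step(w, X, y, weights, lr=0.05):
    """One subgradient step on the weighted hinge loss
    (1 / sum(weights)) * sum_i weights[i] * max(0, 1 - y_i <w, x_i>)."""
    margins = y * (X @ w)
    active = margins < 1.0
    grad = -(weights[active, None] * y[active, None] * X[active]).sum(axis=0)
    return w - lr * grad / max(weights.sum(), 1e-12)
```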
arXiv Detail & Related papers (2024-10-02T02:38:33Z)
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from this assumption can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms attain a small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds that depend on the confidence level.
We propose novel stepsize rules for two methods with gradient clipping; a generic clipped-gradient step is sketched below.
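Gradient clipping itself is a standard device for heavy-tailed gradient noise; the paper's contribution is the stepsize rules, which are not reproduced here. A generic clipped step looks like the sketch below, where the stepsize schedule argument is a placeholder of ours, not the proposed rules.

```python
import numpy as np

def clipped_sgd(grad_fn, x0, clip_level, stepsize, steps, seed=0):
    """Generic clipped (sub)gradient method: each stochastic gradient is
    rescaled to norm <= clip_level before the step, which tames heavy-tailed
    gradient noise. `stepsize(t)` is a placeholder schedule, not the
    paper's proposed rules."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for t in range(steps):
        g = grad_fn(x, rng)              # stochastic (sub)gradient oracle
        norm = np.linalg.norm(g)
        if norm > clip_level:
            g = g * (clip_level / norm)  # clip: project onto the norm ball
        x = x - stepsize(t) * g
    return x
```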
arXiv Detail & Related papers (2021-06-10T17:54:21Z)
- The Complexity of Adversarially Robust Proper Learning of Halfspaces with Agnostic Noise [67.27523616312428]
We study the computational complexity of adversarially robust proper learning of halfspaces in the distribution-independent PAC model.
We give a computationally efficient learning algorithm and a nearly matching computational hardness result for this problem.
arXiv Detail & Related papers (2020-07-30T04:18:51Z)
- Learning Halfspaces with Tsybakov Noise [50.659479930171585]
We study the learnability of halfspaces in the presence of Tsybakov noise.
We give an algorithm that achieves misclassification error $\epsilon$ with respect to the true halfspace.
arXiv Detail & Related papers (2020-06-11T14:25:02Z)
- Attribute-Efficient Learning of Halfspaces with Malicious Noise: Near-Optimal Label Complexity and Noise Tolerance [21.76197397540397]
This paper is concerned with computationally efficient learning of homogeneous sparse halfspaces in $\mathbb{R}^d$ under noise.
We show that the sample complexity is $\tilde{O}\big(\frac{1}{\epsilon} s^2 \log^5 d\big)$, which also enjoys attribute efficiency.
arXiv Detail & Related papers (2020-06-06T04:57:39Z)
- Learning Halfspaces with Massart Noise Under Structured Distributions [46.37328271312332]
We study the problem of learning halfspaces with Massart noise in the distribution-specific PAC model.
We give the first efficient algorithm for this problem with respect to a broad family of distributions, including log-concave distributions.
arXiv Detail & Related papers (2020-02-13T17:02:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.