Robust Classification via Support Vector Machines
- URL: http://arxiv.org/abs/2104.13458v1
- Date: Tue, 27 Apr 2021 20:20:12 GMT
- Title: Robust Classification via Support Vector Machines
- Authors: Vali Asimit, Ioannis Kyriakou, Simone Santoni, Salvatore Scognamiglio
and Rui Zhu
- Abstract summary: We propose two robust classifiers under data uncertainty.
The first is called Single Perturbation SVM (SP-SVM) and provides a constructive method by allowing a controlled perturbation to one feature of the data.
The second method is called Extreme Empirical Loss SVM (EEL-SVM) and is based on a new empirical loss estimate, namely, the Extreme Empirical Loss (EEL).
- Score: 1.7520660701924717
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: The choice of loss function for any Support Vector Machine classifier has
attracted great interest in the literature due to the lack of robustness of the Hinge
loss, which is the standard choice. In this paper, we robustify
the binary classifier while retaining the overall advantages of the Hinge loss,
rather than replacing this standard choice. We propose two robust classifiers
under data uncertainty. The first is called Single Perturbation SVM (SP-SVM)
and provides a constructive method by allowing a controlled perturbation to one
feature of the data. The second method is called Extreme Empirical Loss SVM
(EEL-SVM) and is based on a new empirical loss estimate, namely, the Extreme
Empirical Loss (EEL), that puts more emphasis on extreme violations of the
classification hyper-plane, rather than taking the usual sample average with
equal importance for all hyper-plane violations. Extensive numerical
investigation reveals the advantages of the two robust classifiers on simulated
data and well-known real datasets.
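The EEL idea can be illustrated with a minimal sketch. This is not the paper's exact formulation: it assumes EEL is computed by averaging only the k largest hinge violations (a top-k / superquantile-style average) rather than the usual sample mean; the function names, the choice of k, and the toy data are illustrative.

```python
import numpy as np

def hinge_losses(w, b, X, y):
    """Standard hinge loss max(0, 1 - y * (w.x + b)) for each sample."""
    margins = y * (X @ w + b)
    return np.maximum(0.0, 1.0 - margins)

def eel_estimate(losses, k):
    """EEL sketch: average only the k largest hinge violations,
    instead of the equal-weight sample average over all n of them."""
    worst = np.sort(losses)[-k:]
    return worst.mean()

# Toy data: two well-separated Gaussian clusters, labels +1 / -1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 0.5, (20, 2)), rng.normal(-2, 0.5, (20, 2))])
y = np.r_[np.ones(20), -np.ones(20)]
w, b = np.array([1.0, 1.0]), 0.0

losses = hinge_losses(w, b, X, y)
mean_loss = losses.mean()
eel = eel_estimate(losses, k=5)
print(f"sample mean hinge loss: {mean_loss:.4f}, EEL (k=5): {eel:.4f}")
# The top-k average is never below the overall mean, so EEL penalizes
# a classifier more heavily for its worst hyper-plane violations.
```

Minimizing an objective of this shape pushes the hyper-plane to reduce its most extreme violations first, which is the robustness mechanism the abstract describes.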
Related papers
- Intuitionistic Fuzzy Universum Twin Support Vector Machine for Imbalanced Data [0.0]
One of the major difficulties in machine learning is classifying imbalanced datasets.
We propose intuitionistic fuzzy universum twin support vector machines for imbalanced data (IFUTSVM-ID)
We use an intuitionistic fuzzy membership scheme to mitigate the impact of noise and outliers.
arXiv Detail & Related papers (2024-10-27T04:25:42Z)
- Loss-Free Machine Unlearning [51.34904967046097]
We present a machine unlearning approach that is both retraining- and label-free.
Retraining-free approaches often utilise Fisher information, which is derived from the loss and requires labelled data which may not be available.
We present an extension to the Selective Synaptic Dampening algorithm, substituting the diagonal of the Fisher information matrix for the gradient of the l2 norm of the model output to approximate sensitivity.
arXiv Detail & Related papers (2024-02-29T16:15:34Z)
- Projection based fuzzy least squares twin support vector machine for class imbalance problems [0.9668407688201361]
We propose a novel fuzzy-based approach to deal with class-imbalanced as well as noisy datasets.
The proposed algorithms are evaluated on several benchmark and synthetic datasets.
arXiv Detail & Related papers (2023-09-27T14:28:48Z)
- Characterizing the Optimal 0-1 Loss for Multi-class Classification with a Test-time Attacker [57.49330031751386]
We find achievable information-theoretic lower bounds on loss in the presence of a test-time attacker for multi-class classifiers on any discrete dataset.
We provide a general framework for finding the optimal 0-1 loss that revolves around the construction of a conflict hypergraph from the data and adversarial constraints.
arXiv Detail & Related papers (2023-02-21T15:17:13Z)
- Label Distributionally Robust Losses for Multi-class Classification: Consistency, Robustness and Adaptivity [55.29408396918968]
We study a family of loss functions named label-distributionally robust (LDR) losses for multi-class classification.
Our contributions address both consistency and robustness, establishing top-$k$ consistency of LDR losses for multi-class classification.
We propose a new adaptive LDR loss that automatically adapts the individualized temperature parameter to the noise degree of class label of each instance.
arXiv Detail & Related papers (2021-12-30T00:27:30Z)
- Risk Minimization from Adaptively Collected Data: Guarantees for Supervised and Policy Learning [57.88785630755165]
Empirical risk minimization (ERM) is the workhorse of machine learning, but its model-agnostic guarantees can fail when we use adaptively collected data.
We study a generic importance sampling weighted ERM algorithm for using adaptively collected data to minimize the average of a loss function over a hypothesis class.
For policy learning, we provide rate-optimal regret guarantees that close an open gap in the existing literature whenever exploration decays to zero.
arXiv Detail & Related papers (2021-06-03T09:50:13Z)
- Weighted Least Squares Twin Support Vector Machine with Fuzzy Rough Set Theory for Imbalanced Data Classification [0.483420384410068]
Support vector machines (SVMs) are powerful supervised learning tools developed to solve classification problems.
We propose an approach, called FRLSTSVM, that efficiently uses fuzzy rough set theory in a weighted least squares twin support vector machine for classification of imbalanced data.
arXiv Detail & Related papers (2021-05-03T22:33:39Z)
- Robustifying Binary Classification to Adversarial Perturbation [45.347651499585055]
In this paper we consider the problem of binary classification with adversarial perturbations.
We introduce a generalization to the max-margin classifier which takes into account the power of the adversary in manipulating the data.
Under some mild assumptions on the loss function, we theoretically show that gradient descent converges to the RM classifier in direction.
arXiv Detail & Related papers (2020-10-29T07:20:37Z)
- Provable tradeoffs in adversarially robust classification [96.48180210364893]
We develop and leverage new tools, including recent breakthroughs from probability theory on robust isoperimetry.
Our results reveal fundamental tradeoffs between standard and robust accuracy that grow when data is imbalanced.
arXiv Detail & Related papers (2020-06-09T09:58:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.