Robust Support Vector Machines for Imbalanced and Noisy Data via Benders Decomposition
- URL: http://arxiv.org/abs/2503.14873v1
- Date: Wed, 19 Mar 2025 04:03:39 GMT
- Title: Robust Support Vector Machines for Imbalanced and Noisy Data via Benders Decomposition
- Authors: Seyed Mojtaba Mohasel, Hamidreza Koosha
- Abstract summary: This study introduces a novel formulation to enhance Support Vector Machines (SVMs) in handling class imbalance and noise. Unlike the conventional Soft Margin SVM, which penalizes the magnitude of constraint violations, the proposed model quantifies the number of violations and aims to minimize their frequency.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This study introduces a novel formulation to enhance Support Vector Machines (SVMs) in handling class imbalance and noise. Unlike the conventional Soft Margin SVM, which penalizes the magnitude of constraint violations, the proposed model quantifies the number of violations and aims to minimize their frequency. To achieve this, a binary variable is incorporated into the objective function of the primal SVM formulation, replacing the traditional slack variable. Furthermore, each misclassified sample is assigned a priority and an associated constraint. The resulting formulation is a mixed-integer programming model, efficiently solved using Benders decomposition. The proposed model's performance was benchmarked against existing models, including Soft Margin SVM, weighted SVM, and NuSVC. Two primary hypotheses were examined: 1) The proposed model improves the F1-score for the minority class in imbalanced classification tasks. 2) The proposed model enhances classification accuracy in noisy datasets. These hypotheses were evaluated using a Wilcoxon test across multiple publicly available datasets from the OpenML repository. The results supported both hypotheses (p < 0.05). In addition, the proposed model exhibited several interesting properties, such as improved robustness to noise, a decision boundary shift favoring the minority class, a reduced number of support vectors, and decreased prediction time. The open-source Python implementation of the proposed SVM model is available.
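The abstract's core modeling idea can be made concrete with a short sketch. The snippet below is a minimal illustration, not the authors' released implementation: it encodes a per-sample binary violation indicator through a hypothetical big-M linking constraint and hands the resulting mixed-integer quadratic program to a generic solver, whereas the paper solves its formulation efficiently with Benders decomposition. The function name, the big-M constant, and the solver choice are assumptions made for illustration only.

```python
# Minimal sketch (assumed formulation, not the paper's released code) of an SVM that
# penalizes the COUNT of margin violations: z_i = 1 iff sample i violates y_i(w.x_i + b) >= 1.
import cvxpy as cp
import numpy as np

def count_violation_svm(X, y, C=1.0, big_m=100.0):
    """X: (n, d) array; y: (n,) labels in {-1, +1}. big_m is a hypothetical big-M constant."""
    n, d = X.shape
    w = cp.Variable(d)
    b = cp.Variable()
    z = cp.Variable(n, boolean=True)  # binary violation indicators, replacing the slack variables

    # If z_i = 0, the usual margin constraint must hold exactly;
    # if z_i = 1, the constraint is relaxed by big_m and the sample may be misclassified.
    constraints = [cp.multiply(y, X @ w + b) >= 1 - big_m * z]

    # Margin maximization plus a penalty on the NUMBER of violations (not their magnitude).
    problem = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(z)), constraints)
    problem.solve()  # requires a mixed-integer-capable solver, e.g. solver=cp.GUROBI or cp.SCIP
    return w.value, b.value, z.value
```

Solving this monolithic mixed-integer quadratic program quickly becomes expensive as the number of samples grows, which is why the paper decomposes it via Benders; the sketch only shows the shape of the formulation. A companion sketch of the evaluation protocol (baseline SVMs compared on minority-class F1 with a Wilcoxon test) appears after the related-papers list below.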
Related papers
- EquiTabPFN: A Target-Permutation Equivariant Prior Fitted Networks [55.214444066134114]
In this study, we identify this oversight as a source of incompressible error, termed the equivariance gap, which introduces instability in predictions.
To mitigate these issues, we propose a novel model designed to preserve equivariance across output dimensions.
arXiv Detail & Related papers (2025-02-10T17:11:20Z)
- Enhancing Imbalance Learning: A Novel Slack-Factor Fuzzy SVM Approach [0.0]
Fuzzy support vector machines (FSVMs) address class imbalance by assigning varying fuzzy memberships to samples.
The recently developed slack-factor-based FSVM (SFFSVM) improves traditional FSVMs by using slack factors to adjust fuzzy memberships based on misclassification likelihood.
We propose an improved slack-factor-based FSVM (ISFFSVM) that introduces a novel location parameter.
arXiv Detail & Related papers (2024-11-26T05:47:01Z)
- Projection based fuzzy least squares twin support vector machine for class imbalance problems [0.9668407688201361]
We propose a novel fuzzy-based approach to deal with class-imbalanced as well as noisy datasets.
The proposed algorithms are evaluated on several benchmark and synthetic datasets.
arXiv Detail & Related papers (2023-09-27T14:28:48Z)
- Consensus-Adaptive RANSAC [104.87576373187426]
We propose a new RANSAC framework that learns to explore the parameter space by considering the residuals seen so far via a novel attention layer.
The attention mechanism operates on a batch of point-to-model residuals, and updates a per-point estimation state to take into account the consensus found through a lightweight one-step transformer.
arXiv Detail & Related papers (2023-07-26T08:25:46Z)
- Robust Twin Parametric Margin Support Vector Machine for Multiclass Classification [0.0]
We present novel Twin Parametric Margin Support Vector Machine (TPMSVM) models to tackle the problem of multiclass classification.
We construct bounded-by-norm uncertainty sets around each sample and derive the robust counterpart of deterministic models.
We test the proposed TPMSVM methodology on real-world datasets, showing the good performance of the approach.
arXiv Detail & Related papers (2023-06-09T19:27:24Z)
- Variational Classification [51.2541371924591]
We derive a variational objective to train the model, analogous to the evidence lower bound (ELBO) used to train variational auto-encoders.
Treating inputs to the softmax layer as samples of a latent variable, our abstracted perspective reveals a potential inconsistency.
We induce a chosen latent distribution, instead of the implicit assumption found in a standard softmax layer.
arXiv Detail & Related papers (2023-05-17T17:47:19Z)
- Soft-SVM Regression For Binary Classification [0.0]
We introduce a new exponential family based on a convex relaxation of the hinge loss function using softness and class-separation parameters.
This new family, denoted Soft-SVM, allows us to prescribe a generalized linear model that effectively bridges between logistic regression and SVM classification.
arXiv Detail & Related papers (2022-05-24T03:01:35Z)
- CARMS: Categorical-Antithetic-REINFORCE Multi-Sample Gradient Estimator [60.799183326613395]
We propose an unbiased estimator for categorical random variables based on multiple mutually negatively correlated (jointly antithetic) samples.
CARMS combines REINFORCE with copula based sampling to avoid duplicate samples and reduce its variance, while keeping the estimator unbiased using importance sampling.
We evaluate CARMS on several benchmark datasets on a generative modeling task, as well as a structured output prediction task, and find it to outperform competing methods including a strong self-control baseline.
arXiv Detail & Related papers (2021-10-26T20:14:30Z)
- Weighted Least Squares Twin Support Vector Machine with Fuzzy Rough Set Theory for Imbalanced Data Classification [0.483420384410068]
Support vector machines (SVMs) are powerful supervised learning tools developed to solve classification problems.
We propose an approach that efficiently uses fuzzy rough set theory in a weighted least squares twin support vector machine, called FRLSTSVM, for the classification of imbalanced data.
arXiv Detail & Related papers (2021-05-03T22:33:39Z)
- Robust Finite Mixture Regression for Heterogeneous Targets [70.19798470463378]
We propose an FMR model that finds sample clusters and jointly models multiple incomplete mixed-type targets.
We provide non-asymptotic oracle performance bounds for our model under a high-dimensional learning framework.
The results show that our model can achieve state-of-the-art performance.
arXiv Detail & Related papers (2020-10-12T03:27:07Z)
- Learning Noise-Aware Encoder-Decoder from Noisy Labels by Alternating Back-Propagation for Saliency Detection [54.98042023365694]
We propose a noise-aware encoder-decoder framework to disentangle a clean saliency predictor from noisy training examples.
The proposed model consists of two sub-models parameterized by neural networks.
arXiv Detail & Related papers (2020-07-23T18:47:36Z)
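Several of the related papers above, like the weighted SVM baseline in the abstract, handle imbalance by re-weighting classes or samples, and the abstract's hypotheses are tested with a paired Wilcoxon test over per-dataset scores. The sketch below is a hedged illustration of that evaluation protocol only, not of the proposed Benders-decomposed model or of any related paper's algorithm: it compares a plain soft-margin SVC, a class-weighted SVC, and NuSVC on minority-class F1 across synthetic imbalanced datasets (stand-ins for the OpenML datasets, which the abstract does not name) and applies SciPy's Wilcoxon signed-rank test.

```python
# Hedged sketch of the evaluation protocol described in the abstract: minority-class F1
# across several imbalanced (and slightly noisy) datasets, compared with a paired Wilcoxon
# test. Synthetic datasets stand in for the unnamed OpenML datasets; fuzzy-membership methods
# from the related papers would plug in via SVC.fit(..., sample_weight=memberships).
import numpy as np
from scipy.stats import wilcoxon
from sklearn.datasets import make_classification
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, NuSVC

models = {
    "soft_margin_svc": SVC(kernel="rbf", C=1.0),
    "weighted_svc": SVC(kernel="rbf", C=1.0, class_weight="balanced"),
    "nusvc": NuSVC(nu=0.1),
}
scores = {name: [] for name in models}

for seed in range(10):  # ten synthetic imbalanced datasets with mild label noise
    X, y = make_classification(n_samples=600, n_features=10, weights=[0.9, 0.1],
                               flip_y=0.05, random_state=seed)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              stratify=y, random_state=seed)
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        scores[name].append(f1_score(y_te, model.predict(X_te), pos_label=1))  # class 1 is the minority

# Paired Wilcoxon signed-rank test: does class weighting improve minority-class F1?
stat, p = wilcoxon(scores["weighted_svc"], scores["soft_margin_svc"])
print({k: round(float(np.mean(v)), 3) for k, v in scores.items()}, "p =", p)
```

In the paper, the same paired-test logic is applied to the proposed model against each baseline; here the comparison is between two baselines purely to keep the sketch self-contained and runnable.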