An alternative to SVM Method for Data Classification
- URL: http://arxiv.org/abs/2308.11579v1
- Date: Sun, 20 Aug 2023 14:09:01 GMT
- Title: An alternative to SVM Method for Data Classification
- Authors: Lakhdar Remaki
- Abstract summary: Support vector machine (SVM) is a popular kernel method for data classification.
The method suffers from some weaknesses, including processing time and the risk of failure of the optimization process in high-dimensional cases.
In this paper an alternative method is proposed that achieves similar performance while noticeably improving on the aforementioned shortcomings.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Support vector machine (SVM) is a popular kernel method for data
classification that has demonstrated its efficiency across a large range of
practical applications. The method suffers, however, from several weaknesses,
including processing time, the risk of failure of the optimization process in
high-dimensional cases, generalization to multiple classes, unbalanced classes,
and dynamic classification. In this paper, an alternative method is proposed
that offers similar performance while noticeably improving on the
aforementioned shortcomings. The new method is based on the minimum distance to
optimal subspaces containing the mapped original classes.
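As a rough illustration of the minimum-distance idea (not the paper's construction, which builds optimal subspaces for the kernel-mapped classes), the sketch below fits a plain PCA subspace per class in the input space and assigns a point to the class whose subspace reconstructs it best; all names and parameters are illustrative.

```python
import numpy as np

class SubspaceDistanceClassifier:
    """Toy minimum-distance-to-subspace classifier.

    Fits one low-dimensional PCA subspace per class and labels a point
    by the smallest reconstruction distance to any class subspace.
    """

    def __init__(self, n_components=2):
        self.n_components = n_components
        self.means_ = {}
        self.bases_ = {}

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            mean = Xc.mean(axis=0)
            # Top right-singular vectors span the best-fit subspace.
            _, _, Vt = np.linalg.svd(Xc - mean, full_matrices=False)
            self.means_[label] = mean
            self.bases_[label] = Vt[:self.n_components]
        return self

    def predict(self, X):
        labels = list(self.bases_)
        dists = np.empty((X.shape[0], len(labels)))
        for j, label in enumerate(labels):
            Z = X - self.means_[label]
            # Project onto the class subspace; the residual is the distance.
            proj = Z @ self.bases_[label].T @ self.bases_[label]
            dists[:, j] = np.linalg.norm(Z - proj, axis=1)
        return np.array(labels)[dists.argmin(axis=1)]
```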
Related papers
- Anti-Collapse Loss for Deep Metric Learning Based on Coding Rate Metric [99.19559537966538]
Deep metric learning (DML) aims to learn a discriminative high-dimensional embedding space for downstream tasks like classification, clustering, and retrieval.
To maintain the structure of embedding space and avoid feature collapse, we propose a novel loss function called Anti-Collapse Loss.
Comprehensive experiments on benchmark datasets demonstrate that our proposed method outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2024-07-03T13:44:20Z)
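The abstract does not spell the loss out; as a hedged illustration, the snippet below computes the coding-rate measure from the MCR² line of work, one way to quantify how much volume a batch of embeddings spans, so that penalizing its negative discourages collapsed features. The formula and the `eps` parameter are assumptions drawn from that related literature, not from this paper.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """Coding rate of an embedding batch Z (n samples x d dims):
    R(Z) = 1/2 * logdet(I + d / (n * eps^2) * Z^T Z).
    Larger values mean the batch spans more directions, so adding
    -coding_rate(Z) to a training loss penalizes feature collapse."""
    n, d = Z.shape
    gram = Z.T @ Z  # d x d Gram matrix of the features
    _, logdet = np.linalg.slogdet(np.eye(d) + (d / (n * eps**2)) * gram)
    return 0.5 * logdet
```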
- Smooth Ranking SVM via Cutting-Plane Method [6.946903076677842]
We develop a prototype learning approach that relies on the cutting-plane method, similar to Ranking SVM, to maximize AUC.
Our algorithm learns simpler models by iteratively introducing cutting planes, thus overfitting is prevented in an unconventional way.
Based on the experiments conducted on 73 binary classification datasets, our method yields the best test AUC in 25 datasets among its relevant competitors.
arXiv Detail & Related papers (2024-01-25T18:47:23Z)
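The paper's cutting-plane algorithm is not reproduced here; as a sketch of the objective it targets, the function below evaluates a pairwise hinge surrogate of 1 - AUC, the standard Ranking-SVM-style relaxation. The margin parameter is an illustrative choice.

```python
import numpy as np

def pairwise_auc_hinge(scores_pos, scores_neg, margin=1.0):
    """Hinge surrogate of 1 - AUC over all positive/negative score pairs.

    AUC counts pairs where a positive outscores a negative; the hinge
    penalizes pairs where the positive does not win by at least `margin`.
    """
    diffs = scores_pos[:, None] - scores_neg[None, :]  # all pairs
    return np.maximum(0.0, margin - diffs).mean()
```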
- Multi-class Support Vector Machine with Maximizing Minimum Margin [67.51047882637688]
Support Vector Machine (SVM) is a prominent machine learning technique widely applied in pattern recognition tasks.
We propose a novel method for multi-class SVM that incorporates pairwise class loss considerations and maximizes the minimum margin.
Empirical evaluations demonstrate the effectiveness and superiority of our proposed method over existing multi-classification methods.
arXiv Detail & Related papers (2023-12-11T18:09:55Z)
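The optimization itself is beyond a short snippet, but the quantity being maximized is easy to inspect. The sketch below fits ordinary one-vs-rest linear SVMs with scikit-learn and reports the smallest geometric margin across classes; it is a diagnostic for the max-min-margin idea, not the paper's training method, and it assumes three or more classes.

```python
import numpy as np
from sklearn.svm import LinearSVC

def min_class_margin(X, y):
    """Smallest geometric margin across one-vs-rest linear SVMs.

    A max-min-margin method would push this quantity up; here it is
    only measured after ordinary training.  Assumes >= 3 classes, so
    decision_function returns one column per class.
    """
    clf = LinearSVC().fit(X, y)
    scores = clf.decision_function(X)          # (n_samples, n_classes)
    margins = []
    for k, label in enumerate(clf.classes_):
        yk = np.where(y == label, 1.0, -1.0)   # one-vs-rest labels
        # Signed distance of each sample to the k-th hyperplane.
        margins.append((yk * scores[:, k]).min() / np.linalg.norm(clf.coef_[k]))
    return min(margins)
```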
- Efficient Training of One Class Classification-SVMs [0.0]
This study examines the use of a highly effective training method to conduct one-class classification.
In this paper, an effective algorithm for dual soft-margin one-class SVM training is presented.
arXiv Detail & Related papers (2023-09-28T15:35:16Z)
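For context, the model being trained is the standard dual soft-margin one-class SVM; a minimal scikit-learn usage sketch, with illustrative data and hyperparameters rather than anything from the paper, looks like this:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))                      # "normal" data only
X_test = np.vstack([rng.normal(size=(10, 2)),
                    rng.normal(loc=5.0, size=(10, 2))])  # half are outliers

# nu upper-bounds the training outlier fraction; gamma sets the RBF width.
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_train)
print(ocsvm.predict(X_test))  # +1 = inlier, -1 = outlier
```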
- Majorization-Minimization for sparse SVMs [46.99165837639182]
Support Vector Machines (SVMs) were introduced several decades ago for performing binary classification tasks under a supervised framework.
They often outperform other supervised methods and remain one of the most popular approaches in the machine learning arena.
In this work, we investigate the training of SVMs through a smooth sparse-promoting-regularized squared hinge loss minimization.
arXiv Detail & Related papers (2023-08-31T17:03:16Z)
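The paper's majorization-minimization scheme and its smooth sparsity penalty are not reproduced here; as a hedged sketch of the same flavor of objective, the snippet below minimizes a squared hinge loss with an l1 penalty by plain proximal gradient (ISTA). Step size and penalty weight are illustrative.

```python
import numpy as np

def sparse_squared_hinge_svm(X, y, lam=0.1, step=1e-3, iters=500):
    """ISTA sketch for  mean(max(0, 1 - y * Xw)^2) + lam * ||w||_1.

    y must be in {-1, +1}.  The paper uses majorization-minimization
    with a smooth sparsity term; this is only a same-objective stand-in.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        slack = np.maximum(0.0, 1.0 - y * (X @ w))
        grad = -2.0 / n * X.T @ (y * slack)   # gradient of squared hinge
        w -= step * grad
        # Soft-thresholding: the proximal step for the l1 penalty.
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w
```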
- Numerical Optimizations for Weighted Low-rank Estimation on Language Model [73.12941276331316]
Singular value decomposition (SVD) is one of the most popular compression methods that approximates a target matrix with smaller matrices.
Standard SVD treats the parameters within the matrix with equal importance, which is a simple but unrealistic assumption.
We show that our method can perform better than current SOTA methods in neural-based language models.
arXiv Detail & Related papers (2022-11-02T00:58:02Z)
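As a baseline illustration of the unweighted case the paper improves on, the sketch below compresses a weight matrix by plain truncated SVD; the shapes and rank are illustrative.

```python
import numpy as np

def low_rank_compress(W, rank):
    """Truncated-SVD compression: keep the top `rank` singular triplets.

    Standard SVD weighs all entries equally; weighted variants (as in
    the paper) reweight the reconstruction by parameter importance."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # (m, rank)
    B = Vt[:rank]                # (rank, n)
    return A, B                  # W ~= A @ B, stored far more compactly

W = np.random.randn(768, 3072)
A, B = low_rank_compress(W, rank=64)
print(np.linalg.norm(W - A @ B) / np.linalg.norm(W))  # relative error
```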
- Handling Imbalanced Classification Problems With Support Vector Machines via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms to deal with binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z)
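A minimal sketch of the cost-sensitive ingredient, using scikit-learn's class weighting on a synthetic imbalanced problem; the paper instead searches such costs with evolutionary bilevel optimization, so the `balanced` heuristic here is only a stand-in.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# A 95/5 imbalanced binary problem.
X, y = make_classification(n_samples=1000, weights=[0.95], random_state=0)

# Cost-sensitive SVM: errors on the rare class are penalized more.
# "balanced" sets class weights inversely proportional to frequency;
# an explicit dict such as {0: 1.0, 1: 10.0} also works.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
```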
- A fast learning algorithm for One-Class Slab Support Vector Machines [1.1613446814180841]
This paper proposes a fast training method for One-Class Slab SVMs using an updated Sequential Minimal Optimization (SMO) algorithm.
The results indicate that this training method scales better to large sets of training data than other Quadratic Programming (QP) solvers.
arXiv Detail & Related papers (2020-11-06T09:16:39Z)
- Selective Classification via One-Sided Prediction [54.05407231648068]
A one-sided prediction (OSP) based relaxation yields a selective classification (SC) scheme that attains near-optimal coverage in the practically relevant high-target-accuracy regime.
We theoretically derive generalization bounds for SC and OSP, and empirically show that our scheme strongly outperforms state-of-the-art methods in coverage at small error levels.
arXiv Detail & Related papers (2020-10-15T16:14:27Z)
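The OSP construction itself is not sketched here; for orientation, the snippet below implements the generic selective-classification mechanism (predict only when confident, otherwise abstain) together with the coverage and selective risk it trades off. The threshold is illustrative.

```python
import numpy as np

def selective_predict(probs, threshold=0.9):
    """Abstain-by-confidence selective classification: predict only when
    the top class probability clears `threshold`, else return -1
    (abstain).  OSP trains per-class one-sided predictors instead; this
    is just the generic accept/reject mechanism it improves on."""
    conf = probs.max(axis=1)
    preds = probs.argmax(axis=1)
    return np.where(conf >= threshold, preds, -1)

def coverage_and_risk(preds, y):
    """Fraction of accepted samples, and error rate among them."""
    accepted = preds != -1
    coverage = accepted.mean()
    risk = (preds[accepted] != y[accepted]).mean() if accepted.any() else 0.0
    return coverage, risk
```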
- A novel embedded min-max approach for feature selection in nonlinear support vector machine classification [0.0]
We propose an embedded feature selection method based on a min-max optimization problem.
By leveraging duality theory, we equivalently reformulate the min-max problem and solve it without further ado.
The efficiency and usefulness of our approach are tested on several benchmark data sets.
arXiv Detail & Related papers (2020-04-21T09:40:38Z)
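One common way to embed per-feature relevance in a nonlinear SVM (not necessarily this paper's exact formulation) is an anisotropic RBF kernel with one non-negative weight per feature; driving a weight to zero removes that feature. A minimal numpy sketch:

```python
import numpy as np

def weighted_rbf(X1, X2, w):
    """Anisotropic RBF kernel k(x, x') = exp(-sum_j w_j (x_j - x'_j)^2).

    w is a non-negative weight per feature; w_j == 0 drops feature j,
    which is the kind of scaling an embedded selection method would
    optimize jointly with the SVM."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2 * w).sum(axis=-1)
    return np.exp(-d2)
```

Such a kernel can be plugged into scikit-learn as `SVC(kernel=lambda A, B: weighted_rbf(A, B, w))`, with `w` the vector being tuned.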
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.