Multi-class Support Vector Machine with Maximizing Minimum Margin
- URL: http://arxiv.org/abs/2312.06578v2
- Date: Fri, 15 Dec 2023 02:50:34 GMT
- Title: Multi-class Support Vector Machine with Maximizing Minimum Margin
- Authors: Feiping Nie, Zhezheng Hao, Rong Wang
- Abstract summary: Support Vector Machine (SVM) is a prominent machine learning technique widely applied in pattern recognition tasks.
We propose a novel method for multi-class SVM that incorporates pairwise class loss considerations and maximizes the minimum margin.
Empirical evaluations demonstrate the effectiveness and superiority of our proposed method over existing multi-classification methods.
- Score: 67.51047882637688
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Support Vector Machine (SVM) stands out as a prominent machine learning
technique widely applied in practical pattern recognition tasks. It achieves
binary classification by maximizing the "margin", which represents the minimum
distance between instances and the decision boundary. Although many efforts
have been dedicated to extending SVM to the multi-class case through strategies
such as one-versus-one and one-versus-rest, satisfactory solutions remain
to be developed. In this paper, we propose a novel method for multi-class SVM
that incorporates pairwise class loss considerations and maximizes the minimum
margin. Adhering to this concept, we embrace a new formulation that imparts
heightened flexibility to multi-class SVM. Furthermore, the correlations
between the proposed method and multiple forms of multi-class SVM are analyzed.
The proposed regularizer, akin to the concept of "margin", can serve as a
seamless enhancement over the softmax in deep learning, providing guidance for
network parameter learning. Empirical evaluations demonstrate the effectiveness
and superiority of our proposed method over existing multi-classification
methods. Code is available at https://github.com/zz-haooo/M3SVM.
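As a rough illustration of the idea, the sketch below adds a pairwise minimum-margin penalty on top of standard softmax cross-entropy in PyTorch. It is only a minimal sketch of the "maximize the minimum margin" intuition, not the paper's actual formulation: the function names, the hinge threshold of 1, and the weight lam are illustrative assumptions, and the exact regularizer should be taken from the linked repository.

```python
import torch
import torch.nn.functional as F

def pairwise_min_margin_penalty(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch: penalize samples whose smallest pairwise margin is small.

    The "margin" between the true class y and a rival class k is taken here as
    logit_y - logit_k; hinging on the smallest of these margins loosely mirrors
    the "maximize the minimum margin" idea. This is NOT the paper's exact
    regularizer -- see https://github.com/zz-haooo/M3SVM for that.
    """
    c = logits.shape[1]
    true_scores = logits.gather(1, labels.unsqueeze(1))      # (n, 1) score of the true class
    margins = true_scores - logits                           # (n, c), zero at the true class
    mask = F.one_hot(labels, c).bool()                       # mask out the true class itself
    rival_margins = margins.masked_fill(mask, float("inf"))
    min_margin = rival_margins.min(dim=1).values             # worst pairwise margin per sample
    return F.relu(1.0 - min_margin).mean()                   # hinge: penalize margins below 1

def loss_with_margin_regularizer(logits, labels, lam: float = 0.1):
    """Softmax cross-entropy plus the margin-style penalty, as a drop-in loss."""
    return F.cross_entropy(logits, labels) + lam * pairwise_min_margin_penalty(logits, labels)
```

In a training loop this would simply replace plain cross-entropy, e.g. loss = loss_with_margin_regularizer(model(x), y).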
Related papers
- Majorization-Minimization for sparse SVMs [46.99165837639182]
Support Vector Machines (SVMs) were introduced for performing binary classification tasks, under a supervised framework, several decades ago.
They often outperform other supervised methods and remain one of the most popular approaches in the machine learning arena.
In this work, we investigate the training of SVMs through a smooth sparse-promoting-regularized squared hinge loss minimization.
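For intuition only, here is a minimal NumPy sketch of such an objective: a squared hinge loss combined with a smooth sparsity-promoting surrogate. The particular surrogate used below (w^2 / (w^2 + eps)) and the parameter names are assumptions for illustration; the cited paper's regularizer and its majorization-minimization solver are not reproduced.

```python
import numpy as np

def sparse_squared_hinge_objective(w, X, y, lam=0.1, eps=1e-3):
    """Illustrative objective only: squared hinge loss plus a smooth
    sparsity-promoting surrogate for the l0 "norm" of the weights.

    X: (n, d) features, y: (n,) labels in {-1, +1}, w: (d,) weight vector.
    """
    margins = y * (X @ w)
    squared_hinge = np.mean(np.maximum(0.0, 1.0 - margins) ** 2)
    sparsity = np.sum(w**2 / (w**2 + eps))   # smooth approximation of the number of nonzeros
    return squared_hinge + lam * sparsity
```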
arXiv Detail & Related papers (2023-08-31T17:03:16Z)
- An alternative to SVM Method for Data Classification [0.0]
Support vector machine (SVM) is a popular kernel method for data classification.
The method suffers from some weaknesses, including long processing time and the risk of failure of the optimization process in high-dimensional cases.
In this paper, an alternative method with similar performance is proposed, offering a noticeable improvement on the aforementioned shortcomings.
arXiv Detail & Related papers (2023-08-20T14:09:01Z)
- Enhancing Pattern Classification in Support Vector Machines through Matrix Formulation [0.0]
The reliance on vector-based formulations in existing SVM-based models poses limitations regarding flexibility and ease of incorporating additional terms to handle specific challenges.
We introduce a matrix formulation for SVM that effectively addresses these constraints.
Experimental evaluations on multilabel and multiclass datasets demonstrate that Matrix SVM achieves superior time efficiency.
arXiv Detail & Related papers (2023-07-18T15:56:39Z)
- Multi-View Class Incremental Learning [57.14644913531313]
Multi-view learning (MVL) has gained great success in integrating information from multiple perspectives of a dataset to improve downstream task performance.
This paper investigates a novel paradigm called multi-view class incremental learning (MVCIL), where a single model incrementally classifies new classes from a continual stream of views.
arXiv Detail & Related papers (2023-06-16T08:13:41Z)
- Learning with Multiclass AUC: Theory and Algorithms [141.63211412386283]
Area under the ROC curve (AUC) is a well-known ranking metric for problems such as imbalanced learning and recommender systems.
In this paper, we start an early trial to consider the problem of learning multiclass scoring functions via optimizing multiclass AUC metrics.
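As an aside on what the metric measures (not on the cited paper's AUC-optimizing training algorithm), scikit-learn can already evaluate macro-averaged multiclass AUC under the one-vs-one and one-vs-rest schemes; the toy data and classifier below are purely illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Toy multiclass problem; the classifier choice is incidental here.
X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

probs = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)

# Macro-averaged multiclass AUC under the one-vs-one and one-vs-rest schemes.
print("OvO AUC:", roc_auc_score(y_te, probs, multi_class="ovo", average="macro"))
print("OvR AUC:", roc_auc_score(y_te, probs, multi_class="ovr", average="macro"))
```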
arXiv Detail & Related papers (2021-07-28T05:18:10Z)
- Estimating Average Treatment Effects with Support Vector Machines [77.34726150561087]
Support vector machine (SVM) is one of the most popular classification algorithms in the machine learning literature.
We adapt SVM as a kernel-based weighting procedure that minimizes the maximum mean discrepancy between the treatment and control groups.
We characterize the bias of causal effect estimation arising from this trade-off, connecting the proposed SVM procedure to the existing kernel balancing methods.
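To make the quantity being minimized concrete, here is a minimal NumPy sketch of a biased estimate of the squared maximum mean discrepancy (MMD) between treatment and control samples under an RBF kernel; the kernel choice and bandwidth gamma are illustrative assumptions, and the paper's SVM-based weighting procedure itself is not reproduced.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq_dists)

def mmd_squared(X_treat, X_control, gamma=1.0):
    """Biased estimate of the squared maximum mean discrepancy between
    the treatment and control samples under an RBF kernel."""
    k_tt = rbf_kernel(X_treat, X_treat, gamma).mean()
    k_cc = rbf_kernel(X_control, X_control, gamma).mean()
    k_tc = rbf_kernel(X_treat, X_control, gamma).mean()
    return k_tt + k_cc - 2.0 * k_tc
```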
arXiv Detail & Related papers (2021-02-23T20:22:56Z)
- A fast learning algorithm for One-Class Slab Support Vector Machines [1.1613446814180841]
This paper proposes a fast training method for One-Class Slab SVMs using an updated Sequential Minimal Optimization (SMO).
The results indicate that this training method scales better to large sets of training data than other Quadratic Programming (QP) solvers.
arXiv Detail & Related papers (2020-11-06T09:16:39Z)
- Embedded Deep Bilinear Interactive Information and Selective Fusion for Multi-view Learning [70.67092105994598]
We propose a novel multi-view learning framework designed to improve multi-view classification.
In particular, we train different deep neural networks to learn various intra-view representations.
Experiments on six publicly available datasets demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2020-07-13T01:13:23Z)
- A Unified Framework for Multiclass and Multilabel Support Vector Machines [6.425654442936364]
We propose a straightforward extension to the SVM to cope with multiclass and multilabel classification problems.
Our framework deviates from the conventional soft margin SVM framework with its direct oppositional structure.
Results demonstrate a competitive classifier for both multiclass and multilabel classification problems.
arXiv Detail & Related papers (2020-03-25T03:08:41Z)