A Unified Framework for Multiclass and Multilabel Support Vector
Machines
- URL: http://arxiv.org/abs/2003.11197v1
- Date: Wed, 25 Mar 2020 03:08:41 GMT
- Title: A Unified Framework for Multiclass and Multilabel Support Vector
Machines
- Authors: Hoda Shajari, Anand Rangarajan
- Abstract summary: We propose a straightforward extension to the SVM to cope with multiclass and multilabel classification problems.
Our framework deviates from the conventional soft margin SVM framework with its direct oppositional structure.
Results demonstrate a competitive classifier for both multiclass and multilabel classification problems.
- Score: 6.425654442936364
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel integrated formulation for multiclass and multilabel
support vector machines (SVMs). A number of approaches have been proposed to
extend the original binary SVM to an all-in-one multiclass SVM. However, its
direct extension to a unified multilabel SVM has not been widely investigated.
We propose a straightforward extension to the SVM to cope with multiclass and
multilabel classification problems within a unified framework. Our framework
deviates from the conventional soft margin SVM framework with its direct
oppositional structure. In our formulation, class-specific weight vectors
(normal vectors) are learned by maximizing their margin with respect to an
origin and penalizing patterns when they get too close to this origin. As a
result, each weight vector chooses an orientation and a magnitude with respect
to this origin in such a way that it best represents the patterns belonging to
its corresponding class. Opposition between classes is introduced into the
formulation via the minimization of pairwise inner products of weight vectors.
We also extend our framework to cope with nonlinear separability via standard
reproducing kernel Hilbert spaces (RKHS). Biases which are closely related to
the origin need to be treated properly in both the original feature space and
Hilbert space. We have the flexibility to incorporate constraints into the
formulation (if they better reflect the underlying geometry) and improve the
performance of the classifier. To this end, specifics and technicalities such
as the origin in RKHS are addressed. Results demonstrate a competitive
classifier for both multiclass and multilabel classification problems.
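The abstract's formulation can be illustrated concretely. The sketch below is a hedged, simplified reading of the stated idea (not the paper's actual algorithm): each class gets a weight vector trained with a one-class-style hinge that penalizes class patterns falling too close to the origin, and opposition between classes is added by penalizing the pairwise inner products of the weight vectors. All function names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def unified_svm_train(X, Y, lam=0.1, C=1.0, lr=0.01, epochs=200):
    """Hedged sketch of the described formulation: one weight vector
    per class, a hinge that penalizes class patterns too close to the
    origin, and a penalty on pairwise inner products <w_c, w_c'>
    that introduces opposition between classes.
    X: (n, d) patterns; Y: (n, k) binary indicator matrix. Rows of Y
    may contain several 1s, so multilabel data is handled identically."""
    n, d = X.shape
    k = Y.shape[1]
    rng = np.random.default_rng(0)
    W = rng.standard_normal((k, d)) * 0.01
    for _ in range(epochs):
        scores = X @ W.T                      # (n, k), entry = <w_c, x_i>
        viol = (scores < 1.0) & (Y == 1)      # active hinge constraints
        grad = np.empty_like(W)
        for c in range(k):
            grad[c] = (W[c]                               # margin (norm) term
                       - C * X[viol[:, c]].sum(axis=0)    # hinge pull away from origin
                       + lam * (W.sum(axis=0) - W[c]))    # opposition to other classes
        W -= lr * grad
    return W

def predict_multiclass(W, X):
    # assign each pattern to the class whose weight vector scores highest
    return np.argmax(X @ W.T, axis=1)
```

For multilabel prediction one would instead threshold the scores per class; the kernelized (RKHS) version and the careful treatment of biases discussed in the abstract are beyond this toy sketch.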
Related papers
- MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence [97.93517982908007]
In cross-domain few-shot classification, NCC aims to learn representations to construct a metric space where few-shot classification can be performed.
In this paper, we find that there exist high similarities between NCC-learned representations of two samples from different classes.
We propose a bi-level optimization framework, Maximizing Optimized Kernel Dependence (MOKD), to learn a set of class-specific representations that match the cluster structures indicated by labeled data.
arXiv Detail & Related papers (2024-05-29T05:59:52Z) - Tverberg's theorem and multi-class support vector machines [0.0]
We show how to design new models of multi-class support vector machines (SVMs).
These models require fewer conditions to classify sets of points, and can be computed using existing binary SVM algorithms in higher-dimensional spaces.
We give a new simple proof of a geometric characterization of support vectors for largest margin SVMs by Veelaert.
arXiv Detail & Related papers (2024-04-25T16:37:58Z) - Multi-class Support Vector Machine with Maximizing Minimum Margin [67.51047882637688]
Support Vector Machine (SVM) is a prominent machine learning technique widely applied in pattern recognition tasks.
We propose a novel method for multi-class SVM that incorporates pairwise class loss considerations and maximizes the minimum margin.
Empirical evaluations demonstrate the effectiveness and superiority of our proposed method over existing multi-classification methods.
arXiv Detail & Related papers (2023-12-11T18:09:55Z) - Handling Imbalanced Classification Problems With Support Vector Machines
via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms to deal with binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z) - Generalization Error Bounds for Multiclass Sparse Linear Classifiers [7.360807642941714]
We consider high-dimensional multiclass classification by sparse multinomial logistic regression.
We propose a computationally feasible feature selection procedure based on penalized maximum likelihood.
In particular, we consider global sparsity, double row-wise sparsity, and low-rank sparsity.
arXiv Detail & Related papers (2022-04-13T09:25:03Z) - High-Dimensional Sparse Bayesian Learning without Covariance Matrices [66.60078365202867]
We introduce a new inference scheme that avoids explicit construction of the covariance matrix.
Our approach couples a little-known diagonal estimation result from numerical linear algebra with the conjugate gradient algorithm.
On several simulations, our method scales better than existing approaches in computation time and memory.
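The specific diagonal estimation result the paper couples with conjugate gradients is not given here, but a common matvec-only diagonal estimator in the same spirit is the stochastic probe trick: for Rademacher vectors z, the expectation of z * (A z) recovers diag(A). This is an illustrative stand-in, not necessarily the paper's exact scheme.

```python
import numpy as np

def diagonal_estimate(matvec, n, num_probes=100, rng=None):
    """Estimate diag(A) using only matrix-vector products with A:
    for Rademacher probes z (entries +/-1), E[z * (A z)] = diag(A).
    Replacing matvec with a conjugate-gradient solve for A^{-1} z
    yields diagonals of an inverse without forming any matrix."""
    rng = np.random.default_rng(0) if rng is None else rng
    est = np.zeros(n)
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        est += z * matvec(z)
    return est / num_probes
```

Because only matvecs are needed, memory stays O(n) regardless of how A is represented, which matches the scaling advantage claimed in the summary.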
arXiv Detail & Related papers (2022-02-25T16:35:26Z) - Gated recurrent units and temporal convolutional network for multilabel
classification [122.84638446560663]
This work proposes a new ensemble method for managing multilabel classification.
The core of the proposed approach combines a set of gated recurrent units and temporal convolutional neural networks trained with variants of the Adam gradients optimization approach.
arXiv Detail & Related papers (2021-10-09T00:00:16Z) - Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
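Frame Averaging averages a backbone network over an input-dependent frame, which is often far smaller than the full symmetry group. As a minimal illustration of the underlying idea (the degenerate case where the frame is the entire group, not the paper's frame construction), the sketch below symmetrizes a function over the four planar 90-degree rotations:

```python
import numpy as np

def group_average(f, X, group):
    """Symmetrize f by averaging its outputs over a group of input
    transformations; the averaged function is exactly invariant to
    every transformation in the group."""
    return np.mean([f(g(X)) for g in group], axis=0)

def rotation(theta):
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return lambda X: X @ R.T

# the four 90-degree planar rotations (the cyclic group C4)
c4 = [rotation(k * np.pi / 2) for k in range(4)]
```

Averaging over a well-chosen frame instead of the whole group is what makes the construction tractable for large or continuous symmetry groups such as Euclidean motions.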
arXiv Detail & Related papers (2021-10-07T11:05:23Z) - A fast learning algorithm for One-Class Slab Support Vector Machines [1.1613446814180841]
This paper proposes a fast training method for One-Class Slab SVMs using an updated Sequential Minimal Optimization (SMO) algorithm.
The results indicate that this training method scales better to large sets of training data than other Quadratic Programming (QP) solvers.
arXiv Detail & Related papers (2020-11-06T09:16:39Z) - Nonlinear classifiers for ranking problems based on kernelized SVM [0.0]
Many classification problems focus on maximizing the performance only on the samples with the highest relevance instead of all samples.
In this paper, we derive a general framework including several classes of these linear classification problems.
We dualize the problems, add kernels and propose a componentwise dual ascent method.
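The paper's componentwise dual ascent method is not specified here, but the general recipe (dualize, kernelize, update one dual variable at a time) can be sketched with the standard bias-free kernel-SVM dual, where each coordinate admits a closed-form clipped Newton update. This is a generic illustration, not the paper's ranking-specific formulation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # pairwise squared distances via broadcasting, then Gaussian kernel
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def dual_coordinate_ascent_svm(K, y, C=1.0, sweeps=50):
    """Componentwise ascent on the bias-free kernel-SVM dual:
    maximize sum_i a_i - 1/2 sum_ij a_i a_j y_i y_j K_ij
    subject to 0 <= a_i <= C. Each coordinate update is a clipped
    Newton step, so no quadratic-programming solver is needed."""
    n = len(y)
    a = np.zeros(n)
    for _ in range(sweeps):
        for i in range(n):
            g = 1.0 - y[i] * (K[i] @ (a * y))   # dual gradient w.r.t. a_i
            a[i] = np.clip(a[i] + g / K[i, i], 0.0, C)
    return a

def decision_values(K_test_train, a, y):
    # f(x) = sum_j a_j y_j K(x_j, x)
    return K_test_train @ (a * y)
```

Working in the dual is what makes adding kernels immediate: the data enter only through the Gram matrix K, so swapping the kernel changes nothing else in the algorithm.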
arXiv Detail & Related papers (2020-02-26T12:37:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.