Tverberg's theorem and multi-class support vector machines
- URL: http://arxiv.org/abs/2404.16724v1
- Date: Thu, 25 Apr 2024 16:37:58 GMT
- Title: Tverberg's theorem and multi-class support vector machines
- Authors: Pablo Soberón
- Abstract summary: We show how to design new models of multi-class support vector machines (SVMs).
These protocols require fewer conditions to classify sets of points, and can be computed using existing binary SVM algorithms in higher-dimensional spaces.
We give a new simple proof of a geometric characterization of support vectors for largest margin SVMs by Veelaert.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We show how, using linear-algebraic tools developed to prove Tverberg's theorem in combinatorial geometry, we can design new models of multi-class support vector machines (SVMs). These supervised learning protocols require fewer conditions to classify sets of points, and can be computed using existing binary SVM algorithms in higher-dimensional spaces, including soft-margin SVM algorithms. We describe how the theoretical guarantees of standard support vector machines transfer to these new classes of multi-class support vector machines. We give a new simple proof of a geometric characterization of support vectors for largest margin SVMs by Veelaert.
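The abstract leaves the lifting implicit, but the best-known linear-algebraic tool behind proofs of Tverberg's theorem is Sarkaria's tensor trick. The sketch below shows that classical lifting only; it is an illustration of the kind of construction involved, not necessarily the paper's exact SVM protocol, and the function name is ours.

```python
import numpy as np

def sarkaria_lift(X, labels, k):
    """Sarkaria-style tensor lifting from proofs of Tverberg's theorem.

    A point x in R^d with class label j in {0, ..., k-1} is mapped to
    u_j (tensor) (x, 1) in R^{(k-1)(d+1)}, where u_0, ..., u_{k-1} are
    k vectors in R^{k-1} that sum to zero.  Illustrative only; the
    paper's SVM protocols may lift differently.
    """
    n, d = X.shape
    U = np.hstack([np.eye(k - 1), -np.ones((k - 1, 1))])  # columns sum to 0
    lifted = np.empty((n, (k - 1) * (d + 1)))
    for i in range(n):
        lifted[i] = np.kron(U[:, labels[i]], np.append(X[i], 1.0))
    return lifted

# Toy usage: three classes in the plane lift to R^{2*3} = R^6.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
y = np.array([0, 1, 2])
print(sarkaria_lift(X, y, k=3).shape)  # (3, 6)
```

A binary SVM trained on such lifted points is a computation in R^{(k-1)(d+1)}, which is plausibly how existing binary SVM algorithms in higher-dimensional spaces enter the picture.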
Related papers
- Multi-class Support Vector Machine with Maximizing Minimum Margin [67.51047882637688]
Support Vector Machine (SVM) is a prominent machine learning technique widely applied in pattern recognition tasks.
We propose a novel method for multi-class SVM that incorporates pairwise class loss considerations and maximizes the minimum margin.
Empirical evaluations demonstrate the effectiveness and superiority of the proposed method over existing multi-class classification methods.
arXiv Detail & Related papers (2023-12-11T18:09:55Z)
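As a rough illustration of the min-margin idea in the entry above, the hypothetical helper below computes the smallest pairwise margin among linear per-class scorers; the margin proxy and names are our assumptions, not the paper's formulation.

```python
import numpy as np

def min_pairwise_margin(W):
    """Smallest pairwise margin among linear per-class scorers.

    For scorers f_j(x) = w_j . x, the boundary between classes p and q
    is {x : (w_p - w_q) . x = 0}, and a unit functional margin gives a
    geometric margin of 1 / ||w_p - w_q||.  Hypothetical proxy only.
    """
    k = W.shape[0]
    return min(1.0 / np.linalg.norm(W[p] - W[q])
               for p in range(k) for q in range(p + 1, k))

W = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])  # 3 classes in R^2
print(min_pairwise_margin(W))
```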
- Low-Rank Multitask Learning based on Tensorized SVMs and LSSVMs [65.42104819071444]
Multitask learning (MTL) leverages task-relatedness to enhance performance.
We employ high-order tensors, with each mode corresponding to a task index, to naturally represent tasks referenced by multiple indices.
We propose a general framework of low-rank MTL methods with tensorized support vector machines (SVMs) and least squares support vector machines (LSSVMs).
arXiv Detail & Related papers (2023-08-30T14:28:26Z)
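To make "each mode corresponding to a task index" concrete, the toy sketch below builds per-task weight vectors from a rank-R CP model over two task indices; this is a generic low-rank construction, not the paper's exact tensorized (LS)SVM.

```python
import numpy as np

# Tasks indexed by a pair (i, j); the weight vector for task (i, j) is a
# rank-R CP model: w_{ij} = sum_r A[i, r] * B[j, r] * C[:, r].
# Toy stand-in for a tensorized multitask weight structure.
rng = np.random.default_rng(0)
I, J, d, R = 3, 4, 10, 2
A = rng.normal(size=(I, R))
B = rng.normal(size=(J, R))
C = rng.normal(size=(d, R))
w_12 = np.einsum("r,r,dr->d", A[1], B[2], C)  # weights for task (1, 2)
print(w_12.shape)  # (10,)
```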
- New Equivalences Between Interpolation and SVMs: Kernels and Structured Features [22.231455330003328]
We present a new and flexible analysis framework for proving support vector proliferation (SVP) in an arbitrary reproducing kernel Hilbert space with a flexible class of generative models for the labels.
We show that SVP occurs in many interesting settings not covered by prior work, and we leverage these results to prove novel generalization results for kernel SVM classification.
arXiv Detail & Related papers (2023-05-03T17:52:40Z)
- Handling Imbalanced Classification Problems With Support Vector Machines via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms to deal with binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z)
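The cost-sensitive inner problem of such a bilevel scheme can be sketched with an off-the-shelf weighted SVM. The evolutionary outer loop that EBCS-SVM adds to tune the costs is omitted here, and the class weights below are hand-picked for illustration.

```python
import numpy as np
from sklearn.svm import SVC

# Cost-sensitive inner training run on a heavily imbalanced toy set;
# the per-class weights would normally be chosen by the outer loop.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (95, 2)), rng.normal(2, 1, (5, 2))])
y = np.array([0] * 95 + [1] * 5)

clf = SVC(kernel="rbf", C=1.0, class_weight={0: 1.0, 1: 19.0})
clf.fit(X, y)
print(clf.score(X, y))
```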
- High-Dimensional Sparse Bayesian Learning without Covariance Matrices [66.60078365202867]
We introduce a new inference scheme that avoids explicit construction of the covariance matrix.
Our approach couples a little-known diagonal estimation result from numerical linear algebra with the conjugate gradient algorithm.
In several simulations, our method scales better than existing approaches in computation time and memory.
arXiv Detail & Related papers (2022-02-25T16:35:26Z)
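The diagonal-estimation-plus-CG coupling in the entry above can be sketched generically: a Hutchinson/Bekas-style probing estimator of diag(A^{-1}) in which each probe costs one conjugate gradient solve. This shows the general technique, not necessarily the paper's exact scheme.

```python
import numpy as np
from scipy.sparse.linalg import cg

def estimate_diag_inv(A, num_probes=100, seed=0):
    """Estimate diag(A^{-1}) for SPD A without forming A^{-1}.

    For Rademacher probes v, solve A x = v by conjugate gradients
    and average v * x elementwise.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    num = np.zeros(n)
    den = np.zeros(n)
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        x, _ = cg(A, v)            # one CG solve per probe
        num += v * x
        den += v * v
    return num / den

# Sanity check on a small, well-conditioned SPD matrix.
n = 100
M = np.random.default_rng(1).normal(size=(n, n))
A = M @ M.T + n * np.eye(n)
est = estimate_diag_inv(A, num_probes=200)
print(np.max(np.abs(est - np.diag(np.linalg.inv(A)))))  # typically small
```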
- Understanding Hyperdimensional Computing for Parallel Single-Pass Learning [47.82940409267635]
We propose a new class of VSAs, finite group VSAs, which surpass the limits of HDC.
Experimental results show that our random Fourier features (RFF) method and group VSA can both outperform the state-of-the-art HDC model by up to 7.6% while maintaining hardware efficiency.
arXiv Detail & Related papers (2022-02-10T02:38:56Z)
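Since the entry above credits an RFF method, here is the generic random Fourier features map it presumably builds on: randomized cosine features whose inner products approximate a Gaussian kernel. The HDC/VSA integration itself is not shown, and the parameters are illustrative.

```python
import numpy as np

# Random Fourier features: phi(x) . phi(y) approximates the Gaussian
# kernel exp(-||x - y||^2 / (2 sigma^2)).
rng = np.random.default_rng(0)
d, D, sigma = 16, 1024, 1.0
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0, 2 * np.pi, size=D)
phi = lambda x: np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.normal(size=d), rng.normal(size=d)
print(phi(x) @ phi(y))                                    # RFF estimate
print(np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2)))   # exact kernel
```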
- Learning in High-Dimensional Feature Spaces Using ANOVA-Based Fast Matrix-Vector Multiplication [0.0]
The kernel matrix is typically dense and large-scale; depending on the dimension of the feature space, even computing all of its entries in reasonable time is challenging.
We propose the use of an ANOVA kernel, constructing several kernels on lower-dimensional feature spaces and providing fast algorithms that realize the corresponding matrix-vector products.
Based on a feature-grouping approach, we then show how these fast matrix-vector products can be embedded into a learning method, using kernel ridge regression and a preconditioned conjugate gradient solver as an example.
arXiv Detail & Related papers (2021-11-19T10:29:39Z)
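The pipeline in the entry above can be sketched with explicit matrices: a sum of Gaussian kernels over low-dimensional feature groups (an ANOVA-style additive kernel) plugged into kernel ridge regression solved by conjugate gradients. The paper's contribution, fast NFFT-based matrix-vector products, is replaced here by a dense matvec; the grouping and parameters are illustrative.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def grouped_gaussian_kernel(X, groups, sigma=1.0):
    """Sum of Gaussian kernels over low-dimensional feature groups,
    formed densely (the paper's point is to avoid this dense matvec)."""
    n = X.shape[0]
    K = np.zeros((n, n))
    for g in groups:
        D = np.sum((X[:, None, g] - X[None, :, g]) ** 2, axis=-1)
        K += np.exp(-D / (2.0 * sigma ** 2))
    return K

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=200)

groups = [[0], [1, 2], [3, 4, 5]]      # illustrative feature grouping
K = grouped_gaussian_kernel(X, groups)
lam = 1e-2
# Kernel ridge regression: solve (K + lam * n * I) alpha = y with CG.
op = LinearOperator(K.shape, matvec=lambda v: K @ v + lam * len(X) * v,
                    dtype=np.float64)
alpha, info = cg(op, y)
print(info)  # 0 indicates convergence
```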
- A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z)
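For readers wanting something concrete, one standard way to realize an orthogonal recurrent step is to parameterize the recurrence matrix as the exponential of a skew-symmetric matrix; this is a common parameterization, not the specific model the paper studies.

```python
import numpy as np
from scipy.linalg import expm

# W = expm(S) with S skew-symmetric is orthogonal by construction.
rng = np.random.default_rng(0)
n = 8
P = rng.normal(size=(n, n))
S = P - P.T                              # skew-symmetric
W = expm(S)                              # orthogonal recurrence matrix
print(np.allclose(W.T @ W, np.eye(n)))   # True

h = np.tanh(W @ rng.normal(size=n))      # one recurrent update (no input term)
```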
- On the proliferation of support vectors in high dimensions [24.63581896788434]
The support vector machine (SVM) is a well-established classification method whose name refers to the particular training examples, called support vectors, that determine the maximum-margin separating hyperplane.
Recent research has shown that in sufficiently high-dimensional linear classification problems, the SVM can generalize well despite a proliferation of support vectors.
arXiv Detail & Related papers (2020-09-22T16:45:06Z)
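The proliferation phenomenon from the entry above is easy to observe empirically: with n fixed, random labels, and d well above n, the data become linearly separable and a near hard-margin linear SVM typically makes almost every training point a support vector. A small demonstration with illustrative parameters:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 50
for d in [100, 1000, 10000]:
    X = rng.normal(size=(n, d))
    y = rng.choice([-1, 1], size=n)
    svm = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
    print(d, len(svm.support_) / n)              # fraction of support vectors
```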
- MLPSVM:A new parallel support vector machine to multi-label learning [2.370531727442524]
This paper proposes a multi-label learning algorithm that can also be used for single-label classification.
It is based on standard support vector machines and changes the original single decision hyperplane into two parallel decision hyperplanes, a construction we call the multi-label parallel support vector machine (MLPSVM).
arXiv Detail & Related papers (2020-04-13T10:04:25Z)
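One plausible reading of the two-parallel-hyperplane construction above, offered purely as a hypothetical sketch (the decision rule below is our assumption, not taken from the paper): scores above the upper plane get one label, scores below the lower plane the other, and the slab in between both labels.

```python
import numpy as np

def mlpsvm_predict(X, w, t_low, t_high):
    """Hypothetical decision rule with two parallel hyperplanes.

    The score w . x is compared against the planes w . x = t_low and
    w . x = t_high (t_low < t_high): above the upper plane -> {+1},
    below the lower plane -> {-1}, inside the slab -> both labels.
    """
    scores = X @ w
    return [{+1} if s >= t_high else {-1} if s <= t_low else {-1, +1}
            for s in scores]

X = np.array([[2.0, 0.0], [-2.0, 0.0], [0.2, 0.0]])
print(mlpsvm_predict(X, w=np.array([1.0, 0.0]), t_low=-1.0, t_high=1.0))
# [{1}, {-1}, {-1, 1}]
```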
- A Unified Framework for Multiclass and Multilabel Support Vector Machines [6.425654442936364]
We propose a straightforward extension to the SVM to cope with multiclass and multilabel classification problems.
Our framework deviates from the conventional soft-margin SVM framework and its direct oppositional structure.
Results demonstrate a competitive classifier for both multiclass and multilabel classification problems.
arXiv Detail & Related papers (2020-03-25T03:08:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.