Kernel-Free Universum Quadratic Surface Twin Support Vector Machines for Imbalanced Data
- URL: http://arxiv.org/abs/2412.01936v1
- Date: Mon, 02 Dec 2024 19:57:59 GMT
- Title: Kernel-Free Universum Quadratic Surface Twin Support Vector Machines for Imbalanced Data
- Authors: Hossein Moosaei, Milan Hladík, Ahmad Mousavi, Zheming Gao, Haojie Fu
- Abstract summary: Binary classification tasks with imbalanced classes pose significant challenges in machine learning.
We introduce a novel approach to tackle this issue by leveraging Universum points to support the minority class within quadratic twin support vector machine models.
By incorporating Universum points, our approach enhances classification accuracy and generalization performance on imbalanced datasets.
- Score: 1.8990839669542954
- License:
- Abstract: Binary classification tasks with imbalanced classes pose significant challenges in machine learning. Traditional classifiers often struggle to accurately capture the characteristics of the minority class, resulting in biased models with subpar predictive performance. In this paper, we introduce a novel approach to tackle this issue by leveraging Universum points to support the minority class within quadratic twin support vector machine models. Unlike traditional classifiers, our models utilize quadratic surfaces instead of hyperplanes for binary classification, providing greater flexibility in modeling complex decision boundaries. By incorporating Universum points, our approach enhances classification accuracy and generalization performance on imbalanced datasets. We generated four artificial datasets to demonstrate the flexibility of the proposed methods. Additionally, we validated the effectiveness of our approach through empirical evaluations on benchmark datasets, showing superior performance compared to conventional classifiers and existing methods for imbalanced classification.
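As a rough illustration of the kernel-free quadratic-surface idea described above (and not the authors' exact formulation, which is not reproduced here), the sketch below evaluates two quadratic decision functions f_k(x) = 0.5·xᵀW_k x + b_kᵀx + c_k, one per class, and assigns a point to the class whose surface gives the smaller absolute value, the usual twin-SVM convention. The parameter names W_k, b_k, c_k, the min-|f| rule, and the toy values are illustrative assumptions; in the paper each surface would be obtained from its own optimization problem, with Universum points used to support the minority class.
```python
import numpy as np

def quad_surface(x, W, b, c):
    """Kernel-free quadratic decision value f(x) = 0.5 * x^T W x + b^T x + c."""
    return 0.5 * x @ W @ x + b @ x + c

def twin_predict(x, params_pos, params_neg):
    """Assign x to the class whose quadratic surface gives the smaller |f(x)|.

    params_* are (W, b, c) tuples. The min-|f| rule is the common twin-SVM
    convention and is assumed here for illustration, not taken from the paper.
    """
    f_pos = abs(quad_surface(x, *params_pos))
    f_neg = abs(quad_surface(x, *params_neg))
    return +1 if f_pos <= f_neg else -1

# Toy 2-D example with hand-picked surface parameters (purely illustrative).
W1, b1, c1 = np.eye(2), np.zeros(2), -1.0          # surface associated with class +1
W2, b2, c2 = np.eye(2), np.array([2.0, 0.0]), 0.5  # surface associated with class -1
x = np.array([0.5, -0.2])
print(twin_predict(x, (W1, b1, c1), (W2, b2, c2)))  # -> +1 for this toy point
```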
Related papers
- Synthetic Tabular Data Generation for Imbalanced Classification: The Surprising Effectiveness of an Overlap Class [20.606333546028516]
We show that state-of-the-art deep generative models yield significantly lower-quality minority examples than majority examples.
We propose a novel technique of converting the binary class labels to ternary class labels by introducing a class for the region where minority and majority distributions overlap.
arXiv Detail & Related papers (2024-12-20T08:15:20Z)
- Statistical Undersampling with Mutual Information and Support Points [4.118796935183671]
Class imbalance and distributional differences in large datasets present significant challenges for classification tasks in machine learning.
This work introduces two novel undersampling approaches: mutual information-based stratified simple random sampling and support points optimization.
arXiv Detail & Related papers (2024-12-19T04:48:29Z)
- Granular Ball K-Class Twin Support Vector Classifier [5.543867614999908]
Granular Ball K-Class Twin Support Vector Classifier (GB-TWKSVC) is a novel multi-class classification framework that combines Twin Support Vector Machines with granular ball computing.
Results demonstrate GB-TWKSVC's broad applicability across domains including pattern recognition, fault diagnosis, and large-scale data analytics.
arXiv Detail & Related papers (2024-12-06T21:47:49Z) - Class-Imbalanced Graph Learning without Class Rebalancing [62.1368829847041]
Class imbalance is prevalent in real-world node classification tasks and poses great challenges for graph learning models.
In this work, we approach the root cause of class-imbalance bias from a topological paradigm.
We devise a lightweight topological augmentation framework BAT to mitigate the class-imbalance bias without class rebalancing.
arXiv Detail & Related papers (2023-08-27T19:01:29Z) - Learning Optimal Fair Scoring Systems for Multi-Class Classification [0.0]
There are growing concerns about Machine Learning models with respect to their lack of interpretability and the undesirable biases they can generate or reproduce.
In this paper, we use Mixed-Integer Linear Programming (MILP) techniques to produce inherently interpretable scoring systems under sparsity and fairness constraints.
arXiv Detail & Related papers (2023-04-11T07:18:04Z) - Constructing Balance from Imbalance for Long-tailed Image Recognition [50.6210415377178]
The imbalance between majority (head) classes and minority (tail) classes severely skews the data-driven deep neural networks.
Previous methods tackle data imbalance from the viewpoints of data distribution, feature space, and model design.
We propose a concise paradigm that progressively adjusts the label space and divides the head classes and tail classes.
Our proposed model also provides a feature evaluation method and paves the way for long-tailed feature learning.
arXiv Detail & Related papers (2022-08-04T10:22:24Z) - Ensemble Classifier Design Tuned to Dataset Characteristics for Network
Intrusion Detection [0.0]
Two new algorithms are proposed to address the class overlap issue in the dataset.
The proposed design is evaluated for both binary and multi-category classification.
arXiv Detail & Related papers (2022-05-08T21:06:42Z) - Prototypical Classifier for Robust Class-Imbalanced Learning [64.96088324684683]
We propose Prototypical, which does not require fitting additional parameters given the embedding network.
Prototypical produces balanced and comparable predictions for all classes even though the training set is class-imbalanced.
We test our method on the CIFAR-10LT, CIFAR-100LT, and Webvision datasets, observing that Prototypical obtains substantial improvements compared with the state of the art.
arXiv Detail & Related papers (2021-10-22T01:55:01Z) - Active Hybrid Classification [79.02441914023811]
This paper shows how crowds and machines can support each other in tackling classification problems.
We propose an architecture that orchestrates active learning and crowd classification and combines them in a virtuous cycle.
arXiv Detail & Related papers (2021-01-21T21:09:07Z) - Learning and Evaluating Representations for Deep One-class
Classification [59.095144932794646]
We present a two-stage framework for deep one-class classification.
We first learn self-supervised representations from one-class data, and then build one-class classifiers on learned representations.
In experiments, we demonstrate state-of-the-art performance on visual domain one-class classification benchmarks.
arXiv Detail & Related papers (2020-11-04T23:33:41Z) - Long-Tailed Recognition Using Class-Balanced Experts [128.73438243408393]
We propose an ensemble of class-balanced experts that combines the strength of diverse classifiers.
Our ensemble of class-balanced experts reaches results close to state-of-the-art and an extended ensemble establishes a new state-of-the-art on two benchmarks for long-tailed recognition.
arXiv Detail & Related papers (2020-04-07T20:57:44Z)