Fully Bayesian Analysis of the Relevance Vector Machine Classification
for Imbalanced Data
- URL: http://arxiv.org/abs/2007.13140v2
- Date: Thu, 27 Oct 2022 07:20:17 GMT
- Title: Fully Bayesian Analysis of the Relevance Vector Machine Classification
for Imbalanced Data
- Authors: Wenyang Wang, Dongchu Sun, Zhuoqiong He
- Abstract summary: This paper proposes a Generic Bayesian approach for RVM classification.
We conjecture that our algorithm achieves convergent estimates of the quantities of interest, in contrast to the non-convergent estimates of the original RVM classification algorithm.
A Fully Bayesian approach with a hierarchical hyperprior structure for RVM classification is also proposed, which improves classification performance, especially on imbalanced data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Relevance Vector Machine (RVM) is a supervised learning algorithm
that extends the Support Vector Machine (SVM) with a Bayesian sparsity model.
Unlike the regression case, RVM classification is difficult to carry out because
the posterior of the weight parameters has no closed-form solution. The original
RVM classification algorithm uses Newton's method to find the mode of the weight
posterior and then approximates the posterior by a Gaussian distribution via
Laplace's method. This works, but it merely applies frequentist optimization
within a Bayesian framework. This paper proposes a Generic Bayesian approach for
RVM classification. We conjecture that our algorithm achieves convergent
estimates of the quantities of interest, in contrast to the non-convergent
estimates of the original RVM classification algorithm. Furthermore, a Fully
Bayesian approach with a hierarchical hyperprior structure for RVM
classification is proposed, which improves classification performance,
especially on imbalanced data. In numerical studies, our proposed algorithms
obtain high classification accuracy rates, and the Fully Bayesian hierarchical
hyperprior method outperforms the Generic one for imbalanced data
classification.
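For a concrete picture of the two strategies contrasted in the abstract, the sketches below are illustrative only: the function names, prior settings, and the probit data-augmentation sampler are assumptions chosen for exposition, not the authors' code. The first sketch shows the Laplace step the abstract describes for the original RVM classification algorithm: Newton's method locates the mode of the weight posterior, and a Gaussian approximation is placed at that mode.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_rvm_posterior(Phi, t, alpha, n_iter=50, tol=1e-6):
    """Newton search for the mode of the RVM weight posterior, followed by
    a Gaussian (Laplace) approximation around that mode.

    Phi   : (N, M) design matrix of kernel basis functions
    t     : (N,)  binary targets in {0, 1}
    alpha : (M,)  prior precisions of the zero-mean Gaussian weight prior
    """
    A = np.diag(alpha)
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        y = sigmoid(Phi @ w)                # current class-1 probabilities
        grad = Phi.T @ (t - y) - A @ w      # gradient of the log-posterior
        B = np.diag(y * (1.0 - y))          # Bernoulli variances
        H = Phi.T @ B @ Phi + A             # negative Hessian at w
        step = np.linalg.solve(H, grad)     # Newton update
        w = w + step
        if np.linalg.norm(step) < tol:
            break
    Sigma = np.linalg.inv(H)                # covariance of the Gaussian approximation
    return w, Sigma                         # posterior mode and covariance
```

A fully Bayesian treatment replaces this point approximation with posterior sampling. The sketch below uses Albert-Chib probit data augmentation, Gamma hyperpriors on the weight precisions, and one extra shared hyperprior layer; it is a generic illustration of a hierarchical fully Bayesian kernel classifier, not the specific hyperprior structure proposed in the paper.

```python
import numpy as np
from scipy.stats import truncnorm

def gibbs_probit_rvm(Phi, t, n_samples=2000, a0=1e-2, b0=1e-2,
                     c0=1e-2, d0=1e-2, seed=0):
    """Illustrative Gibbs sampler: probit-link kernel classifier with a
    hierarchical Gamma hyperprior layer on the weight precisions."""
    rng = np.random.default_rng(seed)
    M = Phi.shape[1]
    w, alpha, b = np.zeros(M), np.ones(M), b0
    draws = []
    for _ in range(n_samples):
        # 1. Latent utilities z | w, t: truncated normals around Phi @ w.
        mu = Phi @ w
        lo = np.where(t == 1, -mu, -np.inf)       # z > 0 when t = 1
        hi = np.where(t == 1, np.inf, -mu)        # z < 0 when t = 0
        z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
        # 2. Weights w | z, alpha: multivariate Gaussian.
        S = np.linalg.inv(Phi.T @ Phi + np.diag(alpha))
        w = rng.multivariate_normal(S @ Phi.T @ z, S)
        # 3. Precisions alpha_j | w_j: conjugate Gamma updates.
        alpha = rng.gamma(a0 + 0.5, 1.0 / (b + 0.5 * w ** 2))
        # 4. Shared hyperprior layer: rate b | alpha is again Gamma.
        b = rng.gamma(c0 + M * a0, 1.0 / (d0 + alpha.sum()))
        draws.append(w.copy())
    return np.array(draws)                        # discard burn-in before use
```

Averaging the probit class probabilities over the retained draws yields posterior predictive probabilities, which is the kind of converged quantity the abstract's comparison is concerned with.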
Related papers
- Cost-sensitive probabilistic predictions for support vector machines [1.743685428161914]
Support vector machines (SVMs) are widely used and are among the most thoroughly studied machine learning models.
We propose a novel approach to generate probabilistic outputs for the SVM.
arXiv Detail & Related papers (2023-10-09T11:00:17Z) - Projection based fuzzy least squares twin support vector machine for
class imbalance problems [0.9668407688201361]
We propose a novel fuzzy-based approach to deal with class-imbalanced as well as noisy datasets.
The proposed algorithms are evaluated on several benchmark and synthetic datasets.
arXiv Detail & Related papers (2023-09-27T14:28:48Z) - Consensus-Adaptive RANSAC [104.87576373187426]
We propose a new RANSAC framework that learns to explore the parameter space by considering the residuals seen so far via a novel attention layer.
The attention mechanism operates on a batch of point-to-model residuals, and updates a per-point estimation state to take into account the consensus found through a lightweight one-step transformer.
arXiv Detail & Related papers (2023-07-26T08:25:46Z) - Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
arXiv Detail & Related papers (2022-12-02T07:31:39Z) - Efficient Approximate Kernel Based Spike Sequence Classification [56.2938724367661]
Machine learning models, such as SVM, require a definition of distance/similarity between pairs of sequences.
Exact methods yield better classification performance, but they pose high computational costs.
We propose a series of ways to improve the approximate kernel so as to enhance its predictive performance.
arXiv Detail & Related papers (2022-09-11T22:44:19Z) - Handling Imbalanced Classification Problems With Support Vector Machines
via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms to deal with binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z) - Riemannian classification of EEG signals with missing values [67.90148548467762]
This paper proposes two strategies to handle missing data for the classification of electroencephalograms.
The first approach estimates the covariance from imputed data with the $k$-nearest neighbors algorithm; the second relies on the observed data by leveraging the observed-data likelihood within an expectation-maximization algorithm.
As the results show, the proposed strategies perform better than classification based on observed data alone and maintain high accuracy even when the missing-data ratio increases.
arXiv Detail & Related papers (2021-10-19T14:24:50Z) - Weighted Least Squares Twin Support Vector Machine with Fuzzy Rough Set
Theory for Imbalanced Data Classification [0.483420384410068]
Support vector machines (SVMs) are powerful supervised learning tools developed to solve classification problems.
We propose an approach, called FRLSTSVM, that efficiently uses fuzzy rough set theory in a weighted least squares twin support vector machine for the classification of imbalanced data.
arXiv Detail & Related papers (2021-05-03T22:33:39Z) - Piecewise linear regression and classification [0.20305676256390928]
This paper proposes a method for solving multivariate regression and classification problems using piecewise linear predictors.
A Python implementation of the algorithm described in this paper is available at http://cse.lab.imtlucca.it/bemporad/parc.
arXiv Detail & Related papers (2021-03-10T17:07:57Z) - Estimating Average Treatment Effects with Support Vector Machines [77.34726150561087]
The support vector machine (SVM) is one of the most popular classification algorithms in the machine learning literature.
We adapt SVM as a kernel-based weighting procedure that minimizes the maximum mean discrepancy between the treatment and control groups.
We characterize the bias of causal effect estimation arising from this trade-off, connecting the proposed SVM procedure to the existing kernel balancing methods.
arXiv Detail & Related papers (2021-02-23T20:22:56Z) - Probabilistic Classification Vector Machine for Multi-Class
Classification [29.411892651468797]
The probabilistic classification vector machine (PCVM) synthesizes the advantages of both the support vector machine and the relevance vector machine.
We extend the PCVM to multi-class cases via voting strategies such as one-vs-rest or one-vs-one.
Two learning algorithms, one top-down and one bottom-up, are implemented in the mPCVM.
The superior performance of the mPCVM is extensively evaluated on synthetic and benchmark data sets.
arXiv Detail & Related papers (2020-06-29T03:21:38Z)