Probabilistic Classification Vector Machine for Multi-Class Classification
- URL: http://arxiv.org/abs/2006.15791v1
- Date: Mon, 29 Jun 2020 03:21:38 GMT
- Title: Probabilistic Classification Vector Machine for Multi-Class Classification
- Authors: Shengfei Lyu, Xing Tian, Yang Li, Bingbing Jiang, Huanhuan Chen
- Abstract summary: The probabilistic classification vector machine (PCVM) synthesizes the advantages of both the support vector machine and the relevance vector machine.
Extending the PCVM to multi-class cases via voting strategies such as one-vs-rest or one-vs-one can produce contradictory predictions and lose the benefits of probabilistic outputs, so we propose a multi-class PCVM (mPCVM).
Two learning algorithms, i.e., one top-down algorithm and one bottom-up algorithm, have been implemented in the mPCVM.
The superior performance of the mPCVMs is extensively evaluated on synthetic and benchmark data sets.
- Score: 29.411892651468797
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The probabilistic classification vector machine (PCVM) synthesizes the
advantages of both the support vector machine and the relevance vector machine,
delivering a sparse Bayesian solution to classification problems. However, the
PCVM is currently only applicable to binary cases. Extending the PCVM to
multi-class cases via heuristic voting strategies such as one-vs-rest or
one-vs-one often results in a dilemma where classifiers make contradictory
predictions, and those strategies might lose the benefits of probabilistic
outputs. To overcome this problem, we extend the PCVM and propose a multi-class
probabilistic classification vector machine (mPCVM). Two learning algorithms,
i.e., one top-down algorithm and one bottom-up algorithm, have been implemented
in the mPCVM. The top-down algorithm obtains the maximum a posteriori (MAP)
point estimates of the parameters based on an expectation-maximization
algorithm, and the bottom-up algorithm is an incremental paradigm by maximizing
the marginal likelihood. The superior performance of the mPCVMs, especially
when the investigated problem has a large number of classes, is extensively
evaluated on synthetic and benchmark data sets.
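The dilemma the abstract describes can be seen with a few lines of code. This is a minimal illustrative sketch, not the paper's method: the classifier names and decision scores below are invented to show how hard one-vs-rest voting lets several binary classifiers claim the same point, and how an argmax tie-break gives an answer whose scores are no longer calibrated class probabilities.

```python
# Hypothetical decision scores from three one-vs-rest binary classifiers
# (class k vs. the rest) for a single test point; the numbers are
# illustrative only, not taken from the paper.
scores = {"class_0": 0.8, "class_1": 0.6, "class_2": -0.4}

# Hard one-vs-rest voting: every classifier with a positive score claims
# the point, so classes 0 and 1 make contradictory predictions here.
claimants = [k for k, s in scores.items() if s > 0]
print(claimants)  # ['class_0', 'class_1']

# An argmax tie-break resolves the conflict, but the raw scores are not
# calibrated class-posterior probabilities -- the loss of probabilistic
# outputs that motivates the mPCVM's direct multi-class formulation.
winner = max(scores, key=scores.get)
print(winner)  # 'class_0'
```

In contrast, the mPCVM trains a single multi-class model whose outputs are class posteriors by construction, so no ad hoc conflict resolution is needed.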
Related papers
- Multi-class Support Vector Machine with Maximizing Minimum Margin [67.51047882637688]
Support Vector Machine (SVM) is a prominent machine learning technique widely applied in pattern recognition tasks.
We propose a novel method for multi-class SVM that incorporates pairwise class loss considerations and maximizes the minimum margin.
Empirical evaluations demonstrate the effectiveness and superiority of our proposed method over existing multi-classification methods.
arXiv Detail & Related papers (2023-12-11T18:09:55Z)
- Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
arXiv Detail & Related papers (2023-10-10T17:06:41Z)
- Cost-sensitive probabilistic predictions for support vector machines [1.743685428161914]
Support vector machines (SVMs) are widely used and constitute one of the best examined and used machine learning models.
We propose a novel approach to generate probabilistic outputs for the SVM.
arXiv Detail & Related papers (2023-10-09T11:00:17Z)
- Projection based fuzzy least squares twin support vector machine for class imbalance problems [0.9668407688201361]
We propose a novel fuzzy-based approach to deal with class-imbalanced as well as noisy datasets.
The proposed algorithms are evaluated on several benchmark and synthetic datasets.
arXiv Detail & Related papers (2023-09-27T14:28:48Z)
- Efficient Approximate Kernel Based Spike Sequence Classification [56.2938724367661]
Machine learning models, such as SVM, require a definition of distance/similarity between pairs of sequences.
Exact methods yield better classification performance, but they pose high computational costs.
We propose a series of ways to improve the approximate kernel in order to enhance its predictive performance.
arXiv Detail & Related papers (2022-09-11T22:44:19Z)
- Handling Imbalanced Classification Problems With Support Vector Machines via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms to deal with binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z)
- Estimating Average Treatment Effects with Support Vector Machines [77.34726150561087]
Support vector machine (SVM) is one of the most popular classification algorithms in the machine learning literature.
We adapt SVM as a kernel-based weighting procedure that minimizes the maximum mean discrepancy between the treatment and control groups.
We characterize the bias of causal effect estimation arising from this trade-off, connecting the proposed SVM procedure to the existing kernel balancing methods.
arXiv Detail & Related papers (2021-02-23T20:22:56Z)
- Efficient semidefinite-programming-based inference for binary and multi-class MRFs [83.09715052229782]
We propose an efficient method for computing the partition function or MAP estimate in a pairwise MRF.
We extend semidefinite relaxations from the typical binary MRF to the full multi-class setting, and develop a compact semidefinite relaxation that can again be solved efficiently using the solver.
arXiv Detail & Related papers (2020-12-04T15:36:29Z)
- Fully Bayesian Analysis of the Relevance Vector Machine Classification for Imbalanced Data [0.0]
This paper proposes a Generic Bayesian approach for the RVM classification.
We conjecture that our algorithm achieves convergent estimates of the quantities of interest, in contrast to the nonconvergent estimates of the original RVM classification algorithm.
A Fully Bayesian approach with the hierarchical hyperprior structure for RVM classification is proposed, which improves the classification performance, especially in the imbalanced data problem.
arXiv Detail & Related papers (2020-07-26T14:53:36Z)
- Unified SVM Algorithm Based on LS-DC Loss [0.0]
We propose an algorithm that can train different SVM models.
UniSVM has a dominant advantage over all existing algorithms because it has a closed-form solution.
Experiments show that UniSVM can achieve comparable performance in less training time.
arXiv Detail & Related papers (2020-06-16T12:40:06Z)
- Nonparallel Hyperplane Classifiers for Multi-category Classification [0.3867363075280544]
Nonparallel hyperplane classification algorithms (NHCAs) have been proposed that are comparable to SVM in classification accuracy.
In this paper, we present a comparative study of four NHCAs i.e. Twin SVM (TWSVM), Generalized eigenvalue proximal SVM (GEPSVM), Regularized GEPSVM (RegGEPSVM) and Improved GEPSVM (IGEPSVM) for multi-category classification.
The experimental results show that TDS-TWSVM outperforms other methods in terms of classification accuracy and BT-RegGEPSVM takes
arXiv Detail & Related papers (2020-04-16T08:03:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.