Optimal Linear Combination of Classifiers
- URL: http://arxiv.org/abs/2103.01109v1
- Date: Mon, 1 Mar 2021 16:21:40 GMT
- Title: Optimal Linear Combination of Classifiers
- Authors: Georgi Nalbantov, Svetoslav Ivanov
- Abstract summary: The question of whether to use one classifier or a combination of classifiers is a central topic in Machine Learning.
We propose here a method for finding an optimal linear combination of classifiers derived from a bias-variance framework for the classification task.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
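The abstract does not give the paper's closed-form solution, so the following is only a generic sketch of the idea of fitting linear-combination weights for a pool of classifiers: here, least squares on held-out classifier scores. All names and the toy data are illustrative, not taken from the paper.

```python
# Hedged sketch: fit linear-combination weights for several classifiers
# by least squares on validation-set scores. This is a generic stand-in
# for the paper's bias-variance-derived weights, which the abstract
# does not spell out.
import numpy as np

def fit_combination_weights(val_scores, val_labels):
    """val_scores: (n_samples, n_classifiers) real-valued outputs;
    val_labels: (n_samples,) targets in {-1, +1}."""
    w, *_ = np.linalg.lstsq(val_scores, val_labels, rcond=None)
    return w

def combine(scores, w):
    """Predict with the weighted linear combination of classifier scores."""
    return np.sign(scores @ w)

# Toy usage: three synthetic "classifiers" of varying noise levels.
rng = np.random.default_rng(0)
y = rng.choice([-1.0, 1.0], size=200)
S = np.column_stack([y + rng.normal(0, s, 200) for s in (0.5, 1.0, 2.0)])
w = fit_combination_weights(S, y)
acc = np.mean(combine(S, w) == y)
```

In this toy setup the fitted weights naturally downweight the noisier classifiers, which is the intuition behind any optimal linear combination.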
Related papers
- Anomaly Detection using Ensemble Classification and Evidence Theory [62.997667081978825]
We present a novel approach for anomaly detection using ensemble classification and evidence theory.
A pool selection strategy is presented to build a solid ensemble classifier.
Prediction uncertainty is used to detect anomalies.
arXiv Detail & Related papers (2022-12-23T00:50:41Z) - On the rate of convergence of a classifier based on a Transformer encoder [55.41148606254641]
The rate of convergence of the misclassification probability of the classifier towards the optimal misclassification probability is analyzed.
It is shown that this classifier is able to circumvent the curse of dimensionality provided the a posteriori probability satisfies a suitable hierarchical composition model.
arXiv Detail & Related papers (2021-11-29T14:58:29Z) - When in Doubt: Improving Classification Performance with Alternating Normalization [57.39356691967766]
We introduce Classification with Alternating Normalization (CAN), a non-parametric post-processing step for classification.
CAN improves classification accuracy for challenging examples by re-adjusting their predicted class probability distribution.
We empirically demonstrate its effectiveness across a diverse set of classification tasks.
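The re-adjustment that this summary describes can be pictured with a Sinkhorn-style alternating normalization: rescale a batch of predicted distributions so rows stay valid distributions while column masses match an estimated class prior. This is a simplified sketch of the general idea, not CAN's exact procedure (which the summary does not detail); all names are illustrative.

```python
# Hedged sketch of alternating normalization over a batch of predicted
# class distributions. Column and row rescaling are applied in turn
# until both constraints are (approximately) satisfied.
import numpy as np

def alternating_normalize(P, prior, n_iters=100):
    """P: (n, k), each row a predicted class distribution;
    prior: (k,) estimated class prior (sums to 1).
    Returns re-adjusted distributions."""
    P = P.copy()
    target = prior * len(P)                # desired total mass per column
    for _ in range(n_iters):
        P *= target / P.sum(axis=0)        # column step: match the prior
        P /= P.sum(axis=1, keepdims=True)  # row step: rows stay distributions
    return P
```

For strictly positive inputs this iteration converges quickly, and the adjusted rows shift probability mass toward classes the prior says are underpredicted.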
arXiv Detail & Related papers (2021-09-28T02:55:42Z) - Probability-driven scoring functions in combining linear classifiers [0.913755431537592]
This research is aimed at building a new fusion method dedicated to the ensemble of linear classifiers.
The proposed fusion method is compared with the reference method using multiple benchmark datasets taken from the KEEL repository.
The experimental study shows that, under certain conditions, some improvement may be obtained.
arXiv Detail & Related papers (2021-09-16T08:58:32Z) - Relearning ensemble selection based on new generated features [0.0]
The proposed technique was compared with state-of-the-art ensemble methods using three benchmark datasets and one synthetic dataset.
Four classification performance measures are used to evaluate the proposed method.
arXiv Detail & Related papers (2021-06-12T12:45:32Z) - CAC: A Clustering Based Framework for Classification [20.372627144885158]
We design a simple, efficient, and generic framework called Classification Aware Clustering (CAC).
Our experiments on synthetic and real benchmark datasets demonstrate the efficacy of CAC over previous methods for combined clustering and classification.
arXiv Detail & Related papers (2021-02-23T18:59:39Z) - Binary Classification from Multiple Unlabeled Datasets via Surrogate Set Classification [94.55805516167369]
We propose a new approach for binary classification from $m$ unlabeled datasets (U-sets) for $m \ge 2$.
Our key idea is to consider an auxiliary classification task called surrogate set classification (SSC).
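The surrogate task can be made concrete as follows: each U-set becomes one surrogate class, and a multi-class learner predicts set membership. The sketch below uses a nearest-centroid model purely as a stand-in learner; how SSC outputs map back to the binary labels depends on the per-set class priors, which this summary does not specify.

```python
# Hedged sketch of the surrogate set classification (SSC) setup: treat
# "which U-set did this sample come from?" as an m-class problem.
# A nearest-centroid model stands in for the actual learner.
import numpy as np

def fit_ssc_centroids(u_sets):
    """u_sets: list of m arrays, each of shape (n_i, d).
    Returns (m, d) centroids, one surrogate class per U-set."""
    return np.stack([U.mean(axis=0) for U in u_sets])

def predict_set(x, centroids):
    """Predict which U-set the sample x most plausibly came from."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))
```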
arXiv Detail & Related papers (2021-02-01T07:36:38Z) - Active Hybrid Classification [79.02441914023811]
This paper shows how crowd and machines can support each other in tackling classification problems.
We propose an architecture that orchestrates active learning and crowd classification and combines them in a virtuous cycle.
arXiv Detail & Related papers (2021-01-21T21:09:07Z) - A Multiple Classifier Approach for Concatenate-Designed Neural Networks [13.017053017670467]
We give the design of the classifiers, which collect the features produced between the network sets.
We use L2 normalization to obtain the classification score instead of a Softmax dense layer.
As a result, the proposed classifiers are able to improve the accuracy in the experimental cases.
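L2-normalized scoring of the kind this summary mentions amounts to cosine similarity between the feature vector and each class weight vector; the sketch below shows that scoring rule in isolation (the network architecture around it is not reproduced here, and all names are illustrative).

```python
# Hedged sketch: class scores via L2 normalization (cosine similarity)
# instead of softmax over raw logits.
import numpy as np

def l2_score(features, class_weights):
    """features: (d,); class_weights: (k, d).
    Returns (k,) cosine-similarity scores in [-1, 1]."""
    f = features / np.linalg.norm(features)
    W = class_weights / np.linalg.norm(class_weights, axis=1, keepdims=True)
    return W @ f
```

Because both sides are unit vectors, the score depends only on direction, not magnitude, which is the usual motivation for this choice.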
arXiv Detail & Related papers (2021-01-14T04:32:40Z) - Ensemble of Binary Classifiers Combined Using Recurrent Correlation Associative Memories [1.3706331473063877]
Majority voting is a standard way of combining the classifiers in an ensemble method.
We introduce ensemble methods based on recurrent correlation associative memories for binary classification problems.
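The majority vote mentioned above is, for binary labels in {-1, +1}, just the sign of the summed predictions; the associative-memory combination the paper proposes replaces this rule, but the baseline is:

```python
# Majority vote over an ensemble of binary classifiers.
import numpy as np

def majority_vote(predictions):
    """predictions: (n_classifiers, n_samples) with labels in {-1, +1}.
    Returns the per-sample majority label (0 on an exact tie)."""
    return np.sign(predictions.sum(axis=0))
```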
arXiv Detail & Related papers (2020-09-18T01:16:53Z) - High-Dimensional Quadratic Discriminant Analysis under Spiked Covariance Model [101.74172837046382]
We propose a novel quadratic classification technique, the parameters of which are chosen such that the Fisher discriminant ratio is maximized.
Numerical simulations show that the proposed classifier not only outperforms the classical R-QDA for both synthetic and real data but also requires lower computational complexity.
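The Fisher discriminant ratio this summary refers to is, for a projection direction, the squared separation of class means over the within-class variance. The paper tunes a *quadratic* classifier to maximize it; the sketch below only shows the ratio itself, with illustrative names.

```python
# Hedged sketch: the Fisher discriminant ratio of a linear direction w
# for two classes X0, X1 (rows are samples).
import numpy as np

def fisher_ratio(w, X0, X1):
    """Squared mean separation over total within-class variance
    of the 1-D projections X0 @ w and X1 @ w."""
    p0, p1 = X0 @ w, X1 @ w
    return (p0.mean() - p1.mean()) ** 2 / (p0.var() + p1.var())
```

A direction aligned with the true class separation yields a much larger ratio than an uninformative one, which is what maximizing the ratio exploits.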
arXiv Detail & Related papers (2020-06-25T12:00:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.