Building an Ensemble of Classifiers via Randomized Models of Ensemble
Members
- URL: http://arxiv.org/abs/2109.07861v1
- Date: Thu, 16 Sep 2021 10:53:13 GMT
- Title: Building an Ensemble of Classifiers via Randomized Models of Ensemble
Members
- Authors: Pawel Trajdos, Marek Kurzynski
- Abstract summary: In this paper, a novel randomized model of the base classifier is developed.
In the proposed method, the random operation of the model results from a random selection of the learning set from a family of learning sets of a fixed size.
The DES scheme with the proposed model of competence was experimentally evaluated on the collection of 67 benchmark datasets.
- Score: 1.827510863075184
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many dynamic ensemble selection (DES) methods are known in the
literature. A method previously developed by the authors consists in building
a randomized classifier which is treated as a model of the base classifier.
The model is equivalent to the base classifier in a certain probabilistic
sense. Next, the probability of correct classification of the randomized
classifier is taken as the competence of the evaluated classifier.
In this paper, a novel randomized model of the base classifier is developed.
In the proposed method, the random operation of the model results from a
random selection of the learning set from a family of learning sets of a
fixed size. The paper presents the mathematical foundations of this approach
and shows how, for a practical application when learning and validation sets
are given, one can determine the measure of competence and build a
multiclassifier (MC) system with the DES scheme.
The DES scheme with the proposed model of competence was experimentally
evaluated on a collection of 67 benchmark datasets and compared, in terms of
eight quality criteria, with two ensemble classifiers which use the
previously proposed concepts of the randomized model. The proposed approach
achieved the lowest (best) ranks for almost all investigated quality criteria.
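The general DES scheme described above can be illustrated with a minimal sketch. This is not the paper's method: the competence measure here is a simple local-accuracy proxy (accuracy on the k nearest validation samples) standing in for the probabilistic competence derived from the randomized model, and the `Stump` base classifier is a hypothetical toy.

```python
import numpy as np

class Stump:
    """Toy base classifier thresholding one feature (a hypothetical
    stand-in for any trained classifier with a predict() method)."""
    def __init__(self, feature, threshold):
        self.feature, self.threshold = feature, threshold
    def predict(self, X):
        return (X[:, self.feature] > self.threshold).astype(int)

def local_competence(clf, X_val, y_val, x, k=4):
    # Competence of clf near x: accuracy on the k nearest validation
    # samples -- a simple proxy for the probability of correct
    # classification used as competence in the abstract.
    idx = np.argsort(np.linalg.norm(X_val - x, axis=1))[:k]
    return float(np.mean(clf.predict(X_val[idx]) == y_val[idx]))

def des_predict(pool, X_val, y_val, x, threshold=0.5):
    # DES step: keep only classifiers whose local competence exceeds
    # the threshold, then combine the survivors by majority vote.
    selected = [c for c in pool
                if local_competence(c, X_val, y_val, x) > threshold]
    if not selected:            # fall back to the full pool
        selected = list(pool)
    votes = [int(c.predict(x[None, :])[0]) for c in selected]
    return max(set(votes), key=votes.count)

# Tiny validation set; the pool has two competent stumps and one poor
# stump that always votes class 1.
X_val = np.array([[0.0, 0.0], [1.0, 1.0], [0.2, 0.1], [0.9, 0.8]])
y_val = np.array([0, 1, 0, 1])
pool = [Stump(0, 0.5), Stump(1, 0.5), Stump(0, -1.0)]

print(des_predict(pool, X_val, y_val, np.array([0.1, 0.1])))  # -> 0
```

For the query point near the class-0 validation samples, the always-1 stump has local competence 0.5 and is filtered out, so the two competent stumps outvote it.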
Related papers
- Anomaly Detection using Ensemble Classification and Evidence Theory [62.997667081978825]
We present a novel approach for anomaly detection using ensemble classification and evidence theory.
A pool selection strategy is presented to build a solid ensemble classifier.
We use uncertainty for the anomaly detection approach.
arXiv Detail & Related papers (2022-12-23T00:50:41Z)
- clusterBMA: Bayesian model averaging for clustering [1.2021605201770345]
We introduce clusterBMA, a method that enables weighted model averaging across results from unsupervised clustering algorithms.
We use clustering internal validation criteria to develop an approximation of the posterior model probability, used for weighting the results from each model.
In addition to outperforming other ensemble clustering methods on simulated data, clusterBMA offers unique features including probabilistic allocation to averaged clusters.
arXiv Detail & Related papers (2022-09-09T04:55:20Z)
- BRIO: Bringing Order to Abstractive Summarization [107.97378285293507]
We propose a novel training paradigm which assumes a non-deterministic distribution.
Our method achieves a new state-of-the-art result on the CNN/DailyMail (47.78 ROUGE-1) and XSum (49.07 ROUGE-1) datasets.
arXiv Detail & Related papers (2022-03-31T05:19:38Z)
- Self-Certifying Classification by Linearized Deep Assignment [65.0100925582087]
We propose a novel class of deep predictors for classifying metric data on graphs within the PAC-Bayes risk certification paradigm.
Building on the recent PAC-Bayes literature and data-dependent priors, this approach enables learning posterior distributions on the hypothesis space.
arXiv Detail & Related papers (2022-01-26T19:59:14Z)
- When in Doubt: Improving Classification Performance with Alternating Normalization [57.39356691967766]
We introduce Classification with Alternating Normalization (CAN), a non-parametric post-processing step for classification.
CAN improves classification accuracy for challenging examples by re-adjusting their predicted class probability distribution.
We empirically demonstrate its effectiveness across a diverse set of classification tasks.
arXiv Detail & Related papers (2021-09-28T02:55:42Z)
- Meta-Model Structure Selection: Building Polynomial NARX Model for Regression and Classification [0.0]
This work presents a new meta-heuristic approach to select the structure of NARX models for regression and classification problems.
The robustness of the new algorithm is tested on several simulated and experimental systems with different nonlinear characteristics.
arXiv Detail & Related papers (2021-09-21T02:05:40Z)
- Probability-driven scoring functions in combining linear classifiers [0.913755431537592]
This research is aimed at building a new fusion method dedicated to the ensemble of linear classifiers.
The proposed fusion method is compared with the reference method using multiple benchmark datasets taken from the KEEL repository.
The experimental study shows that, under certain conditions, some improvement may be obtained.
arXiv Detail & Related papers (2021-09-16T08:58:32Z)
- Community Detection in the Stochastic Block Model by Mixed Integer Programming [3.8073142980733]
The Degree-Corrected Stochastic Block Model (DCSBM) is a popular model for generating random graphs with community structure given an expected degree sequence.
The standard approach to community detection based on the DCSBM is to search for the model parameters that are most likely to have produced the observed network data through maximum likelihood estimation (MLE).
We present mathematical programming formulations and exact solution methods that can provably find the model parameters and community assignments of maximum likelihood given an observed graph.
arXiv Detail & Related papers (2021-01-26T22:04:40Z)
- Control as Hybrid Inference [62.997667081978825]
We present an implementation of CHI which naturally mediates the balance between iterative and amortised inference.
We verify the scalability of our algorithm on a continuous control benchmark, demonstrating that it outperforms strong model-free and model-based baselines.
arXiv Detail & Related papers (2020-07-11T19:44:09Z)
- Semi-nonparametric Latent Class Choice Model with a Flexible Class Membership Component: A Mixture Model Approach [6.509758931804479]
The proposed model formulates the latent classes using mixture models as an alternative approach to the traditional random utility specification.
Results show that mixture models improve the overall performance of latent class choice models.
arXiv Detail & Related papers (2020-07-06T13:19:26Z)
- Selective Inference for Latent Block Models [50.83356836818667]
This study provides a selective inference method for latent block models.
We construct a statistical test on a set of row and column cluster memberships of a latent block model.
The proposed exact and approximate tests work effectively, compared to the naive test that does not take the selective bias into account.
arXiv Detail & Related papers (2020-05-27T10:44:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.