Classifier Chains: A Review and Perspectives
- URL: http://arxiv.org/abs/1912.13405v2
- Date: Wed, 15 Apr 2020 11:36:27 GMT
- Title: Classifier Chains: A Review and Perspectives
- Authors: Jesse Read, Bernhard Pfahringer, Geoff Holmes, Eibe Frank
- Abstract summary: The family of methods collectively known as classifier chains has become a popular approach to multi-label learning problems.
This work reviews the techniques and extensions proposed in the literature and offers perspectives on the future of this approach in multi-label classification.
- Score: 3.752624871808558
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The family of methods collectively known as classifier chains has become a
popular approach to multi-label learning problems. This approach involves
linking together off-the-shelf binary classifiers in a chain structure, such
that class label predictions become features for other classifiers. Such
methods have proved flexible and effective and have obtained state-of-the-art
empirical performance across many datasets and multi-label evaluation metrics.
This performance has led to further studies of how exactly the approach works and how it
can be improved. Over the past decade, numerous studies have explored the
mechanisms of classifier chains at a theoretical level, and many improvements have
been made to the training and inference procedures, such that the method
remains among the state-of-the-art options for multi-label learning. Given this
past and ongoing interest, which covers a broad range of applications and
research themes, the goal of this work is to provide a review of classifier
chains, a survey of the techniques and extensions provided in the literature,
as well as perspectives for this approach in the domain of multi-label
classification in the future. We conclude positively, with a number of
recommendations for researchers and practitioners, as well as outlining a
number of areas for future research.
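To make the chaining mechanism described above concrete, the following is a minimal sketch of a classifier chain: one off-the-shelf binary classifier per label, with each earlier label appended to the feature vector seen by later classifiers. The class name, the logistic-regression base learner, and the fixed label order are illustrative choices, not the authors' reference implementation.

```python
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression

class SimpleClassifierChain:
    """Minimal classifier chain: one binary classifier per label,
    with earlier labels fed as extra features to later classifiers."""

    def __init__(self, base_estimator=LogisticRegression(max_iter=1000)):
        self.base_estimator = base_estimator

    def fit(self, X, Y):
        # Y is an (n_samples, n_labels) binary indicator matrix.
        self.estimators_ = []
        X_aug = X
        for j in range(Y.shape[1]):
            clf = clone(self.base_estimator).fit(X_aug, Y[:, j])
            self.estimators_.append(clf)
            # True labels are used as features at training time.
            X_aug = np.hstack([X_aug, Y[:, j:j + 1]])
        return self

    def predict(self, X):
        X_aug = X
        preds = []
        for clf in self.estimators_:
            y_hat = clf.predict(X_aug).reshape(-1, 1)
            preds.append(y_hat)
            # Predicted labels are fed forward at inference time (greedy inference).
            X_aug = np.hstack([X_aug, y_hat])
        return np.hstack(preds)
```

scikit-learn ships a comparable off-the-shelf implementation as sklearn.multioutput.ClassifierChain, which also supports randomised label orderings.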
Related papers
- Estimating class separability of text embeddings with persistent homology [1.9956517534421363]
This paper introduces an unsupervised method to estimate the class separability of text datasets from a topological point of view.
We show how this technique can be applied to detect when the training process stops improving the separability of the embeddings.
Our results, validated across binary and multi-class text classification tasks, show that the proposed method's estimates of class separability align with those obtained from supervised methods.
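As a loose illustration of the topological viewpoint (and not the paper's actual estimator), 0-dimensional persistence of a Vietoris-Rips filtration can be read off from single-linkage merge distances; the hypothetical snippet below compares how quickly the embeddings of two toy classes merge into one connected component.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage

def h0_death_times(points):
    """0-dimensional persistence (death times) of a Vietoris-Rips filtration,
    which coincide with single-linkage merge distances."""
    return linkage(points, method="single")[:, 2]

rng = np.random.default_rng(0)
class_a = rng.normal(loc=0.0, scale=1.0, size=(100, 16))   # toy "embeddings"
class_b = rng.normal(loc=5.0, scale=1.0, size=(100, 16))

pooled = np.vstack([class_a, class_b])
# Large late death times in the pooled cloud relative to the per-class clouds
# hint at well-separated classes.
print(h0_death_times(pooled).max(), h0_death_times(class_a).max())
```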
arXiv Detail & Related papers (2023-05-24T10:58:09Z)
- Reliable Representations Learning for Incomplete Multi-View Partial Multi-Label Classification [78.15629210659516]
In this paper, we propose an incomplete multi-view partial multi-label classification network named RANK.
We move beyond the view-level weights inherent in existing methods and propose a quality-aware sub-network that dynamically assigns quality scores to each view of each sample.
Our model is not only able to handle complete multi-view multi-label datasets, but also works on datasets with missing instances and labels.
arXiv Detail & Related papers (2023-03-30T03:09:25Z)
- Anomaly Detection using Ensemble Classification and Evidence Theory [62.997667081978825]
We present a novel approach for anomaly detection using ensemble classification and evidence theory.
A pool selection strategy is presented to build a solid ensemble classifier.
Uncertainty is then used to detect anomalies.
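As a generic illustration of the evidence-theoretic ingredient (the paper's exact pooling and detection rules are not reproduced here), the sketch below combines two classifiers' basic probability assignments over {normal, anomaly} with Dempster's rule; the residual mass on the full frame acts as an explicit uncertainty score.

```python
from itertools import product

# Basic probability assignments over the frame: {"n"} = normal, {"a"} = anomaly,
# {"n", "a"} = "don't know" (explicit uncertainty).
m1 = {frozenset("n"): 0.6, frozenset("a"): 0.1, frozenset("na"): 0.3}
m2 = {frozenset("n"): 0.2, frozenset("a"): 0.5, frozenset("na"): 0.3}

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses of intersecting focal sets
    and renormalise by 1 - K, where K is the conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

m12 = dempster_combine(m1, m2)
# The mass left on the whole frame {"n", "a"} serves as an uncertainty score.
print(m12)
```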
arXiv Detail & Related papers (2022-12-23T00:50:41Z)
- Parametric Classification for Generalized Category Discovery: A Baseline Study [70.73212959385387]
Generalized Category Discovery (GCD) aims to discover novel categories in unlabelled datasets using knowledge learned from labelled samples.
We investigate the failure of parametric classifiers, verify the effectiveness of previous design choices when high-quality supervision is available, and identify unreliable pseudo-labels as a key problem.
We propose a simple yet effective parametric classification method that benefits from entropy regularisation, achieves state-of-the-art performance on multiple GCD benchmarks and shows strong robustness to unknown class numbers.
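One common form of entropy regularisation penalises the negative entropy of the batch-averaged prediction so that the classifier does not collapse onto a few categories; the NumPy sketch below shows only this generic term, and whether it matches the paper's exact formulation is an assumption.

```python
import numpy as np

def softmax(logits, axis=-1):
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def mean_entropy_regulariser(logits, eps=1e-12):
    """Negative entropy of the batch-averaged prediction. Adding this term to
    the loss pushes the average prediction towards the uniform distribution,
    discouraging collapse onto a few classes."""
    p_mean = softmax(logits).mean(axis=0)                 # average class distribution
    return float(np.sum(p_mean * np.log(p_mean + eps)))   # = -H(p_mean)

# Toy usage: logits for a batch of 8 samples over 5 (known + novel) classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 5))
loss = 0.0  # ...supervised / self-distillation terms would go here...
loss += 0.5 * mean_entropy_regulariser(logits)            # weighted penalty
```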
arXiv Detail & Related papers (2022-11-21T18:47:11Z)
- An Evolutionary Approach for Creating of Diverse Classifier Ensembles [11.540822622379176]
We propose a framework for classifier selection and fusion based on a four-step protocol called CIF-E.
We implement and evaluate 24 varied ensemble approaches following the proposed CIF-E protocol.
Experiments show that the proposed evolutionary approach can outperform the state-of-the-art literature approaches in many well-known UCI datasets.
arXiv Detail & Related papers (2022-08-23T14:23:27Z)
- A Comprehensive Survey on Deep Clustering: Taxonomy, Challenges, and Future Directions [48.97008907275482]
Clustering is a fundamental machine learning task which has been widely studied in the literature.
Deep Clustering, i.e., jointly optimizing representation learning and clustering, has been proposed and has attracted growing attention in the community.
We summarize the essential components of deep clustering and categorize existing methods by the ways they design interactions between deep representation learning and clustering.
arXiv Detail & Related papers (2022-06-15T15:05:13Z)
- A Three-phase Augmented Classifiers Chain Approach Based on Co-occurrence Analysis for Multi-Label Classification [0.0]
Existing classifier chain methods struggle to model and exploit the underlying dependencies in the label space.
We present a three-phase augmented classifier chain approach based on co-occurrence analysis for multi-label classification.
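The co-occurrence analysis can be grounded in a simple statistic: for a binary label matrix Y, the product Y^T Y counts how often each pair of labels appears together. The snippet below shows only this generic building block, not the paper's three-phase procedure.

```python
import numpy as np

# Y[i, j] = 1 if label j is assigned to sample i.
Y = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 0]])

co = Y.T @ Y                       # co[j, k] = number of samples with both labels
freq = np.diag(co).astype(float)   # per-label frequencies on the diagonal
# Conditional co-occurrence P(label k | label j), a typical input for deciding
# which label dependencies a chain should model.
cond = co / np.maximum(freq[:, None], 1.0)
print(cond.round(2))
```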
arXiv Detail & Related papers (2022-04-13T02:10:14Z)
- Self-Training: A Survey [5.772546394254112]
Semi-supervised algorithms aim to learn prediction functions from a small set of labeled observations and a large set of unlabeled observations.
Among the existing techniques, self-training methods have undoubtedly attracted greater attention in recent years.
We present self-training methods for binary and multi-class classification, as well as their variants and two related approaches.
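For intuition, a minimal self-training loop looks as follows: fit a base classifier on the labelled pool, promote its most confident predictions on unlabelled data to pseudo-labels, and refit. The threshold, round limit, and base learner below are arbitrary illustrative choices rather than any specific method covered by the survey.

```python
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, threshold=0.95, max_rounds=10,
               base=LogisticRegression(max_iter=1000)):
    """Generic self-training: iteratively promote confident predictions on
    unlabelled data to pseudo-labels and refit the base classifier."""
    X_l, y_l, X_u = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    clf = clone(base).fit(X_l, y_l)
    for _ in range(max_rounds):
        if len(X_u) == 0:
            break
        proba = clf.predict_proba(X_u)
        confident = proba.max(axis=1) >= threshold
        if not confident.any():
            break
        pseudo = clf.classes_[proba[confident].argmax(axis=1)]
        X_l = np.vstack([X_l, X_u[confident]])
        y_l = np.concatenate([y_l, pseudo])
        X_u = X_u[~confident]
        clf = clone(base).fit(X_l, y_l)
    return clf
```

scikit-learn provides a ready-made wrapper around the same idea in sklearn.semi_supervised.SelfTrainingClassifier.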
arXiv Detail & Related papers (2022-02-24T11:40:44Z)
- Binary Classification from Multiple Unlabeled Datasets via Surrogate Set Classification [94.55805516167369]
We propose a new approach for binary classification from $m$ U-sets for $m \ge 2$.
Our key idea is to consider an auxiliary classification task called surrogate set classification (SSC).
arXiv Detail & Related papers (2021-02-01T07:36:38Z)
- One-Class Classification: A Survey [96.17410674315816]
One-Class Classification (OCC) is a special case of multi-class classification, where data observed during training is from a single positive class.
We provide a survey of classical statistical and recent deep learning-based OCC methods for visual recognition.
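For context, the classical OCC setting can be reproduced in a few lines with a standard one-class model; the sketch below uses scikit-learn's OneClassSVM as an illustrative baseline (the deep methods surveyed in the paper are not shown).

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_pos = rng.normal(loc=0.0, scale=1.0, size=(200, 8))    # only the positive class is seen
X_test = np.vstack([rng.normal(0.0, 1.0, size=(10, 8)),  # in-distribution
                    rng.normal(6.0, 1.0, size=(10, 8))])  # outliers

# nu upper-bounds the fraction of training points treated as outliers.
occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_pos)
print(occ.predict(X_test))   # +1 = positive class, -1 = outlier
```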
arXiv Detail & Related papers (2021-01-08T15:30:29Z)
- Neural Networks as Functional Classifiers [0.0]
We extend notable deep learning methodologies to the domain of functional data to address classification problems.
We highlight the effectiveness of our method in a number of classification applications such as classification of spectrographic data.
arXiv Detail & Related papers (2020-10-09T00:11:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.