Active Hybrid Classification
- URL: http://arxiv.org/abs/2101.08854v1
- Date: Thu, 21 Jan 2021 21:09:07 GMT
- Title: Active Hybrid Classification
- Authors: Evgeny Krivosheev, Fabio Casati, Alessandro Bozzon
- Abstract summary: This paper shows how crowd and machines can support each other in tackling classification problems.
We propose an architecture that orchestrates active learning and crowd classification and combines them in a virtuous cycle.
- Score: 79.02441914023811
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Hybrid crowd-machine classifiers can achieve superior performance by
combining the cost-effectiveness of automatic classification with the accuracy
of human judgment. This paper shows how crowd and machines can support each
other in tackling classification problems. Specifically, we propose an
architecture that orchestrates active learning and crowd classification and
combines them in a virtuous cycle. We show that when the pool of items to
classify is finite, we face a learning vs. exploitation trade-off in hybrid
classification, as we need to balance crowd tasks optimized for creating a
training dataset with tasks optimized for classifying items in the pool. We
define the problem, propose a set of heuristics and evaluate the approach on
three real-world datasets with different characteristics in terms of machine
and crowd classification performance, showing that our active hybrid approach
significantly outperforms baselines.
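The cycle described in the abstract can be illustrated with a small simulation. This is a hedged sketch, not the paper's actual algorithm: the crowd oracle, the toy threshold classifier, and the fixed budget split (`BUDGET_PER_ROUND`, `LEARN_SHARE`) are all illustrative assumptions standing in for the heuristics the paper evaluates.

```python
import random

# Illustrative sketch of an active hybrid cycle: each round, a crowd budget
# is split between tasks that grow the training set (active learning) and
# tasks that directly classify pool items. All names and numbers here are
# assumptions for illustration, not taken from the paper.

random.seed(0)

# Finite pool: items are 1-D features; the true label is feature > 0.5.
pool = [(random.random(),) for _ in range(200)]
true_label = lambda x: int(x[0] > 0.5)

def ask_crowd(item):
    """Simulated crowd vote: correct 90% of the time."""
    lbl = true_label(item)
    return lbl if random.random() < 0.9 else 1 - lbl

class ThresholdClassifier:
    """Toy machine classifier: learns a decision threshold from labels."""
    def __init__(self):
        self.t = 0.5
    def train(self, data):
        pos = [x[0] for x, y in data if y == 1]
        neg = [x[0] for x, y in data if y == 0]
        if pos and neg:
            self.t = (min(pos) + max(neg)) / 2
    def predict_proba(self, item):
        # Confidence grows with distance from the learned threshold.
        return max(0.0, min(1.0, 0.5 + (item[0] - self.t)))

clf = ThresholdClassifier()
training, decided = [], {}
BUDGET_PER_ROUND, LEARN_SHARE = 20, 0.5  # heuristic split (assumption)

for _ in range(5):
    undecided = [i for i in range(len(pool)) if i not in decided]
    # Rank undecided items by machine uncertainty (closest to 0.5).
    undecided.sort(key=lambda i: abs(clf.predict_proba(pool[i]) - 0.5))
    n_learn = int(BUDGET_PER_ROUND * LEARN_SHARE)
    # Learning tasks: crowd-label the most uncertain items to improve the model.
    for i in undecided[:n_learn]:
        y = ask_crowd(pool[i])
        training.append((pool[i], y))
        decided[i] = y
    clf.train(training)
    # Classification tasks: crowd-classify the next most uncertain items.
    for i in undecided[n_learn:BUDGET_PER_ROUND]:
        decided[i] = ask_crowd(pool[i])
    # The machine classifies the remainder it is confident about.
    for i in undecided[BUDGET_PER_ROUND:]:
        p = clf.predict_proba(pool[i])
        if p > 0.9 or p < 0.1:
            decided[i] = int(p > 0.5)

accuracy = sum(decided[i] == true_label(pool[i]) for i in decided) / len(decided)
print(f"classified {len(decided)}/{len(pool)} items, accuracy {accuracy:.2f}")
```

The key design point the abstract raises is visible in `LEARN_SHARE`: labels spent on training improve the machine classifier for future rounds, while labels spent on direct classification exploit the crowd's accuracy now; with a finite pool, the right balance depends on how much pool remains.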
Related papers
- Class-Aware Contrastive Optimization for Imbalanced Text Classification [19.537124894139833]
We show that leveraging class-aware contrastive optimization combined with denoising autoencoders can successfully tackle imbalanced text classification tasks.
Our proposal demonstrates a notable increase in performance across a wide variety of text datasets.
arXiv Detail & Related papers (2024-10-29T16:34:08Z) - Class-attribute Priors: Adapting Optimization to Heterogeneity and Fairness Objective [54.33066660817495]
Modern classification problems exhibit heterogeneities across individual classes.
We propose CAP: An effective and general method that generates a class-specific learning strategy.
We show that CAP is competitive with prior art and its flexibility unlocks clear benefits for fairness objectives beyond balanced accuracy.
arXiv Detail & Related papers (2024-01-25T17:43:39Z) - Adversarial AutoMixup [50.1874436169571]
We propose AdAutomixup, an adversarial automatic mixup augmentation approach.
It generates challenging samples to train a robust classifier for image classification.
Our approach outperforms the state of the art in various classification scenarios.
arXiv Detail & Related papers (2023-12-19T08:55:00Z) - Anomaly Detection using Ensemble Classification and Evidence Theory [62.997667081978825]
We present a novel approach for anomaly detection using ensemble classification and evidence theory.
A pool selection strategy is presented to build a solid ensemble classifier.
Uncertainty in the ensemble's predictions is used to detect anomalies.
arXiv Detail & Related papers (2022-12-23T00:50:41Z) - Fair and Optimal Classification via Post-Processing [10.163721748735801]
This paper provides a complete characterization of the inherent tradeoff of demographic parity on classification problems.
We show that the minimum error rate achievable by randomized and attribute-aware fair classifiers is given by the optimal value of a Wasserstein-barycenter problem.
arXiv Detail & Related papers (2022-11-03T00:04:04Z) - Association Graph Learning for Multi-Task Classification with Category Shifts [68.58829338426712]
We focus on multi-task classification, where related classification tasks share the same label space and are learned simultaneously.
We learn an association graph to transfer knowledge among tasks for missing classes.
Our method consistently performs better than representative baselines.
arXiv Detail & Related papers (2022-10-10T12:37:41Z) - Noise-Resilient Ensemble Learning using Evidence Accumulation Clustering [1.7188280334580195]
Ensemble learning methods combine multiple algorithms performing the same task to build a group with superior quality.
These systems are well suited to the distributed setting, where each peer or machine in the network hosts one algorithm and communicates its results to its peers.
However, the network can be corrupted, altering the prediction accuracy of a peer, which has a deleterious effect on the ensemble quality.
We propose a noise-resilient ensemble classification method, which helps to improve accuracy and correct random errors.
arXiv Detail & Related papers (2021-10-18T11:52:45Z) - CAC: A Clustering Based Framework for Classification [20.372627144885158]
We design a simple, efficient, and generic framework called Classification Aware Clustering (CAC).
Our experiments on synthetic and real benchmark datasets demonstrate the efficacy of CAC over previous methods for combined clustering and classification.
arXiv Detail & Related papers (2021-02-23T18:59:39Z) - Network Classifiers Based on Social Learning [71.86764107527812]
We propose a new way of combining independently trained classifiers over space and time.
The proposed architecture is able to improve prediction performance over time with unlabeled data.
We show that this strategy results in consistent learning with high probability, and it yields a robust structure against poorly trained classifiers.
arXiv Detail & Related papers (2020-10-23T11:18:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.