List Online Classification
- URL: http://arxiv.org/abs/2303.15383v3
- Date: Thu, 18 May 2023 12:12:40 GMT
- Title: List Online Classification
- Authors: Shay Moran, Ohad Sharon, Iska Tsubari, Sivan Yosebashvili
- Abstract summary: We study multiclass online prediction where the learner can predict using a list of multiple labels.
We characterize learnability in this model using the $b$-ary Littlestone dimension.
As part of our work, we adapt classical algorithms such as Littlestone's SOA and Rosenblatt's Perceptron to predict using lists of labels.
- Score: 13.358536176734999
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study multiclass online prediction where the learner can predict using a
list of multiple labels (as opposed to just one label in the traditional
setting). We characterize learnability in this model using the $b$-ary
Littlestone dimension. This dimension is a variation of the classical
Littlestone dimension with the difference that binary mistake trees are
replaced with $(k+1)$-ary mistake trees, where $k$ is the number of labels in
the list. In the agnostic setting, we explore different scenarios depending on
whether the comparator class consists of single-labeled or multi-labeled
functions and its tradeoff with the size of the lists the algorithm uses. We
find that it is possible to achieve negative regret in some cases and provide a
complete characterization of when this is possible. As part of our work, we
adapt classical algorithms such as Littlestone's SOA and Rosenblatt's
Perceptron to predict using lists of labels. We also establish combinatorial
results for list-learnable classes, including a list online version of the
Sauer-Shelah-Perles Lemma. We state our results within the framework of pattern
classes -- a generalization of hypothesis classes which can represent adaptive
hypotheses (i.e. functions with memory), and model data-dependent assumptions
such as linear classification with margin.
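The recursive structure behind the $b$-ary Littlestone dimension can be illustrated for finite classes: the dimension is at least $d+1$ exactly when some point $x$ and $b = k+1$ distinct labels each leave a consistent sub-class of dimension at least $d$. The brute-force sketch below (an illustrative computation for finite classes over a finite domain, not the paper's code; the function name `b_ary_ldim` is hypothetical) makes this recursion concrete.

```python
from itertools import combinations

def b_ary_ldim(H, X, labels, b):
    """Brute-force b-ary Littlestone dimension of a finite class H
    (hypotheses as dicts point -> label) over a finite domain X.
    Exponential time; for illustration only."""
    if not H:
        return -1  # empty class, by convention
    best = 0  # any nonempty class shatters the trivial depth-0 tree
    for x in X:
        for ys in combinations(labels, b):  # b distinct branch labels
            subs = [[h for h in H if h[x] == y] for y in ys]
            if any(not s for s in subs):
                continue  # some branch has no consistent hypothesis
            # depth grows by 1 on top of the shallowest branch sub-tree
            best = max(best, 1 + min(b_ary_ldim(s, X, labels, b) for s in subs))
    return best

# All functions from a 2-point domain to 3 labels shatter a ternary
# mistake tree of depth 2, so their 3-ary dimension is 2:
H_all = [{0: a, 1: c} for a in range(3) for c in range(3)]
print(b_ary_ldim(H_all, [0, 1], [0, 1, 2], 3))  # → 2
```

With $b = 2$ this reduces to the classical Littlestone dimension; fixing the value on one of the two points drops the 3-ary dimension from 2 to 1, since only the remaining point can be split three ways.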
Related papers
- A Characterization of List Regression [5.371337604556312]
We provide a complete characterization of list regression.
We propose two dimensions, namely the $k$-OIG dimension and the $k$-fat-shattering dimension, and show that they optimally characterize realizable and agnostic $k$-list regression, respectively.
arXiv Detail & Related papers (2024-09-28T03:19:37Z) - Multi-Label Knowledge Distillation [86.03990467785312]
We propose a novel multi-label knowledge distillation method.
On one hand, it exploits the informative semantic knowledge from the logits by dividing the multi-label learning problem into a set of binary classification problems.
On the other hand, it enhances the distinctiveness of the learned feature representations by leveraging the structural information of label-wise embeddings.
arXiv Detail & Related papers (2023-08-12T03:19:08Z) - Self-Directed Linear Classification [50.659479930171585]
In online classification, a learner is presented with a sequence of examples and aims to predict their labels in an online fashion so as to minimize the total number of mistakes.
Here we study the power of choosing the prediction order and establish the first strong separation between worst-order and random-order learning.
arXiv Detail & Related papers (2023-08-06T15:38:44Z) - Combining Metric Learning and Attention Heads For Accurate and Efficient
Multilabel Image Classification [0.0]
We revisit two popular approaches to multilabel classification: transformer-based heads and graph-based branches that process label-relation information.
Although transformer-based heads are considered to achieve better results than graph-based branches, we argue that, with a proper training strategy, graph-based methods suffer only a small accuracy drop.
arXiv Detail & Related papers (2022-09-14T12:06:47Z) - Learning with Proper Partial Labels [87.65718705642819]
Partial-label learning is a kind of weakly-supervised learning with inexact labels.
We show that this proper partial-label learning framework includes many previous partial-label learning settings.
We then derive a unified unbiased estimator of the classification risk.
arXiv Detail & Related papers (2021-12-23T01:37:03Z) - Multi-label Classification with Partial Annotations using Class-aware
Selective Loss [14.3159150577502]
Large-scale multi-label classification datasets are commonly partially annotated.
We analyze the partial labeling problem, then propose a solution based on two key ideas.
With our novel approach, we achieve state-of-the-art results on the OpenImages dataset.
arXiv Detail & Related papers (2021-10-21T08:10:55Z) - MSE Loss with Outlying Label for Imbalanced Classification [10.305130700118399]
We propose a mean squared error (MSE) loss with an outlying label for class-imbalanced classification.
MSE loss makes it possible to equalize the number of backpropagation updates across all classes and, as in metric learning, to learn a feature space that reflects the relationships between classes.
It also makes it possible to shape the feature space so that high-difficulty and low-difficulty classes are separated.
arXiv Detail & Related papers (2021-07-06T05:17:00Z) - Theoretical Insights Into Multiclass Classification: A High-dimensional
Asymptotic View [82.80085730891126]
We provide the first modern, precise analysis of linear multiclass classification.
Our analysis reveals that the classification accuracy is highly distribution-dependent.
The insights gained may pave the way for a precise understanding of other classification algorithms.
arXiv Detail & Related papers (2020-11-16T05:17:29Z) - Probabilistic Label Trees for Extreme Multi-label Classification [8.347190888362194]
Problems of extreme multi-label classification (XMLC) are efficiently handled by organizing labels as a tree.
PLTs can be treated as a generalization of hierarchical softmax for multi-label problems.
We introduce the model and discuss training and inference procedures and their computational costs.
We prove a specific equivalence between the fully online algorithm and an algorithm with a tree structure given in advance.
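As a rough illustration of the model (not the paper's implementation; the node structure and probability values below are hypothetical), a PLT stores at each tree node the conditional probability that the node is relevant given that its parent is, and a label's marginal probability is the product of these conditionals along its root-to-leaf path:

```python
class Node:
    """A node in a toy probabilistic label tree."""
    def __init__(self, eta, children=None, label=None):
        self.eta = eta              # P(node relevant | parent relevant, x)
        self.children = children or []
        self.label = label          # set on leaves only

def label_probabilities(root):
    """Traverse the tree, multiplying conditional probabilities along
    each root-to-leaf path to obtain marginal label probabilities."""
    probs = {}
    stack = [(root, root.eta)]
    while stack:
        node, p = stack.pop()
        if node.label is not None:
            probs[node.label] = p
        for child in node.children:
            stack.append((child, p * child.eta))
    return probs

# Toy tree: root relevance 0.9, two leaf labels with conditionals 0.5 and 0.6.
root = Node(0.9, [Node(0.5, label="a"), Node(0.6, label="b")])
print(label_probabilities(root))  # {'a': 0.45, 'b': 0.54} up to traversal order
```

Inference cost then depends on the tree depth rather than the total number of labels, which is what makes this tractable at extreme scale.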
arXiv Detail & Related papers (2020-09-23T15:30:00Z) - Unsupervised Person Re-identification via Multi-label Classification [55.65870468861157]
This paper formulates unsupervised person ReID as a multi-label classification task to progressively seek true labels.
Our method starts by assigning each person image with a single-class label, then evolves to multi-label classification by leveraging the updated ReID model for label prediction.
To boost the ReID model training efficiency in multi-label classification, we propose the memory-based multi-label classification loss (MMCL).
arXiv Detail & Related papers (2020-04-20T12:13:43Z) - Multi-Class Classification from Noisy-Similarity-Labeled Data [98.13491369929798]
We propose a method for learning from only noisy-similarity-labeled data.
We use a noise transition matrix to bridge the class-posterior probability between clean and noisy data.
We build a novel learning system which can assign noise-free class labels for instances.
arXiv Detail & Related papers (2020-02-16T05:10:21Z)
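The paper above learns from noisy similarity labels; the class-level "backward correction" below is only a generic sketch of how a noise transition matrix bridges clean and noisy class posteriors (the matrix and numbers are hypothetical, and the function name `clean_posterior` is an assumption):

```python
import numpy as np

def clean_posterior(noisy_posterior, T):
    """Recover the clean class posterior from the noisy one by inverting
    the transition relation p_noisy = T^T p_clean (backward correction)."""
    p = np.linalg.solve(T.T, noisy_posterior)
    p = np.clip(p, 0.0, None)        # numerical guard: keep entries >= 0
    return p / p.sum()               # renormalize to a distribution

# Hypothetical 2-class transition matrix: T[i, j] = P(noisy j | clean i).
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])
noisy = T.T @ np.array([0.7, 0.3])   # noisy posterior induced by clean [0.7, 0.3]
print(clean_posterior(noisy, T))     # recovers [0.7, 0.3]
```

The correction is exact when T is known and invertible; in practice T must itself be estimated from the noisy data.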
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.