Distance Based Image Classification: A solution to generative
classification's conundrum?
- URL: http://arxiv.org/abs/2210.01349v1
- Date: Tue, 4 Oct 2022 03:35:13 GMT
- Title: Distance Based Image Classification: A solution to generative
classification's conundrum?
- Authors: Wen-Yan Lin, Siying Liu, Bing Tian Dai, Hongdong Li
- Abstract summary: We argue that discriminative boundaries are counter-intuitive as they define semantics by what-they-are-not.
We propose a new generative model in which semantic factors are accommodated by shell theory's hierarchical generative process.
We use the model to develop a classification scheme which suppresses the impact of noise while preserving semantic cues.
- Score: 70.43638559782597
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Most classifiers rely on discriminative boundaries that separate instances of
each class from everything else. We argue that discriminative boundaries are
counter-intuitive as they define semantics by what-they-are-not, and should be
replaced by generative classifiers which define semantics by what-they-are.
Unfortunately, generative classifiers are significantly less accurate. This may
be caused by the tendency of generative models to focus on easy-to-model
semantic generative factors and ignore non-semantic factors that are important
but difficult to model. We propose a new generative model in which semantic
factors are accommodated by shell theory's hierarchical generative process and
non-semantic factors by an instance specific noise term. We use the model to
develop a classification scheme which suppresses the impact of noise while
preserving semantic cues. The result is a surprisingly accurate generative
classifier that takes the form of a modified nearest-neighbor algorithm; we
term it distance classification. Unlike discriminative classifiers, a distance
classifier: defines semantics by what-they-are; is amenable to incremental
updates; and scales well with the number of classes.
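The nearest-centroid flavor of such a distance classifier can be sketched in a few lines. This is a minimal illustration only, not the paper's method: the actual scheme is derived from shell theory and includes an instance-specific noise-suppression step, both omitted here.

```python
import numpy as np

def fit_centroids(X, y):
    """Compute one centroid (mean feature vector) per class."""
    classes = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def distance_classify(X, classes, centroids):
    """Assign each sample to the class with the nearest centroid."""
    # Euclidean distance from every sample to every centroid,
    # via broadcasting: (n, 1, d) - (1, k, d) -> (n, k, d)
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Toy data: two well-separated Gaussian clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, size=(20, 2)),
               rng.normal(1.0, 0.1, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

classes, centroids = fit_centroids(X, y)
pred = distance_classify(X, classes, centroids)
```

Note that adding a new class only requires computing one more centroid, without retraining anything else, which matches the abstract's claims that distance classifiers are amenable to incremental updates and scale well with the number of classes.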
Related papers
- Definition generation for lexical semantic change detection [3.7297237438000788]
We use contextualized word definitions generated by large language models as semantic representations in the task of diachronic lexical semantic change detection (LSCD)
In short, generated definitions are used as 'senses', and the change score of a target word is retrieved by comparing their distributions in two time periods under comparison.
Our approach is on par with or outperforms prior non-supervised LSCD methods.
arXiv Detail & Related papers (2024-06-20T10:13:08Z) - Generative Multi-modal Models are Good Class-Incremental Learners [51.5648732517187]
We propose a novel generative multi-modal model (GMM) framework for class-incremental learning.
Our approach directly generates labels for images using an adapted generative model.
Under the few-shot CIL setting, our approach improves accuracy by at least 14% over all current state-of-the-art methods, with significantly less forgetting.
arXiv Detail & Related papers (2024-03-27T09:21:07Z) - CCPrefix: Counterfactual Contrastive Prefix-Tuning for Many-Class
Classification [57.62886091828512]
We propose a brand-new prefix-tuning method, Counterfactual Contrastive Prefix-tuning (CCPrefix) for many-class classification.
An instance-dependent soft prefix, derived from fact-counterfactual pairs in the label space, is leveraged to complement the language verbalizers in many-class classification.
arXiv Detail & Related papers (2022-11-11T03:45:59Z) - Deriving discriminative classifiers from generative models [6.939768185086753]
We show how a generative classifier induced from a generative model can also be computed in a discriminative way from the same model.
We illustrate the usefulness of this new discriminative way of computing classifiers in a Natural Language Processing (NLP) setting.
arXiv Detail & Related papers (2022-01-03T19:18:25Z) - A new class of generative classifiers based on staged tree models [2.66269503676104]
Generative models for classification use the joint probability distribution of the class variable and the features to construct a decision rule.
Here we introduce a new class of generative classifiers, called staged tree classifiers, which formally account for context-specific independence.
An applied analysis to predict the fate of the passengers of the Titanic highlights the insights that the new class of generative classifiers can give.
arXiv Detail & Related papers (2020-12-26T19:30:35Z) - Learning and Evaluating Representations for Deep One-class
Classification [59.095144932794646]
We present a two-stage framework for deep one-class classification.
We first learn self-supervised representations from one-class data, and then build one-class classifiers on learned representations.
In experiments, we demonstrate state-of-the-art performance on visual domain one-class classification benchmarks.
arXiv Detail & Related papers (2020-11-04T23:33:41Z) - Classify and Generate: Using Classification Latent Space Representations
for Image Generations [17.184760662429834]
We propose a discriminative modeling framework that employs manipulated supervised latent representations to reconstruct and generate new samples belonging to a given class.
ReGene has higher classification accuracy than existing conditional generative models while being competitive in terms of FID.
arXiv Detail & Related papers (2020-04-16T09:13:44Z) - Latent Embedding Feedback and Discriminative Features for Zero-Shot
Classification [139.44681304276]
Zero-shot learning aims to classify unseen categories for which no data is available during training.
Generative Adversarial Networks synthesize unseen class features by leveraging class-specific semantic embeddings.
We propose to enforce semantic consistency at all stages of zero-shot learning: training, feature synthesis and classification.
arXiv Detail & Related papers (2020-03-17T17:34:16Z) - Multi-Class Classification from Noisy-Similarity-Labeled Data [98.13491369929798]
We propose a method for learning from only noisy-similarity-labeled data.
We use a noise transition matrix to bridge the class-posterior probability between clean and noisy data.
We build a novel learning system which can assign noise-free class labels for instances.
arXiv Detail & Related papers (2020-02-16T05:10:21Z)
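The transition-matrix idea in the last entry can be illustrated with a toy computation. This is a hypothetical sketch of the general concept only, not the paper's learning system; the matrix and posteriors below are made-up numbers.

```python
import numpy as np

# Hypothetical 3-class noise transition matrix:
# T[i, j] = P(observed label j | true label i)
T = np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
])

# Clean class-posterior P(y | x) for one instance
p_clean = np.array([0.7, 0.2, 0.1])

# Noisy class-posterior implied by the transition matrix:
# P(noisy = j | x) = sum_i P(clean = i | x) * T[i, j]
p_noisy = p_clean @ T
```

Given an estimate of T, the mapping can also be inverted (e.g. by solving the linear system) to recover clean posteriors from models trained on noisy labels, which is the bridging role the transition matrix plays.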
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.