Retrieval-Augmented Classification with Decoupled Representation
- URL: http://arxiv.org/abs/2303.13065v2
- Date: Tue, 11 Apr 2023 09:13:30 GMT
- Title: Retrieval-Augmented Classification with Decoupled Representation
- Authors: Xinnian Liang, Shuangzhi Wu, Hui Huang, Jiaqi Bai, Chao Bian, Zhoujun Li
- Abstract summary: We propose a $k$-nearest-neighbor (KNN)-based method for retrieval-augmented classification.
We find that shared representation for classification and retrieval hurts performance and leads to training instability.
We evaluate our method on a wide range of classification datasets.
- Score: 31.662843145399044
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Retrieval-augmented methods have shown promising results in various
classification tasks. However, existing methods focus on retrieving extra
context to enrich the input, which is noise-sensitive and non-expandable. In
this paper, following this line, we propose a $k$-nearest-neighbor (KNN)-based
method for retrieval-augmented classification, which interpolates the predicted
label distribution with the label distributions of the retrieved instances.
Different from the standard KNN process, we propose a decoupling mechanism, as
we find that a shared representation for classification and retrieval hurts
performance and leads to training instability. We evaluate our method on a wide
range of classification datasets. Experimental results demonstrate the
effectiveness and robustness of our proposed method. We also conduct extra
experiments to analyze the contributions of different components in our model.
Code: https://github.com/xnliang98/knn-cls-w-decoupling
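The sketch below illustrates the two ideas stated in the abstract: decoupled representations (separate heads for classification and retrieval) and interpolation of the model's label distribution with a KNN label distribution, i.e. p = lam * p_knn + (1 - lam) * p_model. It is a minimal PyTorch illustration under assumptions, not the authors' released implementation; the class name `RetrievalAugmentedClassifier` and the hyper-parameters `k`, `lam`, and `temperature` are hypothetical.

```python
# Minimal sketch of KNN-interpolated classification with decoupled representations.
# Names (RetrievalAugmentedClassifier, lam, temperature, k) are illustrative
# assumptions, not taken from the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RetrievalAugmentedClassifier(nn.Module):
    """Encoder with two decoupled heads: one for classification, one for retrieval."""

    def __init__(self, encoder: nn.Module, hidden_dim: int, num_classes: int):
        super().__init__()
        self.encoder = encoder                                 # e.g. a text encoder
        self.cls_head = nn.Linear(hidden_dim, num_classes)     # classification head
        self.ret_head = nn.Linear(hidden_dim, hidden_dim)      # separate retrieval head

    def forward(self, x):
        h = self.encoder(x)
        return self.cls_head(h), self.ret_head(h)              # logits, retrieval vector


@torch.no_grad()
def knn_interpolated_probs(logits, query_reps, store_reps, store_labels,
                           num_classes, k=8, lam=0.5, temperature=10.0):
    """Interpolate the model's label distribution with the label distribution
    of the k nearest datastore entries, retrieved via the decoupled representation."""
    p_model = F.softmax(logits, dim=-1)                        # (B, C)
    dists = torch.cdist(query_reps, store_reps)                # (B, N) L2 distances
    knn_dists, knn_idx = dists.topk(k, dim=-1, largest=False)  # k nearest neighbours
    knn_labels = store_labels[knn_idx]                         # (B, k)
    # Closer neighbours receive larger weights via a softmax over negative distances.
    weights = F.softmax(-knn_dists / temperature, dim=-1)      # (B, k)
    one_hot = F.one_hot(knn_labels, num_classes).float()       # (B, k, C)
    p_knn = (weights.unsqueeze(-1) * one_hot).sum(dim=1)       # (B, C)
    return lam * p_knn + (1.0 - lam) * p_model                 # interpolated distribution
```

In this sketch the datastore would be built offline by encoding the training set with the trained model and storing each example's retrieval representation alongside its gold label.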
Related papers
- A Generic Method for Fine-grained Category Discovery in Natural Language Texts [38.297873969795546]
We introduce a method that successfully detects fine-grained clusters of semantically similar texts guided by a novel objective function.
The method uses semantic similarities in a logarithmic space to guide sample distributions in the Euclidean space.
We also propose a centroid inference mechanism to support real-time applications.
arXiv Detail & Related papers (2024-06-18T23:27:46Z) - Cluster-Aware Similarity Diffusion for Instance Retrieval [64.40171728912702]
Diffusion-based re-ranking is a common method used for retrieving instances by performing similarity propagation in a nearest neighbor graph.
We propose a novel Cluster-Aware Similarity (CAS) diffusion for instance retrieval.
arXiv Detail & Related papers (2024-06-04T14:19:50Z) - Convolutional autoencoder-based multimodal one-class classification [80.52334952912808]
One-class classification refers to approaches of learning using data from a single class only.
We propose a deep learning one-class classification method suitable for multimodal data.
arXiv Detail & Related papers (2023-09-25T12:31:18Z) - An Upper Bound for the Distribution Overlap Index and Its Applications [18.481370450591317]
This paper proposes an easy-to-compute upper bound for the overlap index between two probability distributions.
The proposed bound shows its value in one-class classification and domain shift analysis.
Our work shows significant promise toward broadening the applications of overlap-based metrics.
arXiv Detail & Related papers (2022-12-16T20:02:03Z) - Parametric Classification for Generalized Category Discovery: A Baseline Study [70.73212959385387]
Generalized Category Discovery (GCD) aims to discover novel categories in unlabelled datasets using knowledge learned from labelled samples.
We investigate the failure of parametric classifiers, verify the effectiveness of previous design choices when high-quality supervision is available, and identify unreliable pseudo-labels as a key problem.
We propose a simple yet effective parametric classification method that benefits from entropy regularisation, achieves state-of-the-art performance on multiple GCD benchmarks and shows strong robustness to unknown class numbers.
arXiv Detail & Related papers (2022-11-21T18:47:11Z) - Fine-Grained Visual Classification using Self Assessment Classifier [12.596520707449027]
Extracting discriminative features plays a crucial role in the fine-grained visual classification task.
In this paper, we introduce a Self Assessment Classifier, which simultaneously leverages the image representation and the top-k predicted classes.
We show that our method achieves new state-of-the-art results on CUB200-2011, Stanford Dog, and FGVC Aircraft datasets.
arXiv Detail & Related papers (2022-05-21T07:41:27Z) - Exploring Category-correlated Feature for Few-shot Image Classification [27.13708881431794]
We present a simple yet effective feature rectification method by exploring the category correlation between novel and base classes as the prior knowledge.
The proposed approach consistently obtains considerable performance gains on three widely used benchmarks.
arXiv Detail & Related papers (2021-12-14T08:25:24Z) - No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data [78.69828864672978]
A central challenge in training classification models in the real-world federated system is learning with non-IID data.
We propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model.
Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks including CIFAR-10, CIFAR-100, and CINIC-10.
arXiv Detail & Related papers (2021-06-09T12:02:29Z) - Open-Set Recognition with Gaussian Mixture Variational Autoencoders [91.3247063132127]
At inference time, open-set classification either assigns a sample to a known class seen during training or rejects it as an unknown class.
We train our model to cooperatively learn reconstruction and perform class-based clustering in the latent space.
Our model achieves more accurate and robust open-set classification results, with an average F1 improvement of 29.5%.
arXiv Detail & Related papers (2020-06-03T01:15:19Z) - Learning with Out-of-Distribution Data for Audio Classification [60.48251022280506]
We show that detecting and relabelling certain OOD instances, rather than discarding them, can have a positive effect on learning.
The proposed method is shown to improve the performance of convolutional neural networks by a significant margin.
arXiv Detail & Related papers (2020-02-11T21:08:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.