Unsupervised Few-shot Learning via Deep Laplacian Eigenmaps
- URL: http://arxiv.org/abs/2210.03595v1
- Date: Fri, 7 Oct 2022 14:53:03 GMT
- Title: Unsupervised Few-shot Learning via Deep Laplacian Eigenmaps
- Authors: Kuilin Chen, Chi-Guhn Lee
- Abstract summary: We present an unsupervised few-shot learning method via deep Laplacian eigenmaps.
Our method learns representation from unlabeled data by grouping similar samples together.
We analytically show how deep Laplacian eigenmaps avoid collapsed representation in unsupervised learning.
- Score: 13.6555672824229
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning a new task from a handful of examples remains an open challenge in
machine learning. Despite the recent progress in few-shot learning, most
methods rely on supervised pretraining or meta-learning on labeled
meta-training data and cannot be applied to the case where the pretraining data
is unlabeled. In this study, we present an unsupervised few-shot learning
method via deep Laplacian eigenmaps. Our method learns representation from
unlabeled data by grouping similar samples together and can be intuitively
interpreted by random walks on augmented training data. We analytically show
how deep Laplacian eigenmaps avoid collapsed representation in unsupervised
learning without explicit comparison between positive and negative samples. The
proposed method significantly closes the performance gap between supervised and
unsupervised few-shot learning. Our method also achieves comparable performance
to current state-of-the-art self-supervised learning methods under linear
evaluation protocol.
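For context, the classical (shallow) Laplacian eigenmaps embedding that the paper's deep variant generalizes can be sketched in a few lines; the function and parameter names below are illustrative, not taken from the paper. The trivial bottom eigenvector of the graph Laplacian carries no discriminative information and corresponds to the collapsed representation the paper's analysis rules out, so it is dropped here as well.
```python
# Classical Laplacian eigenmaps (Belkin & Niyogi); a shallow reference point
# for the deep variant described above. Names are illustrative only.
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph

def laplacian_eigenmaps(X, n_components=2, n_neighbors=10):
    # Symmetrized k-nearest-neighbour affinity graph over the unlabeled samples.
    W = kneighbors_graph(X, n_neighbors=n_neighbors, mode="connectivity")
    W = 0.5 * np.asarray((W + W.T).todense())
    # Normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}.
    L = laplacian(W, normed=True)
    # Embed with the eigenvectors of the smallest eigenvalues. The trivial
    # bottom eigenvector is the collapsed solution (no information about the
    # samples), so the embedding starts from the second eigenvector.
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, 1:n_components + 1]

X = np.random.rand(200, 16)                 # stand-in for unlabeled data
Z = laplacian_eigenmaps(X, n_components=2)  # shape (200, 2)
```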
Related papers
- Variational Self-Supervised Contrastive Learning Using Beta Divergence [0.0]
We present a contrastive self-supervised learning method that is robust to data noise and grounded in variational methods.
We demonstrate the effectiveness of the proposed method through rigorous experiments including linear evaluation and fine-tuning scenarios with multi-label datasets in the face understanding domain.
arXiv Detail & Related papers (2023-09-05T17:21:38Z)
- Adaptive Negative Evidential Deep Learning for Open-set Semi-supervised Learning [69.81438976273866]
Open-set semi-supervised learning (Open-set SSL) considers a more practical scenario, where unlabeled data and test data contain new categories (outliers) not observed in the labeled data (inliers).
We introduce evidential deep learning (EDL) as an outlier detector to quantify different types of uncertainty, and design different uncertainty metrics for self-training and inference.
We propose a novel adaptive negative optimization strategy, making EDL more tailored to the unlabeled dataset containing both inliers and outliers.
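As background, the Dirichlet-based uncertainty that standard EDL derives from network outputs can be sketched as follows; this is the generic textbook form, not the paper's adaptive negative optimization.
```python
# Generic evidential deep learning (EDL) uncertainty from classifier logits;
# a textbook sketch, not this paper's adaptive variant.
import torch
import torch.nn.functional as F

def edl_uncertainty(logits):
    evidence = F.relu(logits)        # non-negative evidence per class
    alpha = evidence + 1.0           # Dirichlet concentration parameters
    strength = alpha.sum(dim=-1)     # total evidence (Dirichlet strength)
    k = logits.shape[-1]             # number of classes
    return k / strength              # high when evidence is scarce -> likely outlier

logits = torch.randn(4, 10)
print(edl_uncertainty(logits))       # per-sample uncertainty in (0, 1]
```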
arXiv Detail & Related papers (2023-03-21T09:07:15Z)
- An Embarrassingly Simple Approach to Semi-Supervised Few-Shot Learning [58.59343434538218]
We propose a simple but quite effective approach to predict accurate negative pseudo-labels of unlabeled data from an indirect learning perspective.
Our approach can be implemented in just a few lines of code using only off-the-shelf operations.
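A hypothetical few-line sketch of negative pseudo-labelling in this spirit (the function name and top-k choice are ours, not the paper's exact procedure):
```python
# Illustrative negative pseudo-labelling: classes to which a model assigns the
# lowest probability are unlikely to be correct, so they can serve as negatives.
import torch

def negative_pseudo_labels(logits, k=3):
    probs = logits.softmax(dim=-1)
    # Indices of the k least likely classes for each unlabeled sample.
    return probs.topk(k, dim=-1, largest=False).indices

logits = torch.randn(4, 10)              # 10-way classifier, unlabeled batch
print(negative_pseudo_labels(logits))    # shape (4, 3)
```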
arXiv Detail & Related papers (2022-09-28T02:11:34Z)
- Boundary-aware Information Maximization for Self-supervised Medical Image Segmentation [13.828282295918628]
We propose a novel unsupervised pre-training framework that avoids the drawback of contrastive learning.
Experimental results on two benchmark medical segmentation datasets reveal our method's effectiveness when few annotated images are available.
arXiv Detail & Related papers (2022-02-04T20:18:00Z)
- STDP enhances learning by backpropagation in a spiking neural network [0.0]
The proposed method improves the accuracy without additional labeling when a small amount of labeled data is used.
The proposed learning method can also be implemented on event-driven systems.
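For background, the textbook pair-based STDP rule that such methods build on looks as follows; the paper's exact variant may differ.
```python
# Textbook pair-based STDP weight update; pre-before-post spike timing
# potentiates a synapse, post-before-pre depresses it.
import math

def stdp_delta_w(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike pair, dt = t_post - t_pre (ms)."""
    if dt > 0:   # pre fires before post: potentiation (LTP)
        return a_plus * math.exp(-dt / tau)
    else:        # post fires before pre: depression (LTD)
        return -a_minus * math.exp(dt / tau)

print(stdp_delta_w(5.0))    # small positive update
print(stdp_delta_w(-5.0))   # small negative update
```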
arXiv Detail & Related papers (2021-02-21T06:55:02Z)
- Unsupervised Noisy Tracklet Person Re-identification [100.85530419892333]
We present a novel selective tracklet learning (STL) approach that can train discriminative person re-id models from unlabelled tracklet data.
This avoids the tedious and costly process of exhaustively labelling person image/tracklet true matching pairs across camera views.
Our method is notably more robust against arbitrarily noisy raw tracklet data and is therefore scalable to learning discriminative models from unconstrained tracking data.
arXiv Detail & Related papers (2021-01-16T07:31:00Z)
- Take More Positives: An Empirical Study of Contrastive Learning in Unsupervised Person Re-Identification [43.11532800327356]
Unsupervised person re-ID aims at closing the performance gap to supervised methods.
We show that their success stems not only from their label generation mechanisms but also from previously unexplored design choices.
We propose a contrastive learning method without a memory bank for unsupervised person re-ID.
arXiv Detail & Related papers (2021-01-12T08:06:11Z)
- A Sober Look at the Unsupervised Learning of Disentangled Representations and their Evaluation [63.042651834453544]
We show that the unsupervised learning of disentangled representations is impossible without inductive biases on both the models and the data.
We observe that while the different methods successfully enforce properties "encouraged" by the corresponding losses, well-disentangled models seemingly cannot be identified without supervision.
Our results suggest that future work on disentanglement learning should be explicit about the role of inductive biases and (implicit) supervision.
arXiv Detail & Related papers (2020-10-27T10:17:15Z)
- Contrastive Learning with Hard Negative Samples [80.12117639845678]
We develop a new family of unsupervised sampling methods for selecting hard negative samples.
A limiting case of this sampling results in a representation that tightly clusters each class, and pushes different classes as far apart as possible.
The proposed method improves downstream performance across multiple modalities, requires only a few additional lines of code to implement, and introduces no computational overhead.
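One plausible few-line realization reweights negatives inside an InfoNCE-style loss so that harder (more similar) negatives contribute more; this is a sketch under our own naming, not the paper's exact estimator.
```python
# Sketch of importance-weighted hard negatives in an InfoNCE-style loss.
# Negatives more similar to the anchor ("hard" negatives) get larger weight.
import torch

def hard_negative_nce(anchor, positive, negatives, tau=0.5, beta=1.0):
    pos = torch.exp(anchor @ positive / tau)       # positive pair score
    neg = torch.exp(anchor @ negatives.T / tau)    # (num_negatives,) scores
    w = neg ** beta                                # emphasize hard negatives
    w = w / w.mean()                               # normalize the weights
    return -torch.log(pos / (pos + (w * neg).sum()))

anchor = torch.randn(128); anchor /= anchor.norm()
positive = torch.randn(128); positive /= positive.norm()
negatives = torch.randn(32, 128)
negatives = negatives / negatives.norm(dim=1, keepdim=True)
loss = hard_negative_nce(anchor, positive, negatives)
```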
arXiv Detail & Related papers (2020-10-09T14:18:53Z)
- Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation [54.49894381464853]
We propose to leverage both labeled and unlabeled data for instance segmentation with improved accuracy by knowledge distillation.
We propose a novel Mask-guided Mean Teacher framework with Perturbation-sensitive Sample Mining.
Experiments show that the proposed method significantly improves performance compared with a supervised method trained on labeled data only.
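The generic mean-teacher step underlying such frameworks is an exponential moving average of the student's weights; the mask guidance and perturbation-sensitive mining are the paper's additions and are not shown here.
```python
# Generic mean-teacher EMA update over two torch.nn.Module instances;
# the paper's mask guidance and sample mining sit on top of this step.
import torch

@torch.no_grad()
def ema_update(teacher, student, decay=0.999):
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(decay).add_(s, alpha=1.0 - decay)
```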
arXiv Detail & Related papers (2020-07-21T13:27:09Z)
- Self-Supervised Prototypical Transfer Learning for Few-Shot Classification [11.96734018295146]
Our self-supervised transfer learning approach, ProtoTransfer, outperforms state-of-the-art unsupervised meta-learning methods on few-shot tasks.
In few-shot experiments with domain shift, our approach even has comparable performance to supervised methods, but requires orders of magnitude fewer labels.
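The prototypical classification step such approaches share can be sketched in generic ProtoNet style; ProtoTransfer's self-supervised pre-training objective is not shown.
```python
# Generic prototype-based few-shot classification (ProtoNet-style step).
import torch

def prototype_classify(support, support_labels, query, n_classes):
    # Class prototype = mean embedding of that class's support examples.
    protos = torch.stack([support[support_labels == c].mean(dim=0)
                          for c in range(n_classes)])
    # Assign each query embedding to its nearest prototype (Euclidean).
    dists = torch.cdist(query, protos)
    return dists.argmin(dim=-1)

support = torch.randn(25, 64)                 # 5-way, 5-shot embeddings
labels = torch.arange(5).repeat_interleave(5)
query = torch.randn(10, 64)
print(prototype_classify(support, labels, query, n_classes=5))
```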
arXiv Detail & Related papers (2020-06-19T19:00:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.