Combining Domain-Specific Meta-Learners in the Parameter Space for
Cross-Domain Few-Shot Classification
- URL: http://arxiv.org/abs/2011.00179v1
- Date: Sat, 31 Oct 2020 03:33:39 GMT
- Title: Combining Domain-Specific Meta-Learners in the Parameter Space for
Cross-Domain Few-Shot Classification
- Authors: Shuman Peng, Weilian Song, Martin Ester
- Abstract summary: We propose an optimization-based meta-learning method called Combining Domain-Specific Meta-Learners (CosML)
Our experiments show that CosML outperforms a range of state-of-the-art methods and achieves strong cross-domain generalization ability.
- Score: 6.945139522691311
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The goal of few-shot classification is to learn a model that can classify
novel classes using only a few training examples. Despite the promising results
shown by existing meta-learning algorithms in solving the few-shot
classification problem, there still remains an important challenge: how to
generalize to unseen domains while meta-learning on multiple seen domains? In
this paper, we propose an optimization-based meta-learning method, called
Combining Domain-Specific Meta-Learners (CosML), that addresses the
cross-domain few-shot classification problem. CosML first trains a set of
meta-learners, one for each training domain, to learn prior knowledge (i.e.,
meta-parameters) specific to each domain. The domain-specific meta-learners are
then combined in the \emph{parameter space}, by taking a weighted average of
their meta-parameters, which is used as the initialization parameters of a task
network that is quickly adapted to novel few-shot classification tasks in an
unseen domain. Our experiments show that CosML outperforms a range of
state-of-the-art methods and achieves strong cross-domain generalization
ability.
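Below is a minimal sketch, in Python/NumPy, of the parameter-space combination the abstract describes: averaging domain-specific meta-parameters and quickly adapting the result on a few-shot support set. The uniform weights, the least-squares "task network", and all names here are illustrative assumptions, not CosML's actual architecture or weighting scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def combine_meta_learners(meta_params_per_domain, weights):
    """Weighted average of domain-specific meta-parameters
    in the parameter space."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return sum(w * p for w, p in zip(weights, meta_params_per_domain))

def adapt(params, x_support, y_support, lr=0.1, steps=5):
    """A few gradient steps of least-squares regression on the
    support set; stands in for inner-loop adaptation of the
    task network to a novel few-shot task."""
    for _ in range(steps):
        residual = x_support @ params - y_support
        grad = 2.0 * x_support.T @ residual / len(y_support)
        params = params - lr * grad
    return params

# Three "training domains", each with its own meta-learned parameters.
dim = 8
meta_params = [rng.normal(size=dim) for _ in range(3)]

# Combine in the parameter space (uniform weights, for illustration).
init = combine_meta_learners(meta_params, weights=[1.0, 1.0, 1.0])

# Quickly adapt the combined initialization on a 5-shot support set
# drawn from an "unseen domain" (random data, for illustration).
x_support = rng.normal(size=(5, dim))
y_support = rng.normal(size=5)
task_params = adapt(init, x_support, y_support)
```

Combining in the parameter space (rather than ensembling per-domain predictions) yields a single initialization, so adaptation to a novel task stays as cheap as a few inner-loop gradient steps.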
Related papers
- Generalized Few-Shot Continual Learning with Contrastive Mixture of
Adapters [59.82088750033897]
We set up a Generalized FSCL (GFSCL) protocol involving both class- and domain-incremental situations.
We find that common continual learning methods have poor generalization ability on unseen domains.
To address this, we propose a rehearsal-free framework based on the Vision Transformer (ViT), named Contrastive Mixture of Adapters (CMoA).
arXiv Detail & Related papers (2023-02-12T15:18:14Z)
- Multi-Domain Long-Tailed Learning by Augmenting Disentangled Representations [80.76164484820818]
There is an inescapable long-tailed class-imbalance issue in many real-world classification problems.
We study this multi-domain long-tailed learning problem and aim to produce a model that generalizes well across all classes and domains.
Built upon a proposed selective balanced sampling strategy, TALLY achieves this by mixing the semantic representation of one example with the domain-associated nuisances of another (sketched below).
arXiv Detail & Related papers (2022-10-25T21:54:26Z)
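A hedged sketch of the representation-mixing idea summarized in the TALLY entry above: assume each feature vector splits into a semantic part and a domain-associated nuisance part, and augment rare classes by pairing the semantic half of one example with the nuisance half of another. The fixed index split below is a toy assumption, not TALLY's learned disentanglement.

```python
import numpy as np

rng = np.random.default_rng(1)

def mix_representations(feat_a, feat_b, sem_dim):
    """Pair the semantic part of one example with the
    domain-associated nuisance part of another."""
    semantic = feat_a[:sem_dim]   # class content, from example A
    nuisance = feat_b[sem_dim:]   # domain style, from example B
    return np.concatenate([semantic, nuisance])

feat_dim, sem_dim = 16, 8
feat_tail = rng.normal(size=feat_dim)   # example from a rare (tail) class
feat_other = rng.normal(size=feat_dim)  # example from a different domain
augmented = mix_representations(feat_tail, feat_other, sem_dim)
```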
- Adversarial Feature Augmentation for Cross-domain Few-shot Classification [2.68796389443975]
We propose a novel adversarial feature augmentation (AFA) method to bridge the domain gap in few-shot learning.
The proposed method is a plug-and-play module that can be easily integrated into existing few-shot learning methods.
arXiv Detail & Related papers (2022-08-23T15:10:22Z)
- Set-based Meta-Interpolation for Few-Task Meta-Learning [79.4236527774689]
We propose a novel domain-agnostic task augmentation method, Meta-Interpolation, to densify the meta-training task distribution.
We empirically validate the efficacy of Meta-Interpolation on eight datasets spanning various domains.
arXiv Detail & Related papers (2022-05-20T06:53:03Z)
- Improving Task Adaptation for Cross-domain Few-shot Learning [41.821234589075445]
Cross-domain few-shot classification aims to learn a classifier from previously unseen classes and domains with few labeled samples.
We show that parametric adapters attached to convolutional layers with residual connections perform best.
arXiv Detail & Related papers (2021-07-01T10:47:06Z)
- How to Train Your MAML to Excel in Few-Shot Classification [26.51244463209443]
We show how to train MAML to excel in few-shot classification.
Our approach, which we name UNICORN-MAML, performs on a par with or even outperforms state-of-the-art algorithms.
arXiv Detail & Related papers (2021-06-30T17:56:15Z)
- Cluster, Split, Fuse, and Update: Meta-Learning for Open Compound Domain Adaptive Semantic Segmentation [102.42638795864178]
We propose a principled meta-learning based approach to OCDA for semantic segmentation.
We cluster the target domain into multiple sub-target domains by image style, extracted in an unsupervised manner.
A meta-learner is then deployed to learn to fuse the sub-target-domain-specific predictions, conditioned on the style code (sketched below).
We learn to update the model online via the model-agnostic meta-learning (MAML) algorithm, which further improves generalization.
arXiv Detail & Related papers (2020-12-15T13:21:54Z)
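A hedged sketch of the style-conditioned fusion summarized in the entry above: a toy gating function maps a style code to mixture weights over the sub-target-domain experts' predictions. The linear gating and softmax are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

n_subdomains, n_classes, style_dim = 3, 5, 4

# Toy gating "meta-learner": maps a style code to fusion weights.
gating = rng.normal(size=(n_subdomains, style_dim))

# Per-sub-target-domain expert predictions (logits) for one input.
expert_logits = [rng.normal(size=n_classes) for _ in range(n_subdomains)]

style_code = rng.normal(size=style_dim)   # unsupervised style feature
weights = softmax(gating @ style_code)    # style-conditioned weights
fused = sum(w * p for w, p in zip(weights, expert_logits))
```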
- Learning to Generalize Unseen Domains via Memory-based Multi-Source Meta-Learning for Person Re-Identification [59.326456778057384]
We propose the Memory-based Multi-Source Meta-Learning framework to train a generalizable model for unseen domains.
We also present a meta batch normalization layer (MetaBN) to diversify meta-test features.
Experiments demonstrate that our M$^3$L can effectively enhance the generalization ability of the model for unseen domains.
arXiv Detail & Related papers (2020-12-01T11:38:16Z)
- Incremental Meta-Learning via Indirect Discriminant Alignment [118.61152684795178]
We develop a notion of incremental learning during the meta-training phase of meta-learning.
Our approach performs favorably at test time as compared to training a model with the full meta-training set.
arXiv Detail & Related papers (2020-02-11T01:39:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.