Domain-Adaptive Few-Shot Learning
- URL: http://arxiv.org/abs/2003.08626v1
- Date: Thu, 19 Mar 2020 08:31:14 GMT
- Title: Domain-Adaptive Few-Shot Learning
- Authors: An Zhao, Mingyu Ding, Zhiwu Lu, Tao Xiang, Yulei Niu, Jiechao Guan,
Ji-Rong Wen, Ping Luo
- Abstract summary: We propose a novel domain-adversarial prototypical network (DAPN) model for domain-adaptive few-shot learning.
Our solution is to explicitly enhance the source/target per-class separation before domain-adaptive feature embedding learning.
- Score: 124.51420562201407
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing few-shot learning (FSL) methods make the implicit assumption that
the few target class samples are from the same domain as the source class
samples. However, in practice this assumption is often invalid -- the target
classes could come from a different domain. This poses an additional challenge
of domain adaptation (DA) with few training samples. In this paper, the problem
of domain-adaptive few-shot learning (DA-FSL) is tackled, which requires
solving FSL and DA in a unified framework. To this end, we propose a novel
domain-adversarial prototypical network (DAPN) model. It is designed to address
a specific challenge in DA-FSL: the DA objective means that the source and
target data distributions need to be aligned, typically through a shared
domain-adaptive feature embedding space; but the FSL objective dictates that
the per-class distributions in the target domain must remain distinct from
those of every source domain class, so aligning the distributions across
domains may harm FSL performance. The key is therefore to achieve global
domain distribution alignment whilst maintaining source/target per-class
discriminativeness.
Our solution is to explicitly enhance the source/target per-class separation
before domain-adaptive feature embedding learning in the DAPN, in order to
alleviate the negative effect of domain alignment on FSL. Extensive experiments
show that our DAPN outperforms the state-of-the-art FSL and DA models, as well
as their naïve combinations. The code is available at
https://github.com/dingmyu/DAPN.
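The tension described in the abstract, a few-shot loss that keeps per-class features separated versus a domain-adversarial loss that pulls the two domains together, can be made concrete with a short sketch. The snippet below is an illustration only (module names, feature sizes, and the gradient-reversal formulation are assumptions, not the authors' implementation, which is at the repository above): it pairs a standard prototypical-network loss with a gradient-reversal domain discriminator of the kind used for global distribution alignment.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales gradients backward,
    so the feature extractor learns to *confuse* the domain discriminator."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def proto_loss(support, support_y, query, query_y, n_way):
    """Prototypical-network loss: classify queries by (negative) Euclidean
    distance to the mean embedding (prototype) of each support class."""
    protos = torch.stack([support[support_y == c].mean(0) for c in range(n_way)])
    logits = -torch.cdist(query, protos)
    return F.cross_entropy(logits, query_y)

class DomainDiscriminator(nn.Module):
    """Binary source-vs-target classifier applied behind gradient reversal."""
    def __init__(self, dim=512):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 2))

    def forward(self, feat, lam=1.0):
        return self.net(GradReverse.apply(feat, lam))
```

A training step would sum the two losses, e.g. `proto_loss(...) + w * F.cross_entropy(disc(feats), domain_labels)`; DAPN's specific contribution is to enhance source/target per-class separation *before* this adversarial alignment, which the sketch does not show.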
Related papers
- Cross-Domain Cross-Set Few-Shot Learning via Learning Compact and
Aligned Representations [74.90423071048458]
Few-shot learning aims to recognize novel queries with only a few support samples.
We consider the domain shift problem in FSL and aim to address the domain gap between the support set and the query set.
We propose a novel approach, namely stabPA, to learn prototypical compact and cross-domain aligned representations.
arXiv Detail & Related papers (2022-07-16T03:40:38Z)
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention; it tackles the domain adaptation problem without using any source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z)
- Transferrable Contrastive Learning for Visual Domain Adaptation [108.98041306507372]
Transferrable Contrastive Learning (TCL) is a self-supervised learning paradigm tailored for domain adaptation.
TCL penalizes the cross-domain intra-class discrepancy between source and target through a clean and novel contrastive loss.
As a free lunch, the incorporation of contrastive learning lets TCL rely on a moving-averaged key encoder that naturally yields a temporally ensembled version of the pseudo labels for target data.
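A moving-averaged key encoder of this kind is typically maintained as an exponential moving average (EMA) of the query encoder's weights; the sketch below shows that update (the momentum value and the helper names are assumptions, not TCL's exact recipe):

```python
import copy
import torch

def make_key_encoder(query_enc):
    """The key encoder starts as a frozen copy of the query encoder."""
    key_enc = copy.deepcopy(query_enc)
    for p in key_enc.parameters():
        p.requires_grad = False
    return key_enc

@torch.no_grad()
def momentum_update(query_enc, key_enc, m=0.999):
    """EMA update: the key encoder slowly tracks the query encoder, so its
    outputs behave like a temporal ensemble of past models -- which is what
    stabilizes the pseudo labels it produces for target data."""
    for q, k in zip(query_enc.parameters(), key_enc.parameters()):
        k.data.mul_(m).add_(q.data, alpha=1.0 - m)
```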
arXiv Detail & Related papers (2021-12-14T16:23:01Z)
- Prototypical Cross-domain Self-supervised Learning for Few-shot Unsupervised Domain Adaptation [91.58443042554903]
We propose an end-to-end Prototypical Cross-domain Self-Supervised Learning (PCS) framework for Few-shot Unsupervised Domain Adaptation (FUDA).
PCS not only performs cross-domain low-level feature alignment, but also encodes and aligns semantic structures in the shared embedding space across domains.
Compared with state-of-the-art methods, PCS improves the mean classification accuracy over different domain pairs on FUDA by 10.5%, 3.5%, 9.0%, and 13.2% on Office, Office-Home, VisDA-2017, and DomainNet, respectively.
arXiv Detail & Related papers (2021-03-31T02:07:42Z)
- Revisiting Mid-Level Patterns for Cross-Domain Few-Shot Recognition [31.81367604846625]
Cross-domain few-shot learning (CDFSL) is proposed to transfer knowledge from general-domain base classes to special-domain novel classes.
In this paper, we study a challenging subset of CDFSL where the novel classes are in distant domains from base classes.
We propose a residual-prediction task to encourage mid-level features to learn discriminative information of each sample.
arXiv Detail & Related papers (2020-08-07T12:45:39Z)
- Few-Shot Learning as Domain Adaptation: Algorithm and Analysis [120.75020271706978]
Few-shot learning uses prior knowledge learned from the seen classes to recognize the unseen classes, whose data distribution necessarily differs from that of the seen classes.
This class-difference-caused distribution shift can be considered a special case of domain shift.
We propose a prototypical domain adaptation network with attention (DAPNA) to explicitly tackle such a domain shift problem in a meta-learning framework.
arXiv Detail & Related papers (2020-02-06T01:04:53Z)
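Most of the few-shot methods listed above (DAPN, stabPA, PCS, DAPNA) are trained and evaluated episodically. For orientation, here is a minimal sketch of the N-way K-shot episode sampling that underlies this protocol (the `images_by_class` mapping is an assumed data layout, not any one paper's loader):

```python
import random

def sample_episode(images_by_class, n_way=5, k_shot=5, n_query=15):
    """Draw one N-way K-shot episode: for each of n_way randomly chosen
    classes, take k_shot labeled support examples and n_query queries."""
    classes = random.sample(sorted(images_by_class), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        items = random.sample(images_by_class[cls], k_shot + n_query)
        support += [(x, label) for x in items[:k_shot]]
        query += [(x, label) for x in items[k_shot:]]
    return support, query
```

A model is then scored by its mean query accuracy over many such episodes.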