Source-Free Open Compound Domain Adaptation in Semantic Segmentation
- URL: http://arxiv.org/abs/2106.03422v1
- Date: Mon, 7 Jun 2021 08:38:41 GMT
- Title: Source-Free Open Compound Domain Adaptation in Semantic Segmentation
- Authors: Yuyang Zhao, Zhun Zhong, Zhiming Luo, Gim Hee Lee, Nicu Sebe
- Abstract summary: In SF-OCDA, only the source pre-trained model and the target data are available to learn the target model.
We propose the Cross-Patch Style Swap (CPSS) to diversify samples with various patch styles at the feature level.
Our method produces state-of-the-art results on the C-Driving dataset.
- Score: 99.82890571842603
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we introduce a new concept, named source-free open compound
domain adaptation (SF-OCDA), and study it in semantic segmentation. SF-OCDA is
more challenging than traditional domain adaptation but is more
practical. It jointly considers (1) the issues of data privacy and data storage
and (2) the scenario of multiple target domains and unseen open domains. In
SF-OCDA, only the source pre-trained model and the target data are available to
learn the target model. The model is evaluated on the samples from the target
and unseen open domains. To solve this problem, we present an effective
framework by separating the training process into two stages: (1) pre-training
a generalized source model and (2) adapting a target model with self-supervised
learning. In our framework, we propose the Cross-Patch Style Swap (CPSS) to
diversify samples with various patch styles at the feature level, which can
benefit the training of both stages. First, CPSS can significantly improve the
generalization ability of the source model, providing more accurate
pseudo-labels for the latter stage. Second, CPSS can reduce the influence of
noisy pseudo-labels and also prevent the model from overfitting to the target domain
during self-supervised learning, consistently boosting the performance on the
target and open domains. Experiments demonstrate that our method produces
state-of-the-art results on the C-Driving dataset. Furthermore, our model also
achieves leading performance on Cityscapes for domain generalization.
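To make the CPSS idea concrete, below is a minimal sketch of a feature-level cross-patch style swap. It assumes an AdaIN-style notion of patch "style" (channel-wise mean and standard deviation per patch) and a fixed patch grid; the grid size, the random patch pairing, and where the operation sits in the segmentation backbone are illustrative assumptions, not the paper's exact recipe.

```python
# Hedged sketch of a feature-level cross-patch style swap (CPSS-like augmentation).
# Assumptions: "style" = per-patch channel-wise mean/std (AdaIN-style); patches are
# paired randomly within each image. The paper's actual pairing (e.g. across images
# in a batch) and its insertion point in the network may differ.
import torch


def cross_patch_style_swap(feat: torch.Tensor, grid: int = 2, eps: float = 1e-5) -> torch.Tensor:
    """Swap channel-wise style statistics between patches of a feature map (N, C, H, W)."""
    n, c, h, w = feat.shape
    ph, pw = h // grid, w // grid

    # Crop to a multiple of the grid and split into grid*grid patches: (N, P, C, ph, pw).
    x = feat[:, :, :grid * ph, :grid * pw]
    x = x.reshape(n, c, grid, ph, grid, pw).permute(0, 2, 4, 1, 3, 5)
    patches = x.reshape(n, grid * grid, c, ph, pw)

    # Per-patch channel statistics ("style").
    mean = patches.mean(dim=(3, 4), keepdim=True)
    std = (patches - mean).pow(2).mean(dim=(3, 4), keepdim=True).add(eps).sqrt()

    # Normalize away each patch's own style, then re-apply the style of a random partner patch.
    perm = torch.randperm(grid * grid, device=feat.device)
    swapped = (patches - mean) / std * std[:, perm] + mean[:, perm]

    # Fold the patches back into a feature map of the cropped size.
    out = swapped.reshape(n, grid, grid, c, ph, pw).permute(0, 3, 1, 4, 2, 5)
    return out.reshape(n, c, grid * ph, grid * pw)
```

Applied stochastically during training in both stages, such a swap keeps patch content unchanged while varying its low-level statistics, which is the mechanism the abstract credits for better source-model generalization and robustness to noisy pseudo-labels.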
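Similarly, here is a minimal sketch of the second stage (adapting the target model with self-supervised learning on pseudo-labels from the source-pretrained model) under common self-training assumptions; the confidence threshold, the ignore index, and the single fixed teacher are placeholders rather than the paper's exact procedure.

```python
# Hedged sketch of stage 2: self-training on unlabeled target images.
# Assumptions: hard pseudo-labels from the frozen source-pretrained model,
# a fixed per-pixel confidence threshold, and ignore_index=255 for rejected pixels.
import torch
import torch.nn.functional as F


def adaptation_step(source_model, target_model, optimizer, images, conf_thresh=0.9):
    """One self-training step on a batch of unlabeled target images (N, 3, H, W)."""
    source_model.eval()
    with torch.no_grad():
        probs = F.softmax(source_model(images), dim=1)   # (N, num_classes, H, W)
        conf, pseudo = probs.max(dim=1)                   # per-pixel confidence and pseudo-label
        pseudo[conf < conf_thresh] = 255                  # drop low-confidence pixels

    target_model.train()
    logits = target_model(images)                         # CPSS-style augmentation could be applied inside
    loss = F.cross_entropy(logits, pseudo, ignore_index=255)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```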
Related papers
- PiPa++: Towards Unification of Domain Adaptive Semantic Segmentation via Self-supervised Learning [34.786268652516355]
Unsupervised domain adaptive segmentation aims to improve the segmentation accuracy of models on target domains without relying on labeled data from those domains.
It seeks to align the feature representations of the source domain (where labeled data is available) and the target domain (where only unlabeled data is present)
arXiv Detail & Related papers (2024-07-24T08:53:29Z)
- High-order Neighborhoods Know More: HyperGraph Learning Meets Source-free Unsupervised Domain Adaptation [34.08681468394247]
Source-free Unsupervised Domain Adaptation aims to classify target samples by only accessing a pre-trained source model and unlabelled target samples.
Existing methods normally exploit the pair-wise relation among target samples and attempt to discover their correlations by clustering these samples based on semantic features.
We propose a new SFDA method that exploits the high-order neighborhood relation and explicitly takes the domain shift effect into account.
arXiv Detail & Related papers (2024-05-11T05:07:43Z)
- Collaborative Multi-source Domain Adaptation Through Optimal Transport [0.0]
Multi-source Domain Adaptation (MDA) seeks to adapt models trained on data from multiple labeled source domains to perform effectively on unlabeled target domain data.
We introduce Collaborative MDA Through Optimal Transport (CMDA-OT), a novel framework consisting of two key phases.
arXiv Detail & Related papers (2024-04-09T20:06:25Z)
- SIDE: Self-supervised Intermediate Domain Exploration for Source-free Domain Adaptation [36.470026809824674]
Domain adaptation aims to alleviate the domain shift when transferring the knowledge learned from the source domain to the target domain.
Due to privacy issues, source-free domain adaptation (SFDA) has recently become much in demand yet remains challenging.
This paper proposes self-supervised intermediate domain exploration (SIDE) that effectively bridges the domain gap with an intermediate domain.
arXiv Detail & Related papers (2023-10-13T07:50:37Z)
- Open-Set Domain Adaptation with Visual-Language Foundation Models [51.49854335102149]
Unsupervised domain adaptation (UDA) has proven to be very effective in transferring knowledge from a source domain to a target domain with unlabeled data.
Open-set domain adaptation (ODA) has emerged as a potential solution for identifying target classes unseen in the source domain during the training phase.
arXiv Detail & Related papers (2023-07-30T11:38:46Z)
- IDA: Informed Domain Adaptive Semantic Segmentation [51.12107564372869]
We propose an Informed Domain Adaptation (IDA) model, a self-training framework that mixes the data based on class-level segmentation performance.
In our IDA model, class-level performance is tracked by an expected confidence score (ECS), and a dynamic schedule then determines the mixing ratio for data from different domains (a loose illustrative sketch of this ECS-driven mixing appears after this list).
Our proposed method outperforms the state-of-the-art UDA-SS method by a margin of 1.1 mIoU in the adaptation of GTA-V to Cityscapes and of 0.9 mIoU in the adaptation of SYNTHIA to Cityscapes.
arXiv Detail & Related papers (2023-03-05T18:16:34Z)
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention; it tries to tackle the domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z)
- Unified Instance and Knowledge Alignment Pretraining for Aspect-based Sentiment Analysis [96.53859361560505]
Aspect-based Sentiment Analysis (ABSA) aims to determine the sentiment polarity towards an aspect.
There always exists severe domain shift between the pretraining and downstream ABSA datasets.
We introduce a unified alignment pretraining framework into the vanilla pretrain-finetune pipeline.
arXiv Detail & Related papers (2021-10-26T04:03:45Z)
- Unsupervised Domain Adaptation with Multiple Domain Discriminators and Adaptive Self-Training [22.366638308792734]
Unsupervised Domain Adaptation (UDA) aims at improving the generalization capability of a model trained on a source domain to perform well on a target domain for which no labeled data is available.
We propose an approach to adapt a deep neural network trained on synthetic data to real scenes addressing the domain shift between the two different data distributions.
arXiv Detail & Related papers (2020-04-27T11:48:03Z)
- Alleviating Semantic-level Shift: A Semi-supervised Domain Adaptation Method for Semantic Segmentation [97.8552697905657]
A key challenge of this task is how to alleviate the data distribution discrepancy between the source and target domains.
We propose Alleviating Semantic-level Shift (ASS), which can successfully promote the distribution consistency from both global and local views.
We apply our ASS to two domain adaptation tasks, from GTA5 to Cityscapes and from Synthia to Cityscapes.
arXiv Detail & Related papers (2020-04-02T03:25:05Z)
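As referenced in the IDA entry above, here is a loose, hedged illustration of an ECS-driven mixing schedule. The scoring rule and the linear schedule are assumptions chosen only to show the shape of the computation, not that paper's actual definitions.

```python
# Hedged sketch: per-class expected confidence scores (ECS) and a dynamic mixing ratio.
# Assumptions: ECS = mean softmax confidence of pixels predicted as each class, and a
# simple linear schedule that mixes low-confidence classes more aggressively.
import torch
import torch.nn.functional as F


def expected_confidence_scores(logits: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Mean confidence of the predicted class, accumulated per class, from (N, C, H, W) logits."""
    probs = F.softmax(logits, dim=1)
    conf, pred = probs.max(dim=1)                          # per-pixel confidence and predicted class
    ecs = torch.zeros(num_classes, device=logits.device)
    for c in range(num_classes):
        mask = pred == c
        if mask.any():
            ecs[c] = conf[mask].mean()
    return ecs


def mixing_ratio(ecs: torch.Tensor, lo: float = 0.2, hi: float = 0.8) -> torch.Tensor:
    """Assumed linear schedule: the lower a class's ECS, the larger its mixing ratio."""
    return lo + (hi - lo) * (1.0 - ecs)
```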