Unsupervised Model Adaptation for Continual Semantic Segmentation
- URL: http://arxiv.org/abs/2009.12518v2
- Date: Sat, 9 Jan 2021 08:09:52 GMT
- Title: Unsupervised Model Adaptation for Continual Semantic Segmentation
- Authors: Serban Stan, Mohammad Rostami
- Abstract summary: We develop an algorithm for adapting a semantic segmentation model that is trained using a labeled source domain to generalize well in an unlabeled target domain.
We provide theoretical analysis and explain conditions under which our algorithm is effective.
Experiments on benchmark adaptation tasks demonstrate that our method achieves competitive performance even when compared with joint UDA approaches.
- Score: 15.820660013260584
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop an algorithm for adapting a semantic segmentation model that is
trained using a labeled source domain to generalize well in an unlabeled target
domain. A similar problem has been studied extensively in the unsupervised
domain adaptation (UDA) literature, but existing UDA algorithms require access
to both the source domain labeled data and the target domain unlabeled data for
training a domain agnostic semantic segmentation model. Relaxing this
constraint enables a user to adapt pretrained models to generalize in a target
domain, without requiring access to source data. To this end, we learn a
prototypical distribution for the source domain in an intermediate embedding
space. This distribution encodes the abstract knowledge that is learned from
the source domain. We then use this distribution for aligning the target domain
distribution with the source domain distribution in the embedding space. We
provide theoretical analysis and explain conditions under which our algorithm
is effective. Experiments on benchmark adaptation tasks demonstrate that our
method achieves competitive performance even when compared with joint UDA approaches.
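The core idea of the abstract can be sketched in a few lines: fit a class-conditional prototypical distribution over source embeddings, then align target embeddings to samples drawn from it, so adaptation no longer needs the source data itself. The Gaussian prototypes and the sliced-Wasserstein metric below are illustrative choices, not necessarily the exact components used in the paper.

```python
import numpy as np

def fit_prototypes(src_emb, src_labels, n_classes):
    """Fit one Gaussian (mean, diagonal variance) per class in the
    embedding space; a stand-in for the paper's prototypical
    source distribution."""
    protos = []
    for c in range(n_classes):
        z = src_emb[src_labels == c]
        protos.append((z.mean(axis=0), z.var(axis=0) + 1e-6))
    return protos

def sample_prototypes(protos, n_per_class, rng):
    """Draw surrogate 'source' embeddings from the prototypes, so
    source-free alignment is possible."""
    samples = [rng.normal(mu, np.sqrt(var), size=(n_per_class, mu.shape[0]))
               for mu, var in protos]
    return np.concatenate(samples, axis=0)

def sliced_wasserstein(x, y, n_proj=50, rng=None):
    """Approximate a distribution distance via random 1-D projections;
    minimizing this w.r.t. target embeddings would drive alignment."""
    rng = rng or np.random.default_rng(0)
    d = x.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)          # random unit direction
        px, py = np.sort(x @ theta), np.sort(y @ theta)
        m = min(len(px), len(py))
        total += np.mean(np.abs(px[:m] - py[:m]))
    return total / n_proj
```

In a full pipeline, the encoder would be trained to minimize the alignment distance between its target-domain embeddings and prototype samples, while the classifier head stays fixed.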
Related papers
- Online Continual Domain Adaptation for Semantic Image Segmentation Using
Internal Representations [28.549418215123936]
We develop an online UDA algorithm for semantic segmentation of images that improves model generalization on unannotated domains.
We evaluate our approach on well established semantic segmentation datasets and demonstrate it compares favorably against state-of-the-art (SOTA) semantic segmentation methods.
arXiv Detail & Related papers (2024-01-02T04:48:49Z) - Open-Set Domain Adaptation with Visual-Language Foundation Models [51.49854335102149]
Unsupervised domain adaptation (UDA) has proven to be very effective in transferring knowledge from a source domain to a target domain with unlabeled data.
Open-set domain adaptation (ODA) has emerged as a potential solution to identify target-domain classes that are absent from the source domain during the training phase.
arXiv Detail & Related papers (2023-07-30T11:38:46Z) - Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention, which tries to tackle domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z) - Domain-Agnostic Prior for Transfer Semantic Segmentation [197.9378107222422]
Unsupervised domain adaptation (UDA) is an important topic in the computer vision community.
We present a mechanism that regularizes cross-domain representation learning with a domain-agnostic prior (DAP).
Our research reveals that UDA benefits much from better proxies, possibly from other data modalities.
arXiv Detail & Related papers (2022-04-06T09:13:25Z) - Instance Level Affinity-Based Transfer for Unsupervised Domain
Adaptation [74.71931918541748]
We propose an instance-affinity-based criterion for source-to-target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
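The multi-sample contrastive loss mentioned above can be illustrated with a generic InfoNCE-style objective over several similar ("positive") and dissimilar ("negative") samples per anchor; this is a hedged sketch of the general technique, not ILA-DA's exact formulation, and all names here are illustrative.

```python
import numpy as np

def multi_sample_contrastive(anchor, positives, negatives, tau=0.1):
    """InfoNCE-style loss with multiple positives and negatives.

    anchor:    (d,) embedding of one sample
    positives: (p, d) embeddings deemed similar across domains
    negatives: (n, d) embeddings deemed dissimilar
    tau:       temperature controlling the sharpness of the softmax
    """
    def cos_sim(a, b):
        a = a / np.linalg.norm(a)
        b = b / np.linalg.norm(b, axis=1, keepdims=True)
        return b @ a

    pos = np.exp(cos_sim(anchor, positives) / tau)
    neg = np.exp(cos_sim(anchor, negatives) / tau)
    # Low loss when the anchor is close to positives and far from negatives.
    return -np.log(pos.sum() / (pos.sum() + neg.sum()))
```

Minimizing such a loss pulls cross-domain positives together in the embedding space while pushing negatives apart, which is the alignment mechanism the abstract describes.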
arXiv Detail & Related papers (2021-04-03T01:33:14Z) - Source-Free Domain Adaptation for Semantic Segmentation [11.722728148523366]
Unsupervised Domain Adaptation (UDA) can address the heavy reliance of convolutional neural network-based semantic segmentation approaches on pixel-level annotated data.
We propose a source-free domain adaptation framework for semantic segmentation, namely SFDA, in which only a well-trained source model and an unlabeled target domain dataset are available for adaptation.
arXiv Detail & Related papers (2021-03-30T14:14:29Z) - Inferring Latent Domains for Unsupervised Deep Domain Adaptation [54.963823285456925]
Unsupervised Domain Adaptation (UDA) refers to the problem of learning a model in a target domain where labeled data are not available.
This paper introduces a novel deep architecture which addresses the problem of UDA by automatically discovering latent domains in visual datasets.
We evaluate our approach on publicly available benchmarks, showing that it outperforms state-of-the-art domain adaptation methods.
arXiv Detail & Related papers (2021-03-25T14:33:33Z) - Unsupervised BatchNorm Adaptation (UBNA): A Domain Adaptation Method for
Semantic Segmentation Without Using Source Domain Representations [35.586031601299034]
Unsupervised BatchNorm Adaptation (UBNA) adapts a given pre-trained model to an unseen target domain.
We partially adapt the normalization layer statistics to the target domain using an exponentially decaying momentum factor.
Compared to standard UDA approaches we report a trade-off between performance and usage of source domain representations.
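The UBNA idea of partially adapting normalization statistics with an exponentially decaying momentum can be sketched as follows; the function and parameter names (`alpha0`, `decay`) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def ubna_update(running_mean, running_var, batches, alpha0=0.1, decay=0.95):
    """Adapt BatchNorm running statistics to target-domain batches.

    The momentum alpha shrinks each step, so the statistics move partway
    toward the target domain and then effectively freeze, trading off
    adaptation against retention of the source statistics.
    """
    alpha = alpha0
    for x in batches:  # x: (batch, features)
        running_mean = (1 - alpha) * running_mean + alpha * x.mean(axis=0)
        running_var = (1 - alpha) * running_var + alpha * x.var(axis=0)
        alpha *= decay  # exponentially decaying momentum factor
    return running_mean, running_var
```

Because only normalization statistics change, no source data or source representations are needed, which matches the trade-off the abstract reports against standard UDA.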
arXiv Detail & Related papers (2020-11-17T08:37:40Z) - Sequential Model Adaptation Using Domain Agnostic Internal Distributions [31.3178953771424]
We develop an algorithm for sequential adaptation of a classifier that is trained for a source domain to generalize in an unannotated target domain.
We consider the setting where the model has been trained on annotated source domain data and must then be adapted using unannotated target domain data, without access to the source domain data.
arXiv Detail & Related papers (2020-07-01T03:14:17Z) - Do We Really Need to Access the Source Data? Source Hypothesis Transfer
for Unsupervised Domain Adaptation [102.67010690592011]
Unsupervised domain adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain.
Prior UDA methods typically require access to the source data when learning to adapt the model.
This work tackles a practical setting where only a trained source model is available, and studies how to effectively utilize such a model without source data to solve UDA problems.
arXiv Detail & Related papers (2020-02-20T03:13:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.