SCoDA: Self-supervised Continual Domain Adaptation
- URL: http://arxiv.org/abs/2509.09935v1
- Date: Fri, 12 Sep 2025 02:53:03 GMT
- Title: SCoDA: Self-supervised Continual Domain Adaptation
- Authors: Chirayu Agrawal, Snehasis Mukherjee
- Abstract summary: Source-Free Domain Adaptation (SFDA) addresses the challenge of adapting a model to a target domain without access to the data of the source domain. We introduce Self-supervised Continual Domain Adaptation (SCoDA) to address the limitations of prevailing SFDA methods.
- Score: 1.2389885407843222
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Source-Free Domain Adaptation (SFDA) addresses the challenge of adapting a model to a target domain without access to the data of the source domain. Prevailing methods typically start with a source model pre-trained with full supervision and distill the knowledge by aligning instance-level features. However, these approaches, relying on cosine similarity over L2-normalized feature vectors, inadvertently discard crucial geometric information about the latent manifold of the source model. We introduce Self-supervised Continual Domain Adaptation (SCoDA) to address these limitations. We make two key departures from standard practice: first, we avoid the reliance on supervised pre-training by initializing the proposed framework with a teacher model pre-trained entirely via self-supervision (SSL). Second, we adapt the principle of geometric manifold alignment to the SFDA setting. The student is trained with a composite objective combining instance-level feature matching with a Space Similarity Loss. To combat catastrophic forgetting, the teacher's parameters are updated via an Exponential Moving Average (EMA) of the student's parameters. Extensive experiments on benchmark datasets demonstrate that SCoDA significantly outperforms state-of-the-art SFDA methods.
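As a rough illustration of the mechanics described in the abstract (an SSL-pretrained teacher updated by EMA, plus a composite objective combining instance-level feature matching with a Space Similarity Loss), the following PyTorch-style sketch gives one possible reading. It is hypothetical: the cosine form of the instance loss, the pairwise-similarity form of the Space Similarity Loss, the momentum value, and all function names are assumptions rather than the paper's exact formulation.

```python
# Hypothetical sketch of a SCoDA-style training step (loss forms and names are assumptions).
import torch
import torch.nn.functional as F

def space_similarity_loss(f_s, f_t):
    """Align the pairwise similarity structure of student and teacher features
    over a batch, as a proxy for preserving the teacher's manifold geometry."""
    s_s = F.normalize(f_s, dim=1) @ F.normalize(f_s, dim=1).T  # student similarity matrix
    s_t = F.normalize(f_t, dim=1) @ F.normalize(f_t, dim=1).T  # teacher similarity matrix
    return F.mse_loss(s_s, s_t)

def instance_matching_loss(f_s, f_t):
    """Instance-level feature matching (cosine-style), as is common in SFDA distillation."""
    return (1 - F.cosine_similarity(f_s, f_t, dim=1)).mean()

@torch.no_grad()
def ema_update(teacher, student, momentum=0.999):
    """Teacher parameters track an exponential moving average of the student's."""
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(momentum).add_(p_s, alpha=1 - momentum)

def train_step(student, teacher, x_target, optimizer, lambda_ss=1.0):
    teacher.eval()
    with torch.no_grad():
        f_t = teacher(x_target)       # features from the SSL-pretrained (EMA) teacher
    f_s = student(x_target)           # student features on unlabeled target data
    loss = instance_matching_loss(f_s, f_t) + lambda_ss * space_similarity_loss(f_s, f_t)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update(teacher, student)      # slow teacher update combats catastrophic forgetting
    return loss.item()
```

In this reading, the EMA update is what preserves the teacher's self-supervised representation while the student adapts to the target domain; the exact formulation should be taken from the paper itself.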
Related papers
- GMM-COMET: Continual Source-Free Universal Domain Adaptation via a Mean Teacher and Gaussian Mixture Model-Based Pseudo-Labeling [3.1744658275045103]
Unsupervised domain adaptation tackles the problem that domain shifts between training and test data impair the performance of neural networks in real-world applications. This setting, known as source-free universal domain adaptation (SF-UniDA), has recently gained attention. We present the first study on continual SF-UniDA, where the model must adapt sequentially to a stream of multiple different unlabeled target domains.
arXiv Detail & Related papers (2026-01-16T10:23:19Z)
- What Has Been Overlooked in Contrastive Source-Free Domain Adaptation: Leveraging Source-Informed Latent Augmentation within Neighborhood Context [28.634315143647385]
Source-free domain adaptation (SFDA) involves adapting a model originally trained using a labeled dataset to perform effectively on an unlabeled dataset. This adaptation is especially crucial when significant disparities in data distributions exist between the two domains. We introduce a straightforward yet highly effective latent augmentation method tailored for contrastive SFDA.
arXiv Detail & Related papers (2024-12-18T20:09:46Z)
- SIDE: Self-supervised Intermediate Domain Exploration for Source-free Domain Adaptation [36.470026809824674]
Domain adaptation aims to alleviate the domain shift when transferring the knowledge learned from the source domain to the target domain.
Due to privacy concerns, source-free domain adaptation (SFDA) has recently attracted strong demand, yet it remains challenging.
This paper proposes self-supervised intermediate domain exploration (SIDE) that effectively bridges the domain gap with an intermediate domain.
arXiv Detail & Related papers (2023-10-13T07:50:37Z)
- ViDA: Homeostatic Visual Domain Adapter for Continual Test Time Adaptation [48.039156140237615]
Continual Test-Time Adaptation (CTTA) is proposed to adapt a pre-trained model to continually changing target domains.
We design a Visual Domain Adapter (ViDA) for CTTA, explicitly handling both domain-specific and domain-shared knowledge.
Our proposed method achieves state-of-the-art performance in both classification and segmentation CTTA tasks.
arXiv Detail & Related papers (2023-06-07T11:18:53Z)
- CoSDA: Continual Source-Free Domain Adaptation [78.47274343972904]
Without access to the source data, source-free domain adaptation (SFDA) transfers knowledge from a source-domain trained model to target domains.
Recently, SFDA has gained popularity due to the need to protect the data privacy of the source domain, but it suffers from catastrophic forgetting on the source domain due to the lack of data.
We propose a continual source-free domain adaptation approach named CoSDA, which employs a dual-speed optimized teacher-student model pair and is equipped with consistency learning capability.
arXiv Detail & Related papers (2023-04-13T15:53:23Z)
- CAusal and collaborative proxy-tasKs lEarning for Semi-Supervised Domain Adaptation [20.589323508870592]
Semi-supervised domain adaptation (SSDA) adapts a learner to a new domain by effectively utilizing source domain data and a few labeled target samples.
We show that the proposed model significantly outperforms SOTA methods in terms of effectiveness and generalisability on SSDA datasets.
arXiv Detail & Related papers (2023-03-30T16:48:28Z)
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention; it tries to tackle the domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z)
- Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of SSL and DA.
arXiv Detail & Related papers (2021-12-12T06:11:16Z)
- Unified Instance and Knowledge Alignment Pretraining for Aspect-based Sentiment Analysis [96.53859361560505]
Aspect-based Sentiment Analysis (ABSA) aims to determine the sentiment polarity towards an aspect.
A severe domain shift always exists between the pretraining and downstream ABSA datasets.
We introduce a unified alignment pretraining framework into the vanilla pretrain-finetune pipeline.
arXiv Detail & Related papers (2021-10-26T04:03:45Z)
- Source-Free Open Compound Domain Adaptation in Semantic Segmentation [99.82890571842603]
In SF-OCDA, only the source pre-trained model and the target data are available to learn the target model.
We propose the Cross-Patch Style Swap (CPSS) to diversify samples with various patch styles at the feature level.
Our method produces state-of-the-art results on the C-Driving dataset.
arXiv Detail & Related papers (2021-06-07T08:38:41Z)
- Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation [102.67010690592011]
Unsupervised domain adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain.
Prior UDA methods typically require access to the source data when learning to adapt the model.
This work tackles a practical setting where only a trained source model is available and investigates how such a model can be effectively utilized, without source data, to solve UDA problems.
arXiv Detail & Related papers (2020-02-20T03:13:58Z)