Memory Consistent Unsupervised Off-the-Shelf Model Adaptation for
Source-Relaxed Medical Image Segmentation
- URL: http://arxiv.org/abs/2209.07910v1
- Date: Fri, 16 Sep 2022 13:13:50 GMT
- Title: Memory Consistent Unsupervised Off-the-Shelf Model Adaptation for
Source-Relaxed Medical Image Segmentation
- Authors: Xiaofeng Liu, Fangxu Xing, Georges El Fakhri, Jonghye Woo
- Abstract summary: Unsupervised domain adaptation (UDA) has been a vital protocol for migrating information learned from a labeled source domain to an unlabeled heterogeneous target domain.
We propose "off-the-shelf (OS)" UDA (OSUDA), aimed at image segmentation, by adapting an OS segmentor trained in a source domain to a target domain, in the absence of source domain data in adaptation.
- Score: 13.260109561599904
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Unsupervised domain adaptation (UDA) has been a vital protocol for migrating
information learned from a labeled source domain to facilitate the
implementation in an unlabeled heterogeneous target domain. Although UDA is
typically jointly trained on data from both domains, accessing the labeled
source domain data is often restricted, due to concerns over patient data
privacy or intellectual property. To sidestep this, we propose "off-the-shelf
(OS)" UDA (OSUDA), aimed at image segmentation, by adapting an OS segmentor
trained in a source domain to a target domain, in the absence of source domain
data in adaptation. Toward this goal, we aim to develop a novel batch-wise
normalization (BN) statistics adaptation framework. In particular, we gradually
adapt the domain-specific low-order BN statistics, e.g., mean and variance,
through an exponential momentum decay strategy, while explicitly enforcing the
consistency of the domain shareable high-order BN statistics, e.g., scaling and
shifting factors, via our optimization objective. We also adaptively quantify
the channel-wise transferability to gauge the importance of each channel, via
both low-order statistics divergence and a scaling factor. Furthermore, we
incorporate unsupervised self-entropy minimization into our framework to boost
performance alongside a novel queued, memory-consistent self-training strategy
to utilize reliable pseudo-labels for stable and efficient unsupervised
adaptation. We evaluated our OSUDA-based framework on both cross-modality and
cross-subtype brain tumor segmentation and cardiac MR to CT segmentation tasks.
Our experimental results showed that our memory consistent OSUDA performs
better than existing source-relaxed UDA methods and yields similar performance
to UDA methods with source data.
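For intuition, the PyTorch-style sketch below shows the two simplest ingredients of this recipe: an exponentially decaying BN momentum for the low-order statistics and self-entropy minimization over the trainable BN scaling/shifting factors. The schedule, hyperparameters, and training loop are illustrative assumptions rather than the authors' released implementation, and the channel-wise transferability weighting and memory-consistent pseudo-label queue are omitted.

```python
# Illustrative sketch (not the official OSUDA code): exponential momentum
# decay (EMD) for low-order BN statistics plus self-entropy minimization.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

def set_bn_momentum(model: nn.Module, step: int, total_steps: int,
                    m0: float = 0.1, decay: float = 5.0) -> None:
    """Decay BN momentum so target running mean/var are absorbed gradually."""
    momentum = m0 * math.exp(-decay * step / total_steps)
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.momentum = momentum  # controls the running-statistics update

def self_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Mean per-pixel Shannon entropy of the softmax segmentation output."""
    p = F.softmax(logits, dim=1)
    return -(p * torch.log(p + 1e-8)).sum(dim=1).mean()

def adapt(segmentor: nn.Module, target_loader, steps: int, lr: float = 1e-4):
    # Freeze all source-trained weights; only the BN affine parameters
    # (the "high-order" scaling/shifting factors) stay trainable. A penalty
    # keeping them close to their source values could be added here.
    for p in segmentor.parameters():
        p.requires_grad_(False)
    bn_params = []
    for m in segmentor.modules():
        if isinstance(m, nn.BatchNorm2d) and m.affine:
            m.weight.requires_grad_(True)
            m.bias.requires_grad_(True)
            bn_params += [m.weight, m.bias]
    opt = torch.optim.Adam(bn_params, lr=lr)

    segmentor.train()  # train mode so BN updates its running statistics
    for step, images in zip(range(steps), target_loader):  # loader yields target image batches
        set_bn_momentum(segmentor, step, steps)
        loss = self_entropy(segmentor(images))
        opt.zero_grad()
        loss.backward()
        opt.step()
```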
Related papers
- Source-Free Domain Adaptation for Medical Image Segmentation via
Prototype-Anchored Feature Alignment and Contrastive Learning [57.43322536718131]
We present a two-stage source-free domain adaptation (SFDA) framework for medical image segmentation.
In the prototype-anchored feature alignment stage, we first utilize the weights of the pre-trained pixel-wise classifier as source prototypes.
Then, we introduce a bi-directional transport to align the target features with the class prototypes by minimizing the expected transport cost.
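As a rough, simplified stand-in for this alignment step (not the paper's exact formulation), one can score target pixel features against the classifier-weight prototypes and minimize a soft, bi-directional expected assignment cost; the softmax-based transport plan and temperature below are assumptions made for brevity.

```python
# Illustrative sketch: rows of a source-pretrained pixel-wise classifier act
# as class prototypes; target features are pulled toward them via a soft,
# bi-directional expected assignment cost.
import torch
import torch.nn.functional as F

def prototype_alignment_loss(feats: torch.Tensor,      # (N, D) pixel features
                             prototypes: torch.Tensor, # (C, D) classifier weights
                             tau: float = 0.1) -> torch.Tensor:
    feats = F.normalize(feats, dim=1)
    prototypes = F.normalize(prototypes, dim=1)
    cost = 1.0 - feats @ prototypes.t()                # (N, C) cosine distance
    pi_f2p = F.softmax(-cost / tau, dim=1)             # feature -> prototype plan
    pi_p2f = F.softmax(-cost / tau, dim=0)             # prototype -> feature plan
    loss_f2p = (pi_f2p * cost).sum(dim=1).mean()       # expected cost per pixel
    loss_p2f = (pi_p2f * cost).sum(dim=0).mean()       # expected cost per class
    return loss_f2p + loss_p2f
```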
arXiv Detail & Related papers (2023-07-19T06:07:12Z) - Divide and Contrast: Source-free Domain Adaptation via Adaptive
Contrastive Learning [122.62311703151215]
Divide and Contrast (DaC) aims to connect the good ends of both worlds while bypassing their limitations.
DaC divides the target data into source-like and target-specific samples, where either group of samples is treated with tailored goals.
We further align the source-like domain with the target-specific samples using a memory bank-based Maximum Mean Discrepancy (MMD) loss to reduce the distribution mismatch.
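A minimal sketch of such a memory bank-based MMD term is given below, assuming an RBF kernel and a simple FIFO bank of source-like features; DaC's actual sample split and adaptive contrastive objectives are not reproduced here.

```python
# Illustrative sketch: RBF-kernel MMD between source-like features stored in a
# memory bank and a batch of target-specific features (bank policy assumed).
import torch

def rbf_mmd(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Biased MMD^2 estimate with a Gaussian kernel."""
    k = lambda a, b: torch.exp(-torch.cdist(a, b).pow(2) / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

class FeatureBank:
    """Fixed-size FIFO memory of source-like feature vectors."""
    def __init__(self, size: int, dim: int):
        self.bank = torch.zeros(size, dim)
        self.ptr, self.count, self.size = 0, 0, size

    def push(self, feats: torch.Tensor) -> None:
        for f in feats.detach():
            self.bank[self.ptr] = f
            self.ptr = (self.ptr + 1) % self.size
            self.count = min(self.count + 1, self.size)

    def sample(self, n: int) -> torch.Tensor:
        idx = torch.randint(0, max(self.count, 1), (n,))
        return self.bank[idx]

# Usage: loss = rbf_mmd(bank.sample(256), target_specific_feats)
```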
arXiv Detail & Related papers (2022-11-12T09:21:49Z) - Source-free Unsupervised Domain Adaptation for Blind Image Quality
Assessment [20.28784839680503]
Existing learning-based methods for blind image quality assessment (BIQA) are heavily dependent on large amounts of annotated training data.
In this paper, we take the first step towards the source-free unsupervised domain adaptation (SFUDA) in a simple yet efficient manner.
We present a group of well-designed self-supervised objectives to guide the adaptation of the BN affine parameters towards the target domain.
arXiv Detail & Related papers (2022-07-17T09:42:36Z) - Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention, which tries to tackle the domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address the SFDA task via source Distribution Estimation.
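Although the snippet does not spell out the estimation step, a heavily simplified reading is to fit per-class Gaussians to confidently pseudo-labeled target features and sample surrogate "source" features from them for alignment; the sketch below follows that assumption and is not the SFDA-DE algorithm itself.

```python
# Illustrative sketch: estimate per-class feature Gaussians from confident
# pseudo-labeled target features and sample surrogate "source" features.
import torch
import torch.nn.functional as F

@torch.no_grad()
def estimate_class_gaussians(feats, logits, conf_thresh=0.9):
    """feats: (N, D) features, logits: (N, C). Returns {class: (mean, std)}."""
    probs = F.softmax(logits, dim=1)
    conf, pseudo = probs.max(dim=1)
    stats = {}
    for c in range(logits.size(1)):
        sel = feats[(pseudo == c) & (conf > conf_thresh)]
        if sel.size(0) > 1:
            stats[c] = (sel.mean(dim=0), sel.std(dim=0) + 1e-4)
    return stats

def sample_surrogate(stats, n_per_class=64):
    """Draw surrogate source-like features from the estimated Gaussians."""
    feats, labels = [], []
    for c, (mu, std) in stats.items():
        feats.append(mu + std * torch.randn(n_per_class, mu.numel()))
        labels.append(torch.full((n_per_class,), c, dtype=torch.long))
    return torch.cat(feats), torch.cat(labels)
```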
arXiv Detail & Related papers (2022-04-24T12:22:19Z) - MT-UDA: Towards Unsupervised Cross-modality Medical Image Segmentation
with Limited Source Labels [15.01727721628536]
Deep unsupervised domain adaptation (UDA) can leverage well-established source domain annotations and abundant target domain data.
UDA methods suffer from severe performance degradation when source domain annotations are scarce.
We propose a new label-efficient UDA framework, termed MT-UDA, in which the student model trained with limited source labels learns from unlabeled data of both domains in a semi-supervised manner.
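A generic mean-teacher sketch of this semi-supervised mechanism is shown below, assuming a single EMA teacher, a supervised loss on the few labeled source images, and a consistency loss on unlabeled images; MT-UDA's actual framework is more elaborate than this stand-in.

```python
# Illustrative mean-teacher sketch (single EMA teacher assumed): the student
# learns from limited labeled source images plus consistency with the teacher
# on unlabeled images from either domain.
import copy
import torch
import torch.nn.functional as F

def ema_update(teacher, student, alpha=0.99):
    """Exponential moving average of student weights into the teacher."""
    with torch.no_grad():
        for pt, ps in zip(teacher.parameters(), student.parameters()):
            pt.mul_(alpha).add_(ps, alpha=1 - alpha)

def train_step(student, teacher, opt, labeled, labels, unlabeled, lam=1.0):
    sup = F.cross_entropy(student(labeled), labels)       # few source labels
    with torch.no_grad():
        target = F.softmax(teacher(unlabeled), dim=1)     # teacher guidance
    cons = F.mse_loss(F.softmax(student(unlabeled), dim=1), target)
    loss = sup + lam * cons
    opt.zero_grad(); loss.backward(); opt.step()
    ema_update(teacher, student)
    return loss.item()

# Setup: teacher = copy.deepcopy(student); teacher.requires_grad_(False)
```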
arXiv Detail & Related papers (2022-03-23T14:51:00Z) - Adapting Off-the-Shelf Source Segmenter for Target Medical Image
Segmentation [12.703234995718372]
Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a labeled source domain to an unlabeled and unseen target domain.
Access to the source domain data at the adaptation stage is often limited, due to data storage or privacy issues.
We propose to adapt an "off-the-shelf" segmentation model pre-trained in the source domain to the target domain.
arXiv Detail & Related papers (2021-06-23T16:16:55Z) - Instance Level Affinity-Based Transfer for Unsupervised Domain
Adaptation [74.71931918541748]
We propose an instance-affinity-based criterion for source-to-target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
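A generic multi-sample contrastive loss over precomputed similar/dissimilar (positive/negative) cross-domain pairs could look like the sketch below; how ILA-DA actually extracts those affinities is the paper's contribution and is taken as given here.

```python
# Illustrative sketch: multi-sample contrastive loss given precomputed
# similar (positive) and dissimilar (negative) cross-domain pairs.
import torch
import torch.nn.functional as F

def multi_sample_contrastive(anchor, candidates, pos_mask, tau=0.1):
    """anchor: (N, D), candidates: (M, D), pos_mask: (N, M) bool affinity."""
    a = F.normalize(anchor, dim=1)
    c = F.normalize(candidates, dim=1)
    sim = a @ c.t() / tau                                  # (N, M) similarities
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_per_anchor = pos_mask.float().sum(dim=1).clamp(min=1)
    # average log-likelihood of the positives for each anchor
    loss = -(log_prob * pos_mask.float()).sum(dim=1) / pos_per_anchor
    return loss.mean()
```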
arXiv Detail & Related papers (2021-04-03T01:33:14Z) - Source-Free Domain Adaptation for Semantic Segmentation [11.722728148523366]
Unsupervised Domain Adaptation (UDA) can tackle the challenge that convolutional neural network-based approaches for semantic segmentation rely heavily on pixel-level annotated data.
We propose a source-free domain adaptation framework for semantic segmentation, namely SFDA, in which only a well-trained source model and an unlabeled target domain dataset are available for adaptation.
arXiv Detail & Related papers (2021-03-30T14:14:29Z) - Adapt Everywhere: Unsupervised Adaptation of Point-Clouds and Entropy
Minimisation for Multi-modal Cardiac Image Segmentation [10.417009344120917]
We present a novel UDA method for multi-modal cardiac image segmentation.
The proposed method is based on adversarial learning and adapts network features between source and target domain in different spaces.
We validated our method on two cardiac datasets by adapting from the annotated source domain to the unannotated target domain.
arXiv Detail & Related papers (2021-03-15T08:59:44Z) - Source Data-absent Unsupervised Domain Adaptation through Hypothesis
Transfer and Labeling Transfer [137.36099660616975]
Unsupervised domain adaptation (UDA) aims to transfer knowledge from a related but different well-labeled source domain to a new unlabeled target domain.
Most existing UDA methods require access to the source data, and thus are not applicable when the data are confidential and not shareable due to privacy concerns.
This paper aims to tackle a realistic setting in which only a classification model trained on the source data is available, instead of access to the source data itself.
arXiv Detail & Related papers (2020-12-14T07:28:50Z) - Do We Really Need to Access the Source Data? Source Hypothesis Transfer
for Unsupervised Domain Adaptation [102.67010690592011]
Unsupervised domain adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain.
Prior UDA methods typically require access to the source data when adapting the model.
This work tackles a practical setting where only a trained source model is available and how we can effectively utilize such a model without source data to solve UDA problems.
arXiv Detail & Related papers (2020-02-20T03:13:58Z)