Taxonomy Adaptive Cross-Domain Adaptation in Medical Imaging via
Optimization Trajectory Distillation
- URL: http://arxiv.org/abs/2307.14709v1
- Date: Thu, 27 Jul 2023 08:58:05 GMT
- Title: Taxonomy Adaptive Cross-Domain Adaptation in Medical Imaging via
Optimization Trajectory Distillation
- Authors: Jianan Fan, Dongnan Liu, Hang Chang, Heng Huang, Mei Chen, and Weidong
Cai
- Abstract summary: The success of automated medical image analysis depends on large-scale and expert-annotated training sets.
Unsupervised domain adaptation (UDA) has emerged as a promising approach to alleviate the burden of labeled data collection.
We propose optimization trajectory distillation, a unified approach to address the two technical challenges from a new perspective.
- Score: 73.83178465971552
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The success of automated medical image analysis depends on large-scale and
expert-annotated training sets. Unsupervised domain adaptation (UDA) has
emerged as a promising approach to alleviate the burden of labeled data
collection. However, existing UDA methods generally operate under the closed-set adaptation
setting assuming an identical label set between the source and target domains,
which is over-restrictive in clinical practice where new classes commonly exist
across datasets due to taxonomic inconsistency. While several methods have been
presented to tackle both domain shifts and incoherent label sets, none of them
take into account the common characteristics of the two issues and consider the
learning dynamics along network training. In this work, we propose optimization
trajectory distillation, a unified approach to address the two technical
challenges from a new perspective. It exploits the low-rank nature of gradient
space and devises a dual-stream distillation algorithm to regularize the
learning dynamics of insufficiently annotated domain and classes with the
external guidance obtained from reliable sources. Our approach resolves the
issue of inadequate navigation along network optimization, which is the major
obstacle in the taxonomy adaptive cross-domain adaptation scenario. We evaluate
the proposed method extensively on several tasks towards various endpoints with
clinical and open-world significance. The results demonstrate its effectiveness
and improvements over previous methods.
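The abstract's central idea, that gradient space is approximately low-rank and that reliable gradients can guide an under-supervised learner, can be illustrated with a minimal numpy sketch. The function below projects a target-domain gradient onto the top-k singular directions of a stack of source-domain gradients. The function name, the SVD-based subspace estimate, and the plain projection rule are illustrative assumptions for this digest, not the paper's actual dual-stream distillation algorithm.

```python
import numpy as np

def lowrank_gradient_guidance(source_grads, target_grad, k=2):
    """Project a target-domain gradient onto the top-k principal
    directions spanned by reliable source-domain gradients.

    Hypothetical sketch: names and the projection rule are
    assumptions, not the paper's exact method.
    """
    # Stack source gradients as rows; the leading right-singular
    # vectors of this matrix span the dominant (low-rank) subspace
    # of the observed optimization trajectory.
    G = np.stack(source_grads)                 # (n_steps, n_params)
    _, _, vt = np.linalg.svd(G, full_matrices=False)
    basis = vt[:k]                             # (k, n_params), orthonormal rows
    # Keep only the component of the target gradient that lies
    # inside that subspace, discarding off-subspace noise.
    coeffs = basis @ target_grad
    return basis.T @ coeffs

rng = np.random.default_rng(0)
src = [rng.normal(size=8) for _ in range(5)]
tgt = rng.normal(size=8)
guided = lowrank_gradient_guidance(src, tgt, k=2)
```

Because the basis rows are orthonormal, the guided gradient can never be longer than the original one; it is the original gradient with its off-subspace component removed.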
Related papers
- Domain-Adaptive Learning: Unsupervised Adaptation for Histology Images
with Improved Loss Function Combination [3.004632712148892]
This paper presents a novel approach for unsupervised domain adaptation (UDA) targeting H&E stained histology images.
Our approach proposes a novel loss function along with carefully selected existing loss functions tailored to address the challenges specific to histology images.
The proposed method is extensively evaluated in accuracy, robustness, and generalization, surpassing state-of-the-art techniques for histology images.
arXiv Detail & Related papers (2023-09-29T12:11:16Z)
- Joint Attention-Driven Domain Fusion and Noise-Tolerant Learning for
Multi-Source Domain Adaptation [2.734665397040629]
Multi-source Unsupervised Domain Adaptation transfers knowledge from multiple source domains with labeled data to an unlabeled target domain.
The distribution discrepancy between different domains and the noisy pseudo-labels in the target domain both lead to performance bottlenecks.
We propose an approach that integrates Attention-driven Domain fusion and Noise-Tolerant learning (ADNT) to address the two issues mentioned above.
arXiv Detail & Related papers (2022-08-05T01:08:41Z)
- Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of SSL and DA.
arXiv Detail & Related papers (2021-12-12T06:11:16Z)
- Unsupervised Noise Adaptive Speech Enhancement by
Discriminator-Constrained Optimal Transport [25.746489468835357]
This paper presents a novel discriminator-constrained optimal transport network (DOTN) that performs unsupervised domain adaptation for speech enhancement (SE).
The DOTN aims to estimate clean references of noisy speech in a target domain, by exploiting the knowledge available from the source domain.
arXiv Detail & Related papers (2021-11-11T17:15:37Z)
- Improving Transferability of Domain Adaptation Networks Through Domain
Alignment Layers [1.3766148734487902]
Multi-source unsupervised domain adaptation (MSDA) aims at learning a predictor for an unlabeled domain by leveraging weak knowledge from a bag of source models.
We propose to embed Multi-Source version of DomaIn Alignment Layers (MS-DIAL) at different levels of the predictor.
Our approach can improve state-of-the-art MSDA methods, yielding relative gains of up to +30.64% on their classification accuracies.
arXiv Detail & Related papers (2021-09-06T18:41:19Z)
- Contrastive Learning and Self-Training for Unsupervised Domain
Adaptation in Semantic Segmentation [71.77083272602525]
UDA attempts to provide efficient knowledge transfer from a labeled source domain to an unlabeled target domain.
We propose a contrastive learning approach that adapts category-wise centroids across domains.
We extend our method with self-training, where we use a memory-efficient temporal ensemble to generate consistent and reliable pseudo-labels.
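The category-wise centroid idea in this entry can be made concrete with a short numpy sketch: compute a per-class feature centroid in each domain (using pseudo-labels on the target side) and penalize the squared distance between matched centroids. The function name, loss form, and batch handling are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def centroid_alignment_loss(src_feats, src_labels, tgt_feats, tgt_pseudo, n_classes):
    """Sum of squared distances between per-class feature centroids of
    the labeled source domain and the pseudo-labeled target domain.
    Hypothetical sketch; names and the exact loss are assumptions."""
    loss = 0.0
    for c in range(n_classes):
        s = src_feats[src_labels == c]
        t = tgt_feats[tgt_pseudo == c]
        if len(s) == 0 or len(t) == 0:
            continue  # class absent from one domain in this batch: skip
        loss += float(np.sum((s.mean(axis=0) - t.mean(axis=0)) ** 2))
    return loss

rng = np.random.default_rng(1)
f = rng.normal(size=(10, 4))
y = np.array([0] * 5 + [1] * 5)
zero = centroid_alignment_loss(f, y, f, y, 2)          # identical domains
shifted = centroid_alignment_loss(f, y, f + 1.0, y, 2) # constant feature shift
```

Shifting every target feature by 1 in each of the 4 dimensions moves each class centroid by the same amount, so the loss is 4 per class, 8 in total; identical domains give a loss of 0.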
arXiv Detail & Related papers (2021-05-05T11:55:53Z)
- Selective Pseudo-Labeling with Reinforcement Learning for
Semi-Supervised Domain Adaptation [116.48885692054724]
We propose a reinforcement learning based selective pseudo-labeling method for semi-supervised domain adaptation.
We develop a deep Q-learning model to select both accurate and representative pseudo-labeled instances.
Our proposed method is evaluated on several benchmark datasets for SSDA, and demonstrates superior performance to all the comparison methods.
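This entry's criterion of selecting pseudo-labeled instances that are both "accurate and representative" can be sketched without the learned policy: the paper trains a deep Q-learning selector, whereas the stand-in below uses a hand-rolled heuristic (a confidence threshold for accuracy, distance to the predicted-class centroid for representativeness). All names and thresholds here are assumptions, not the paper's method.

```python
import numpy as np

def select_pseudo_labels(probs, feats, conf_thresh=0.9):
    """Heuristic stand-in for a learned selection policy: keep
    confident predictions, then the half of each predicted class
    closest to its centroid (the most representative samples)."""
    preds = probs.argmax(axis=1)
    conf = probs.max(axis=1)
    keep = conf >= conf_thresh              # "accurate" filter
    selected = []
    for c in np.unique(preds[keep]):
        idx = np.where(keep & (preds == c))[0]
        centroid = feats[idx].mean(axis=0)
        dist = np.linalg.norm(feats[idx] - centroid, axis=1)
        order = idx[np.argsort(dist)]       # closest to centroid first
        selected.extend(order[: max(1, len(order) // 2)].tolist())
    return sorted(selected)

probs = np.array([[0.95, 0.05],
                  [0.60, 0.40],   # low confidence: filtered out
                  [0.05, 0.95],
                  [0.92, 0.08]])
feats = np.array([[0.0, 0.0],
                  [5.0, 5.0],
                  [9.0, 9.0],
                  [0.5, 0.0]])
sel = select_pseudo_labels(probs, feats)
```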
arXiv Detail & Related papers (2020-12-07T03:37:38Z)
- A Review of Single-Source Deep Unsupervised Visual Domain Adaptation [81.07994783143533]
Large-scale labeled training datasets have enabled deep neural networks to excel across a wide range of benchmark vision tasks.
In many applications, it is prohibitively expensive and time-consuming to obtain large quantities of labeled data.
To cope with limited labeled training data, many have attempted to directly apply models trained on a large-scale labeled source domain to another sparsely labeled or unlabeled target domain.
arXiv Detail & Related papers (2020-09-01T00:06:50Z)
- Adaptively-Accumulated Knowledge Transfer for Partial Domain Adaptation [66.74638960925854]
Partial domain adaptation (PDA) deals with a realistic and challenging problem in which the source domain label space subsumes the target domain label space.
We propose an Adaptively-Accumulated Knowledge Transfer framework (A$2$KT) to align the relevant categories across two domains.
arXiv Detail & Related papers (2020-08-27T00:53:43Z)
- Discriminative Active Learning for Domain Adaptation [16.004653151961303]
We introduce a discriminative active learning approach for domain adaptation to reduce the efforts of data annotation.
Specifically, we propose three-stage active adversarial training of neural networks.
Empirical comparisons with existing domain adaptation methods using four benchmark datasets demonstrate the effectiveness of the proposed approach.
arXiv Detail & Related papers (2020-05-24T04:20:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.