A Prototype-Oriented Clustering for Domain Shift with Source Privacy
- URL: http://arxiv.org/abs/2302.03807v2
- Date: Thu, 9 Feb 2023 07:36:04 GMT
- Title: A Prototype-Oriented Clustering for Domain Shift with Source Privacy
- Authors: Korawat Tanwisuth, Shujian Zhang, Pengcheng He, Mingyuan Zhou
- Abstract summary: We introduce Prototype-oriented Clustering with Distillation (PCD) to improve the performance and applicability of existing methods.
PCD first constructs a source clustering model by aligning the distributions of prototypes and data.
It then distills the knowledge to the target model through cluster labels provided by the source model while simultaneously clustering the target data.
- Score: 66.67700676888629
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Unsupervised clustering under domain shift (UCDS) studies how to transfer the
knowledge from abundant unlabeled data from multiple source domains to learn
the representation of the unlabeled data in a target domain. In this paper, we
introduce Prototype-oriented Clustering with Distillation (PCD) to not only
improve the performance and applicability of existing methods for UCDS, but
also address the concerns on protecting the privacy of both the data and model
of the source domains. PCD first constructs a source clustering model by
aligning the distributions of prototypes and data. It then distills the
knowledge to the target model through cluster labels provided by the source
model while simultaneously clustering the target data. Finally, it refines the
target model on the target domain data without guidance from the source model.
Experiments across multiple benchmarks show the effectiveness and
generalizability of our source-private clustering method.
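The three stages described in the abstract can be sketched in a simplified form. This is an illustrative toy, not the authors' method: soft nearest-prototype assignments stand in for the paper's distribution alignment between prototypes and data, and all function names and parameters below are assumptions for illustration.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def fit_prototypes(X, K, iters=50, tau=0.1):
    """Stage 1 (sketch): fit K prototypes to source features by soft
    k-means, a simplified stand-in for aligning the distributions of
    prototypes and data. Initializing from the first K samples is an
    illustrative choice, not the paper's scheme."""
    protos = X[:K].copy()
    for _ in range(iters):
        # squared distances from every point to every prototype
        d2 = ((X[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
        A = softmax(-d2 / tau, axis=1)          # soft cluster assignments
        protos = (A.T @ X) / A.sum(0)[:, None]  # move prototypes to weighted means
    return protos

def cluster_labels(protos, X, tau=0.1):
    """Stage 2 (sketch): the frozen source prototypes provide soft
    cluster labels on target features; a target model would then be
    distilled by matching these label distributions."""
    d2 = ((X[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return softmax(-d2 / tau, axis=1)
```

Stage 3 (refining the target model on target data alone, without the source model) is omitted here; in this sketch it would amount to continuing training on the target model's own cluster labels.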
Related papers
- ProtoGMM: Multi-prototype Gaussian-Mixture-based Domain Adaptation Model for Semantic Segmentation [0.8213829427624407]
Domain adaptive semantic segmentation aims to generate accurate and dense predictions for an unlabeled target domain.
We propose the ProtoGMM model, which incorporates the GMM into contrastive losses to perform guided contrastive learning.
To achieve increased intra-class semantic similarity, decreased inter-class similarity, and domain alignment between the source and target domains, we employ multi-prototype contrastive learning.
arXiv Detail & Related papers (2024-06-27T14:50:50Z)
- Generative Model Based Noise Robust Training for Unsupervised Domain Adaptation [108.11783463263328]
This paper proposes a Generative model-based Noise-Robust Training method (GeNRT), which eliminates domain shift while mitigating label noise.
Experiments on Office-Home, PACS, and Digit-Five show that our GeNRT achieves comparable performance to state-of-the-art methods.
arXiv Detail & Related papers (2023-03-10T06:43:55Z)
- Unsupervised Domain Adaptation via Distilled Discriminative Clustering [45.39542287480395]
We re-cast the domain adaptation problem as discriminative clustering of target data.
We propose to jointly train the network using parallel, supervised learning objectives over labeled source data.
We conduct careful ablation studies and extensive experiments on five popular benchmark datasets.
arXiv Detail & Related papers (2023-02-23T13:03:48Z)
- Polycentric Clustering and Structural Regularization for Source-free Unsupervised Domain Adaptation [20.952542421577487]
Source-Free Domain Adaptation (SFDA) aims to solve the domain adaptation problem by transferring the knowledge learned from a pre-trained source model to an unseen target domain.
Most existing methods assign pseudo-labels to the target data by generating feature prototypes.
In this paper, a framework named PCSR is proposed to tackle SFDA via a novel intra-class Polycentric Clustering and Structural Regularization strategy.
arXiv Detail & Related papers (2022-10-14T02:20:48Z)
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention, which tries to tackle domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z)
- BMD: A General Class-balanced Multicentric Dynamic Prototype Strategy for Source-free Domain Adaptation [74.93176783541332]
Source-free Domain Adaptation (SFDA) aims to adapt a pre-trained source model to the unlabeled target domain without accessing the well-labeled source data.
To make up for the absence of source data, most existing methods introduce feature-prototype-based pseudo-labeling strategies.
We propose a general class-Balanced Multicentric Dynamic prototype strategy for the SFDA task.
arXiv Detail & Related papers (2022-04-06T13:23:02Z)
- Inferring Latent Domains for Unsupervised Deep Domain Adaptation [54.963823285456925]
Unsupervised Domain Adaptation (UDA) refers to the problem of learning a model in a target domain where labeled data are not available.
This paper introduces a novel deep architecture which addresses the problem of UDA by automatically discovering latent domains in visual datasets.
We evaluate our approach on publicly available benchmarks, showing that it outperforms state-of-the-art domain adaptation methods.
arXiv Detail & Related papers (2021-03-25T14:33:33Z)
- Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer [137.36099660616975]
Unsupervised domain adaptation (UDA) aims to transfer knowledge from a related but different well-labeled source domain to a new unlabeled target domain.
Most existing UDA methods require access to the source data, and thus are not applicable when the data are confidential and not shareable due to privacy concerns.
This paper aims to tackle a realistic setting in which only a classification model trained on the source domain is available, instead of access to the source data itself.
arXiv Detail & Related papers (2020-12-14T07:28:50Z)
- Learning to Cluster under Domain Shift [20.00056591000625]
In this work we address the problem of transferring knowledge from a source to a target domain when both source and target data have no annotations.
Inspired by recent works on deep clustering, our approach leverages information from data gathered from multiple source domains.
We show that our method is able to automatically discover relevant semantic information even in presence of few target samples.
arXiv Detail & Related papers (2020-08-11T12:03:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.