Taming Cross-Domain Representation Variance in Federated Prototype Learning with Heterogeneous Data Domains
- URL: http://arxiv.org/abs/2403.09048v2
- Date: Sun, 27 Oct 2024 23:21:07 GMT
- Title: Taming Cross-Domain Representation Variance in Federated Prototype Learning with Heterogeneous Data Domains
- Authors: Lei Wang, Jieming Bian, Letian Zhang, Chen Chen, Jie Xu
- Abstract summary: Federated learning (FL) allows collaborative machine learning training without sharing private data.
While most FL methods assume identical data domains across clients, real-world scenarios often involve heterogeneous data domains.
We introduce FedPLVM, which establishes variance-aware dual-level prototypes clustering and employs a novel $\alpha$-sparsity prototype loss.
- Score: 8.047147770476212
- Abstract: Federated learning (FL) allows collaborative machine learning training without sharing private data. While most FL methods assume identical data domains across clients, real-world scenarios often involve heterogeneous data domains. Federated Prototype Learning (FedPL) addresses this issue, using mean feature vectors as prototypes to enhance model generalization. However, existing FedPL methods create the same number of prototypes for each client, leading to cross-domain performance gaps and disparities for clients with varied data distributions. To mitigate cross-domain feature representation variance, we introduce FedPLVM, which establishes variance-aware dual-level prototypes clustering and employs a novel $\alpha$-sparsity prototype loss. The dual-level prototypes clustering strategy creates local clustered prototypes based on private data features, then performs global prototypes clustering to reduce communication complexity and preserve local data privacy. The $\alpha$-sparsity prototype loss aligns samples from underrepresented domains, enhancing intra-class similarity and reducing inter-class similarity. Evaluations on Digit-5, Office-10, and DomainNet datasets demonstrate our method's superiority over existing approaches.
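As described in the abstract, the dual-level strategy first clusters each client's private features into local prototypes, then clusters the uploaded prototypes again on the server, so raw features never leave a client. A minimal NumPy sketch is below; note that the abstract does not name the clustering algorithm or how the variance-aware cluster counts are chosen, so plain k-means with a fixed `k` stands in for both, and all function names are illustrative:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means; returns the k cluster centroids."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids

def local_prototypes(features, labels, k):
    """Level 1 (client side): cluster each class's private features
    into k local prototypes instead of a single class mean."""
    return {int(c): kmeans(features[labels == c], k)
            for c in np.unique(labels)}

def global_prototypes(client_protos, k):
    """Level 2 (server side): pool the uploaded local prototypes per
    class and cluster again, reducing communication and keeping
    raw data private."""
    merged = {}
    for protos in client_protos:
        for c, p in protos.items():
            merged.setdefault(c, []).append(p)
    return {c: kmeans(np.vstack(ps), k) for c, ps in merged.items()}
```

Because only prototypes (a handful of d-dimensional vectors per class) travel to the server, the second-level clustering sees far fewer points than the union of all client features.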
Related papers
- Deep Domain Isolation and Sample Clustered Federated Learning for Semantic Segmentation [2.515027627030043]
In this paper, we explore for the first time the effect of covariate shifts between participants' data in 2D segmentation tasks.
We develop Deep Domain Isolation (DDI) to isolate image domains directly in the gradient space of the model.
We leverage this clustering algorithm through a Sample Clustered Federated Learning (SCFL) framework.
arXiv Detail & Related papers (2024-10-04T12:43:07Z)
- An Enhanced Federated Prototype Learning Method under Domain Shift [36.73020712815063]
Federated Learning (FL) allows collaborative machine learning training without sharing private data.
A recent paper introduces variance-aware dual-level prototype clustering and uses a novel $\alpha$-sparsity prototype loss.
Evaluations on the Digit-5, Office-10, and DomainNet datasets show that our method performs better than existing approaches.
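Neither abstract gives the formula for the $\alpha$-sparsity prototype loss, so the sketch below is an assumption rather than the paper's definition: a standard prototype-contrastive cross-entropy in which cosine similarities are raised to a power $\alpha$ before the softmax, which is one plausible way to reshape the similarity distribution toward higher intra-class and lower inter-class similarity:

```python
import numpy as np

def alpha_sparsity_proto_loss(z, protos, y, alpha=0.5, tau=0.1):
    """Hypothetical sketch: prototype-contrastive cross-entropy with an
    alpha exponent on similarities.
    z: (B, d) sample features, protos: (C, d) one prototype per class,
    y: (B,) integer class labels."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    p = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    sim = z @ p.T                               # cosine similarity
    sim = np.sign(sim) * np.abs(sim) ** alpha   # alpha-exponent reshaping
    logits = sim / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(y)), y].mean()
```

The loss is small when each sample's feature aligns with its own class prototype and large when it aligns with another class's prototype.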
arXiv Detail & Related papers (2024-09-27T09:28:27Z)
- Semi-supervised Domain Adaptation via Prototype-based Multi-level Learning [4.232614032390374]
In semi-supervised domain adaptation (SSDA), a few labeled target samples of each class help the model to transfer knowledge representation from the fully labeled source domain to the target domain.
We propose a Prototype-based Multi-level Learning (ProML) framework to better tap the potential of labeled target samples.
arXiv Detail & Related papers (2023-05-04T10:09:30Z)
- Prototype Helps Federated Learning: Towards Faster Convergence [38.517903009319994]
Federated learning (FL) is a distributed machine learning technique in which multiple clients cooperate to train a shared model without exchanging their raw data.
In this paper, a prototype-based federated learning framework is proposed, which can achieve better inference performance with only a few changes to the last global iteration of the typical federated learning process.
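The entry above describes the general prototype-exchange pattern common to this line of work: clients send per-class mean feature vectors instead of raw data, and the server averages them class by class. A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def client_prototypes(features, labels):
    """Each client computes one mean feature vector per class it holds;
    only these prototypes are sent to the server, not raw data."""
    return {int(c): features[labels == c].mean(axis=0)
            for c in np.unique(labels)}

def aggregate(all_protos):
    """Server averages prototypes class by class; classes a client
    lacks simply do not contribute to that class's global prototype."""
    pooled = {}
    for protos in all_protos:
        for c, p in protos.items():
            pooled.setdefault(c, []).append(p)
    return {c: np.mean(ps, axis=0) for c, ps in pooled.items()}
```

Clients can then regularize local training so their features stay close to the returned global prototypes, which is what improves inference without changing most of the FL pipeline.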
arXiv Detail & Related papers (2023-03-22T04:06:29Z)
- A Prototype-Oriented Clustering for Domain Shift with Source Privacy [66.67700676888629]
We introduce Prototype-oriented Clustering with Distillation (PCD) to improve the performance and applicability of existing methods.
PCD first constructs a source clustering model by aligning the distributions of prototypes and data.
It then distills the knowledge to the target model through cluster labels provided by the source model while simultaneously clustering the target data.
arXiv Detail & Related papers (2023-02-08T00:15:35Z)
- Synthetic-to-Real Domain Generalized Semantic Segmentation for 3D Indoor Point Clouds [69.64240235315864]
This paper introduces the synthetic-to-real domain generalization setting to this task.
The domain gap between synthetic and real-world point cloud data mainly lies in the different layouts and point patterns.
Experiments on the synthetic-to-real benchmark demonstrate that both CINMix and multi-prototypes can narrow the distribution gap.
arXiv Detail & Related papers (2022-12-09T05:07:43Z)
- Polycentric Clustering and Structural Regularization for Source-free Unsupervised Domain Adaptation [20.952542421577487]
Source-Free Domain Adaptation (SFDA) aims to solve the domain adaptation problem by transferring the knowledge learned from a pre-trained source model to an unseen target domain.
Most existing methods assign pseudo-labels to the target data by generating feature prototypes.
In this paper, a novel framework named PCSR is proposed to tackle SFDA via a novel intra-class Polycentric Clustering and Structural Regularization strategy.
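The baseline step the entry mentions, assigning pseudo-labels to unlabeled target data via feature prototypes, can be sketched as a nearest-prototype classifier; PCSR's polycentric clustering and structural regularization refine this starting point in ways the summary does not detail, so only the common baseline is shown:

```python
import numpy as np

def pseudo_labels(target_feats, prototypes):
    """Assign each unlabeled target feature the class of its nearest
    prototype under cosine similarity.
    target_feats: (N, d), prototypes: (C, d) -> (N,) class indices."""
    t = target_feats / np.linalg.norm(target_feats, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    return (t @ p.T).argmax(axis=1)
```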
arXiv Detail & Related papers (2022-10-14T02:20:48Z)
- BMD: A General Class-balanced Multicentric Dynamic Prototype Strategy for Source-free Domain Adaptation [74.93176783541332]
Source-free Domain Adaptation (SFDA) aims to adapt a pre-trained source model to the unlabeled target domain without accessing the well-labeled source data.
To make up for the absence of source data, most existing methods introduced feature prototype based pseudo-labeling strategies.
We propose a general class-Balanced Multicentric Dynamic prototype strategy for the SFDA task.
arXiv Detail & Related papers (2022-04-06T13:23:02Z)
- Federated and Generalized Person Re-identification through Domain and Feature Hallucinating [88.77196261300699]
We study the problem of federated domain generalization (FedDG) for person re-identification (re-ID).
We propose a novel method, called "Domain and Feature Hallucinating (DFH)", to produce diverse features for learning generalized local and global models.
Our method achieves the state-of-the-art performance for FedDG on four large-scale re-ID benchmarks.
arXiv Detail & Related papers (2022-03-05T09:15:13Z)
- Attentive Prototypes for Source-free Unsupervised Domain Adaptive 3D Object Detection [85.11649974840758]
3D object detection networks tend to be biased towards the data they are trained on.
We propose a single-frame approach for source-free, unsupervised domain adaptation of lidar-based 3D object detectors.
arXiv Detail & Related papers (2021-11-30T18:42:42Z)
- Cross-domain Detection via Graph-induced Prototype Alignment [114.8952035552862]
We propose a Graph-induced Prototype Alignment (GPA) framework to seek for category-level domain alignment.
In addition, in order to alleviate the negative effect of class-imbalance on domain adaptation, we design a Class-reweighted Contrastive Loss.
Our approach outperforms existing methods with a remarkable margin.
arXiv Detail & Related papers (2020-03-28T17:46:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.