Cluster-driven Graph Federated Learning over Multiple Domains
- URL: http://arxiv.org/abs/2104.14628v1
- Date: Thu, 29 Apr 2021 19:31:19 GMT
- Title: Cluster-driven Graph Federated Learning over Multiple Domains
- Authors: Debora Caldarola, Massimiliano Mancini, Fabio Galasso, Marco Ciccone,
Emanuele Rodolà, Barbara Caputo
- Abstract summary: Federated Learning (FL) deals with learning a central model (i.e. the server) in privacy-constrained scenarios.
Here we propose a novel Cluster-driven Graph Federated Learning (FedCG).
- Score: 25.51716405561116
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated Learning (FL) deals with learning a central model (i.e. the server)
in privacy-constrained scenarios, where data are stored on multiple devices
(i.e. the clients). The central model has no direct access to the data, but
only to the updates of the parameters computed locally by each client. This
raises a problem, known as statistical heterogeneity, because the clients may
have different data distributions (i.e. domains). This is only partly
alleviated by clustering the clients. Clustering may reduce heterogeneity by
identifying the domains, but it deprives each cluster model of the data and
supervision of others. Here we propose a novel Cluster-driven Graph Federated
Learning (FedCG). In FedCG, clustering serves to address statistical
heterogeneity, while Graph Convolutional Networks (GCNs) enable sharing
knowledge across clusters. FedCG: i) identifies the domains via an FL-compliant
clustering and instantiates domain-specific modules (residual branches) for
each domain; ii) connects the domain-specific modules through a GCN at training
to learn the interactions among domains and share knowledge; and iii) learns to
cluster unsupervised via teacher-student classifier-training iterations and to
address novel unseen test domains via their domain soft-assignment scores.
Thanks to the unique interplay of GCNs over clusters, FedCG achieves the
state-of-the-art on multiple FL benchmarks.
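Based only on the abstract, here is a minimal sketch of how steps i)-iii) could fit together: domain-specific residual branches mixed by a GCN over the domain graph, weighted by soft domain-assignment scores. All names and sizes (FedCGSketch, domain_scores, the fully connected adjacency) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of the FedCG ideas from the abstract: domain-specific
# residual branches, a GCN connecting them, and soft domain assignment.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FedCGSketch(nn.Module):
    def __init__(self, feat_dim=512, num_domains=4, num_classes=10):
        super().__init__()
        self.backbone = nn.Linear(feat_dim, feat_dim)   # stand-in for a CNN
        # i) one residual branch per discovered domain (cluster)
        self.branches = nn.ModuleList(
            nn.Linear(feat_dim, feat_dim) for _ in range(num_domains))
        # ii) a one-layer GCN over a fully connected domain graph (assumed)
        self.adj = torch.full((num_domains, num_domains), 1.0 / num_domains)
        self.gcn = nn.Linear(feat_dim, feat_dim)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x, domain_scores):
        # domain_scores: (K,) soft assignment of this client/batch to domains,
        # e.g. produced by the teacher-student clustering described in iii)
        h = F.relu(self.backbone(x))
        branch_out = torch.stack([b(h) for b in self.branches], dim=0)  # (K,B,D)
        # message passing: each domain branch aggregates the others' outputs
        mixed = F.relu(self.gcn(torch.einsum('kj,jbd->kbd', self.adj, branch_out)))
        # residual combination weighted by the soft domain assignment
        out = h + torch.einsum('k,kbd->bd', domain_scores, mixed)
        return self.classifier(out)

model = FedCGSketch()
logits = model(torch.randn(8, 512), F.softmax(torch.randn(4), dim=0))
```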
Related papers
- Federated Clustering: An Unsupervised Cluster-Wise Training for Decentralized Data Distributions [1.6385815610837167]
Federated Cluster-Wise Refinement (FedCRef) involves clients that collaboratively train models on clusters with similar data distributions.
In these groups, clients collaboratively train a shared model representing each data distribution, while continuously refining their local clusters to enhance data association accuracy.
This iterative process allows our system to identify all potential data distributions across the network and develop robust representation models for each.
arXiv Detail & Related papers (2024-08-20T09:05:44Z)
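A toy sketch of the per-cluster aggregation that the FedCRef summary above describes: clients grouped by their current cluster assignment, with one federated average per group. The function name and the flat list-of-floats weight format are hypothetical simplifications.

```python
# Hypothetical sketch of cluster-wise federated averaging: one shared model
# per cluster of similarly-distributed clients (names are illustrative).
from collections import defaultdict

def cluster_fedavg(client_weights, client_cluster):
    """client_weights: {client_id: {param_name: list of floats}}
       client_cluster: {client_id: cluster_id} from the current refinement step."""
    groups = defaultdict(list)
    for cid, w in client_weights.items():
        groups[client_cluster[cid]].append(w)
    cluster_models = {}
    for k, ws in groups.items():
        # plain average of each parameter across the cluster's clients
        cluster_models[k] = {
            name: [sum(vals) / len(ws) for vals in zip(*(w[name] for w in ws))]
            for name in ws[0]
        }
    return cluster_models

models = cluster_fedavg(
    {"a": {"w": [1.0, 2.0]}, "b": {"w": [3.0, 4.0]}, "c": {"w": [10.0, 10.0]}},
    {"a": 0, "b": 0, "c": 1})
# models[0]["w"] == [2.0, 3.0], models[1]["w"] == [10.0, 10.0]
```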
- Federated Graph Learning with Structure Proxy Alignment [43.13100155569234]
Federated Graph Learning (FGL) aims to learn graph learning models over graph data distributed across multiple data owners.
We propose FedSpray, a novel FGL framework that learns local class-wise structure proxies in the latent space.
Our goal is to obtain the aligned structure proxies that can serve as reliable, unbiased neighboring information for node classification.
arXiv Detail & Related papers (2024-08-18T07:32:54Z)
- FedCCL: Federated Dual-Clustered Feature Contrast Under Domain Heterogeneity [43.71967577443732]
Federated learning (FL) facilitates a privacy-preserving neural network training paradigm through collaboration between edge clients and a central server.
Recent research is limited to simply using averaged signals as a form of regularization and focuses on only one aspect of these non-IID challenges.
We propose a dual-clustered feature contrast-based FL framework with dual focuses.
arXiv Detail & Related papers (2024-04-14T13:56:30Z)
- Federated Generalized Category Discovery [68.35420359523329]
Generalized category discovery (GCD) aims at grouping unlabeled samples from known and unknown classes.
To meet the recent decentralization trend in the community, we introduce a practical yet challenging task, namely Federated GCD (Fed-GCD).
The goal of Fed-GCD is to train a generic GCD model through client collaboration under privacy-preserving constraints.
arXiv Detail & Related papers (2023-05-23T14:27:41Z)
- Upcycling Models under Domain and Category Shift [95.22147885947732]
We introduce an innovative global and local clustering learning technique (GLC).
We design a novel, adaptive one-vs-all global clustering algorithm to achieve the distinction across different target classes.
Remarkably, in the most challenging open-partial-set DA scenario, GLC outperforms UMAD by 14.8% on the VisDA benchmark.
arXiv Detail & Related papers (2023-03-13T13:44:04Z)
- Efficient Distribution Similarity Identification in Clustered Federated Learning via Principal Angles Between Client Data Subspaces [59.33965805898736]
Clustered federated learning has been shown to produce promising results by grouping clients into clusters.
Existing FL algorithms essentially try to group together clients with similar data distributions, but prior methods infer these similarities only indirectly during training.
arXiv Detail & Related papers (2022-09-21T17:37:54Z)
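Principal angles between two clients' data subspaces have a standard computation: the cosines of the angles are the singular values of the product of orthonormal bases of the two subspaces. A small numpy sketch follows; taking the top-p left singular vectors as each basis is an assumption about the paper's exact recipe.

```python
import numpy as np

def principal_angles(X1, X2, p=3):
    """Principal angles (radians) between the rank-p subspaces spanned by
    the columns of two clients' data matrices X1, X2 (features x samples)."""
    U1, _, _ = np.linalg.svd(X1, full_matrices=False)
    U2, _, _ = np.linalg.svd(X2, full_matrices=False)
    # cosines of the principal angles are the singular values of U1^T U2
    s = np.linalg.svd(U1[:, :p].T @ U2[:, :p], compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

rng = np.random.default_rng(0)
a, b = rng.normal(size=(64, 200)), rng.normal(size=(64, 200))
print(principal_angles(a, b))   # dissimilar clients: angles near pi/2
print(principal_angles(a, a))   # identical subspaces: angles near 0
```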
- FedILC: Weighted Geometric Mean and Invariant Gradient Covariance for Federated Learning on Non-IID Data [69.0785021613868]
Federated learning is a distributed machine learning approach in which a shared server model learns by aggregating parameter updates computed locally on the training data of spatially distributed client silos.
We propose the Federated Invariant Learning Consistency (FedILC) approach, which leverages the gradient covariance and the geometric mean of Hessians to capture both inter-silo and intra-silo consistencies.
This is relevant to various fields such as medical healthcare, computer vision, and the Internet of Things (IoT).
arXiv Detail & Related papers (2022-05-19T03:32:03Z)
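As a rough illustration of FedILC's "weighted geometric mean", the sketch below averages log-magnitudes of per-silo gradients and keeps only coordinates whose sign agrees across silos, a common way to make the geometric mean well defined; whether this matches the paper's exact aggregation rule is an assumption.

```python
import numpy as np

def weighted_geometric_mean(grads, weights):
    """grads: (num_silos, dim) per-silo gradients; weights sum to 1.
    Element-wise weighted geometric mean of magnitudes, keeping only the
    coordinates on which every silo agrees in sign (otherwise 0)."""
    g = np.asarray(grads, dtype=float)
    w = np.asarray(weights, dtype=float)[:, None]
    signs = np.sign(g)
    agree = np.all(signs == signs[0], axis=0)       # sign-consistent coords
    logmag = np.sum(w * np.log(np.abs(g) + 1e-12), axis=0)
    return np.where(agree, signs[0] * np.exp(logmag), 0.0)

g = [[0.2, -0.5, 0.1], [0.8, -0.1, -0.3]]
print(weighted_geometric_mean(g, [0.5, 0.5]))  # approx [0.4, -0.224, 0.0]
```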
- Federated Learning with Domain Generalization [11.92860245410696]
Federated Learning enables a group of clients to jointly train a machine learning model with the help of a centralized server.
In practice, the model trained over multiple source domains may have poor generalization performance on unseen target domains.
We propose FedADG to equip federated learning with domain generalization capability.
arXiv Detail & Related papers (2021-11-20T01:02:36Z)
- Cross-domain Contrastive Learning for Unsupervised Domain Adaptation [108.63914324182984]
Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a fully-labeled source domain to a different unlabeled target domain.
We build upon contrastive self-supervised learning to align features so as to reduce the domain discrepancy between training and testing sets.
arXiv Detail & Related papers (2021-06-10T06:32:30Z)
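A compact sketch of a generic cross-domain InfoNCE-style loss in the spirit of the entry above: target features are pulled toward source features sharing a (pseudo-)label and pushed from the rest. The use of target pseudo-labels and all names here are assumptions, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def cross_domain_contrastive_loss(f_src, y_src, f_tgt, y_tgt_pseudo, tau=0.1):
    """f_src/f_tgt: (Ns,D)/(Nt,D) features; y_tgt_pseudo are pseudo-labels
    on the unlabeled target domain (an assumption of this sketch)."""
    f_src = F.normalize(f_src, dim=1)
    f_tgt = F.normalize(f_tgt, dim=1)
    sim = f_tgt @ f_src.T / tau                    # (Nt, Ns) cross-domain sims
    pos = (y_tgt_pseudo[:, None] == y_src[None, :]).float()
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # average log-likelihood of same-class source samples per target sample
    return -(pos * log_prob).sum(1).div(pos.sum(1).clamp(min=1)).mean()

loss = cross_domain_contrastive_loss(
    torch.randn(16, 128), torch.randint(0, 10, (16,)),
    torch.randn(12, 128), torch.randint(0, 10, (12,)))
```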
- Exploring Category-Agnostic Clusters for Open-Set Domain Adaptation [138.29273453811945]
We present Self-Ensembling with Category-agnostic Clusters (SE-CC), a novel architecture that steers domain adaptation with category-agnostic clusters in the target domain.
Clustering is performed over all the unlabeled target samples to obtain the category-agnostic clusters, which reveal the underlying data-space structure peculiar to the target domain.
arXiv Detail & Related papers (2020-06-11T16:19:02Z)
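The category-agnostic clusters in the SE-CC summary above amount to unsupervised clustering over all unlabeled target features; a minimal scikit-learn sketch, where the choice of k-means and the cluster count are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical target-domain features from a backbone (no labels used).
target_feats = np.random.default_rng(0).normal(size=(1000, 256))

# Category-agnostic clusters: k need not match the number of classes.
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(target_feats)
cluster_ids = kmeans.labels_           # per-sample cluster assignment
centroids = kmeans.cluster_centers_    # can steer adaptation, e.g. as targets
```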
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.