Federated Generalized Category Discovery
- URL: http://arxiv.org/abs/2305.14107v1
- Date: Tue, 23 May 2023 14:27:41 GMT
- Title: Federated Generalized Category Discovery
- Authors: Nan Pu and Zhun Zhong and Xinyuan Ji and Nicu Sebe
- Abstract summary: Generalized category discovery (GCD) aims at grouping unlabeled samples from known and unknown classes.
To meet the recent decentralization trend in the community, we introduce a practical yet challenging task, namely Federated GCD (Fed-GCD).
The goal of Fed-GCD is to train a generic GCD model by client collaboration under the privacy-protected constraint.
- Score: 68.35420359523329
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generalized category discovery (GCD) aims at grouping unlabeled samples from
known and unknown classes, given labeled data of known classes. To meet the
recent decentralization trend in the community, we introduce a practical yet
challenging task, namely Federated GCD (Fed-GCD), where the training data are
distributively stored in local clients and cannot be shared among clients. The
goal of Fed-GCD is to train a generic GCD model by client collaboration under
the privacy-protected constraint. Fed-GCD poses two challenges: 1)
representation degradation caused by training each client model with less data
than in centralized GCD learning, and 2) highly heterogeneous label spaces across
different clients. To this end, we propose a novel Associated Gaussian
Contrastive Learning (AGCL) framework based on learnable GMMs, which consists
of a Client Semantics Association (CSA) and a global-local GMM Contrastive
Learning (GCL). On the server, CSA aggregates the heterogeneous categories of
local-client GMMs to generate a global GMM containing more comprehensive
category knowledge. On each client, GCL builds class-level contrastive learning
with both local and global GMMs. The local GCL learns robust representations
from limited local data. The global GCL encourages the model to produce more
discriminative representations using the comprehensive category relationships
that may not exist in local data. We build a benchmark based on six visual
datasets to facilitate the study of Fed-GCD. Extensive experiments show that
our AGCL outperforms the FedAvg-based baseline on all datasets.
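To make the two ingredients above more concrete, the sketch below illustrates, in plain Python, one way such a pipeline could look: clients share only GMM parameters with the server, the server merges similar components into a global GMM (a simple stand-in for CSA), and each client applies a class-level contrastive loss against the Gaussian means of a local or global GMM (a stand-in for GCL). The greedy similarity-based merging, the thresholds, and the prototype-style loss are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of the two AGCL ingredients described in the abstract:
# (1) server-side association of client GMMs into a global GMM (stand-in for CSA),
# (2) a class-level contrastive loss against Gaussian means (stand-in for GCL).
# Component names, thresholds, and the greedy merging rule are illustrative only.
import numpy as np
import torch
import torch.nn.functional as F


def associate_client_gmms(client_means, client_covs, client_weights, sim_thresh=0.8):
    """Greedily merge Gaussian components from all clients whose (L2-normalised)
    means are similar, yielding a 'global GMM' (means, diagonal covs, weights).
    This is a simple stand-in for the paper's Client Semantics Association."""
    means = np.concatenate(client_means, axis=0)      # (K_total, D)
    covs = np.concatenate(client_covs, axis=0)        # (K_total, D) diagonal covariances
    weights = np.concatenate(client_weights, axis=0)  # (K_total,)
    normed = means / np.linalg.norm(means, axis=1, keepdims=True)

    merged_means, merged_covs, merged_weights = [], [], []
    used = np.zeros(len(means), dtype=bool)
    for i in range(len(means)):
        if used[i]:
            continue
        sims = normed @ normed[i]
        group = (~used) & (sims >= sim_thresh)        # components judged to be the same category
        used |= group
        w = weights[group] / weights[group].sum()
        merged_means.append((w[:, None] * means[group]).sum(0))
        merged_covs.append((w[:, None] * covs[group]).sum(0))
        merged_weights.append(weights[group].sum())
    w = np.array(merged_weights)
    return np.stack(merged_means), np.stack(merged_covs), w / w.sum()


def gmm_contrastive_loss(features, assignments, gmm_means, temperature=0.1):
    """Class-level contrastive loss: pull each feature towards the mean of its
    assigned Gaussian and push it away from the other means. Applying this with
    the local and with the global GMM gives local/global terms in the spirit of
    the abstract (the paper's exact formulation may differ)."""
    feats = F.normalize(features, dim=1)    # (B, D)
    protos = F.normalize(gmm_means, dim=1)  # (K, D)
    logits = feats @ protos.t() / temperature
    return F.cross_entropy(logits, assignments)
```

In a full system, each client would fit its GMM locally (e.g., with scikit-learn's GaussianMixture), upload only the component parameters, and receive the merged global parameters back for the global contrastive term, so raw samples never leave the client.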
Related papers
- FedCCL: Federated Dual-Clustered Feature Contrast Under Domain Heterogeneity [43.71967577443732]
Federated learning (FL) facilitates a privacy-preserving neural network training paradigm through collaboration between edge clients and a central server.
Recent research has been limited to simply using averaged signals as a form of regularization, focusing on only one aspect of the non-IID challenges posed by heterogeneous client data.
We propose a dual-clustered feature contrast-based FL framework with dual focuses.
arXiv Detail & Related papers (2024-04-14T13:56:30Z)
- GLC++: Source-Free Universal Domain Adaptation through Global-Local Clustering and Contrastive Affinity Learning [84.54244771470012]
Source-Free Universal Domain Adaptation (SF-UniDA) aims to accurately classify "known" data belonging to common categories and to separate it from "unknown" data.
We propose a novel Global and Local Clustering (GLC) technique, which comprises an adaptive one-vs-all global clustering algorithm.
We evolve GLC to GLC++, integrating a contrastive affinity learning strategy.
arXiv Detail & Related papers (2024-03-21T13:57:45Z)
- Generalized Category Discovery with Clustering Assignment Consistency [56.92546133591019]
Generalized category discovery (GCD) is a recently proposed open-world task.
We propose a co-training-based framework that encourages clustering consistency.
Our method achieves state-of-the-art performance on three generic benchmarks and three fine-grained visual recognition datasets.
arXiv Detail & Related papers (2023-10-30T00:32:47Z)
- Federated cINN Clustering for Accurate Clustered Federated Learning [33.72494731516968]
Federated Learning (FL) presents an innovative approach to privacy-preserving distributed machine learning.
We propose the Federated cINN Clustering Algorithm (FCCA) to robustly cluster clients into different groups.
arXiv Detail & Related papers (2023-09-04T10:47:52Z)
- CLUSTSEG: Clustering for Universal Segmentation [56.58677563046506]
CLUSTSEG is a general, transformer-based framework for image segmentation.
It tackles different image segmentation tasks (i.e., superpixel, semantic, instance, and panoptic) through a unified neural clustering scheme.
arXiv Detail & Related papers (2023-05-03T15:31:16Z)
- FedPNN: One-shot Federated Classification via Evolving Clustering Method and Probabilistic Neural Network hybrid [4.241208172557663]
We propose a two-stage federated learning approach aimed at privacy protection.
In the first stage, a synthetic dataset is generated by employing two different distributions as noise.
In the second stage, the Federated Probabilistic Neural Network (FedPNN) is developed and employed to build a globally shared classification model.
arXiv Detail & Related papers (2023-04-09T03:23:37Z)
- Dynamic Conceptional Contrastive Learning for Generalized Category Discovery [76.82327473338734]
Generalized category discovery (GCD) aims to automatically cluster partially labeled data.
Unlabeled data contain instances that are not only from known categories of the labeled data but also from novel categories.
One effective way to approach GCD is to apply self-supervised learning to learn discriminative representations for unlabeled data.
We propose a Dynamic Conceptional Contrastive Learning framework, which can effectively improve clustering accuracy.
arXiv Detail & Related papers (2023-03-30T14:04:39Z)
- You Never Cluster Alone [150.94921340034688]
We extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data assigned to the same cluster contribute to a unified representation.
A set of categorical variables serves as clustering assignment confidence, linking the instance-level learning track with the cluster-level one.
By reparametrizing the assignment variables, TCC is trained end-to-end, requiring no alternating steps (a generic sketch of this reparametrization idea appears after the list below).
arXiv Detail & Related papers (2021-06-03T14:59:59Z)
- Cluster-driven Graph Federated Learning over Multiple Domains [25.51716405561116]
Federated Learning (FL) deals with learning a central model (i.e., the server) in privacy-constrained scenarios.
Here we propose a novel Cluster-driven Graph Federated Learning (FedCG) approach.
arXiv Detail & Related papers (2021-04-29T19:31:19Z)
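As referenced in the "You Never Cluster Alone" entry above, a common way to reparametrize discrete clustering-assignment variables so that training stays end-to-end is the Gumbel-softmax relaxation. The sketch below is a generic illustration of that idea (soft assignments plus cluster-level representations), not TCC's exact objective; the function names and the simple weighted-mean aggregation are assumptions.

```python
# Generic illustration of differentiable (reparametrized) cluster assignments,
# in the spirit of cluster-level contrastive learning; not TCC's exact loss.
import torch
import torch.nn.functional as F


def soft_cluster_assignments(features, cluster_centers, tau=0.5):
    """Relaxed one-hot assignments (B, K) sampled with the Gumbel-softmax trick,
    so gradients flow through the assignment step (no alternating optimization)."""
    logits = F.normalize(features, dim=1) @ F.normalize(cluster_centers, dim=1).t()
    return F.gumbel_softmax(logits, tau=tau, hard=False, dim=-1)


def cluster_level_representations(features, assignments, eps=1e-8):
    """Aggregate all samples softly assigned to the same cluster into one
    representation (a weighted mean), which two augmented views can then
    contrast at the cluster level."""
    weights = assignments / (assignments.sum(dim=0, keepdim=True) + eps)  # (B, K)
    return weights.t() @ features                                         # (K, D)
```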