Federated Continual Novel Class Learning
- URL: http://arxiv.org/abs/2312.13500v1
- Date: Thu, 21 Dec 2023 00:31:54 GMT
- Title: Federated Continual Novel Class Learning
- Authors: Lixu Wang, Chenxi Liu, Junfeng Guo, Jiahua Dong, Xiao Wang, Heng
Huang, Qi Zhu
- Abstract summary: We propose a Global Alignment Learning (GAL) framework that can accurately estimate the global novel class number.
GAL achieves significant improvements in novel-class performance, increasing accuracy by 5.1% to 10.6%.
GAL is shown to be effective in equipping a variety of mainstream Federated Learning algorithms with novel class discovery and learning capability.
- Score: 68.05835753892907
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In a privacy-focused era, Federated Learning (FL) has emerged as a promising
machine learning technique. However, most existing FL studies assume that the
data distribution remains nearly fixed over time, while real-world scenarios
often involve dynamic and continual changes. To equip FL systems with continual
model evolution capabilities, we focus on an important problem called Federated
Continual Novel Class Learning (FedCN) in this work. The biggest challenge in
FedCN is to merge and align novel classes that are discovered and learned by
different clients without compromising privacy. To address this, we propose a
Global Alignment Learning (GAL) framework that can accurately estimate the
global novel class number and provide effective guidance for local training
from a global perspective, all while maintaining privacy protection.
Specifically, GAL first locates high-density regions in the representation
space through a bi-level clustering mechanism to estimate the novel class
number, with which the global prototypes corresponding to novel classes can be
constructed. Then, GAL uses a novel semantic weighted loss to capture all
possible correlations between these prototypes and the training data for
mitigating the impact of pseudo-label noise and data heterogeneity. Extensive
experiments on various datasets demonstrate GAL's superior performance over
state-of-the-art novel class discovery methods. In particular, GAL achieves
significant improvements in novel-class performance, increasing the accuracy by
5.1% to 10.6% in the case of one novel class learning stage and by 7.8% to
17.9% in the case of two novel class learning stages, without sacrificing
known-class performance. Moreover, GAL is shown to be effective in equipping a
variety of different mainstream FL algorithms with novel class discovery and
learning capability, highlighting its potential for many real-world
applications.
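The bi-level clustering idea in the abstract can be illustrated with a minimal sketch: each client clusters its own unlabeled embeddings and uploads only centroids (raw data never leaves the device), and the server then runs density-based clustering over the uploaded centroids so that high-density regions yield an estimated global novel class number and global prototypes. The specific algorithms (KMeans, DBSCAN), hyperparameters, and the synthetic data below are illustrative assumptions, not GAL's actual procedure.

```python
# Hedged sketch of bi-level clustering for global novel-class-number estimation.
# All names and hyperparameters here are illustrative, not the paper's own.
import numpy as np
from sklearn.cluster import KMeans, DBSCAN

rng = np.random.default_rng(0)

# Level 1 (clients): each client clusters its local embeddings and uploads
# only the cluster centroids, preserving privacy of the raw samples.
def client_prototypes(embeddings, k=6):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(embeddings)
    return km.cluster_centers_

# Simulate 3 clients whose unlabeled data comes from 4 underlying novel classes.
true_centers = rng.normal(size=(4, 16)) * 5.0
clients = [np.vstack([c + rng.normal(scale=0.3, size=(50, 16))
                      for c in true_centers]) for _ in range(3)]
uploaded = np.vstack([client_prototypes(e, k=6) for e in clients])

# Level 2 (server): density-based clustering over the uploaded centroids;
# each high-density region is taken to be one global novel class.
db = DBSCAN(eps=2.0, min_samples=2).fit(uploaded)
labels = db.labels_
novel_class_count = len(set(labels)) - (1 if -1 in labels else 0)
global_prototypes = np.array([uploaded[labels == c].mean(axis=0)
                              for c in range(novel_class_count)])
print(novel_class_count, global_prototypes.shape)
```

Because the intra-class centroid spread is much smaller than the inter-class separation, the server-side density clustering recovers the underlying class count without ever seeing raw client data.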
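The semantic weighted loss can likewise be sketched as a soft-target objective: rather than committing to a single (possibly noisy) hard pseudo-label, every global prototype contributes to each sample's loss, weighted by its semantic similarity to the sample. The cosine-softmax weighting and temperature below are assumptions for illustration, not the exact loss from the paper.

```python
# Hedged sketch of a "semantic weighted" objective over novel-class prototypes.
import numpy as np

def semantic_weighted_loss(embeddings, logits, prototypes, tau=0.5):
    """Cross-entropy against a soft target built from prototype similarities.

    embeddings : (N, D) sample representations
    logits     : (N, C) classifier outputs over the C novel prototypes
    prototypes : (C, D) global novel-class prototypes
    """
    # Cosine similarity between each sample and every prototype.
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = e @ p.T                                    # (N, C)
    # Soft assignment weights over all prototypes (temperature tau).
    w = np.exp(sim / tau)
    w /= w.sum(axis=1, keepdims=True)
    # Numerically stable log-softmax of the classifier logits.
    z = logits - logits.max(axis=1, keepdims=True)
    log_prob = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # Every prototype's term is kept, so one wrong hard pseudo-label
    # cannot dominate the gradient for a sample.
    return float(-(w * log_prob).sum(axis=1).mean())
```

Logits that agree with the prototype similarities incur a lower loss than misaligned ones, which is the sense in which the soft weighting dampens pseudo-label noise.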
Related papers
- Adaptive Global-Local Representation Learning and Selection for
Cross-Domain Facial Expression Recognition [54.334773598942775]
Domain shift poses a significant challenge in Cross-Domain Facial Expression Recognition (CD-FER).
We propose an Adaptive Global-Local Representation Learning and Selection framework.
arXiv Detail & Related papers (2024-01-20T02:21:41Z)
- Generalizable Heterogeneous Federated Cross-Correlation and Instance
Similarity Learning [60.058083574671834]
This paper presents FCCL+, a novel federated correlation and similarity learning method with non-target distillation.
To handle heterogeneity, we leverage irrelevant unlabeled public data for communication.
To mitigate catastrophic forgetting in the local updating stage, FCCL+ introduces Federated Non-Target Distillation.
arXiv Detail & Related papers (2023-09-28T09:32:27Z)
- Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose a new algorithm, FedCSD, which performs class-prototype similarity distillation in a federated framework to align the local and global models.
arXiv Detail & Related papers (2023-08-20T04:41:01Z)
- Class-relation Knowledge Distillation for Novel Class Discovery [16.461242381109276]
The key challenge lies in transferring the knowledge in the known-class data to the learning of novel classes.
We introduce a class relation representation for the novel classes based on the predicted class distribution of a model trained on known classes.
We propose a novel knowledge distillation framework, which utilizes our class-relation representation to regularize the learning of novel classes.
arXiv Detail & Related papers (2023-07-18T11:35:57Z)
- Towards Unbiased Training in Federated Open-world Semi-supervised
Learning [15.08153616709326]
We propose a novel Federated open-world Semi-Supervised Learning (FedoSSL) framework, which addresses the key challenge of combined distributed and open-world settings.
We adopt an uncertainty-aware suppressed loss to alleviate the biased training between locally unseen and globally unseen classes.
The proposed FedoSSL can be easily adapted to state-of-the-art FL methods, which is also validated via extensive experiments on benchmarks and real-world datasets.
arXiv Detail & Related papers (2023-05-01T11:12:37Z)
- FedRC: Tackling Diverse Distribution Shifts Challenge in Federated Learning by Robust Clustering [4.489171618387544]
Federated Learning (FL) is a machine learning paradigm that safeguards privacy by retaining client data on edge devices.
In this paper, we identify the learning challenges posed by the simultaneous occurrence of diverse distribution shifts.
We propose a novel clustering algorithm framework, dubbed FedRC, which adheres to our proposed clustering principle.
arXiv Detail & Related papers (2023-01-29T06:50:45Z)
- Integrating Local Real Data with Global Gradient Prototypes for
Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
The data samples usually follow a long-tailed distribution in the real world, and FL on the decentralized and long-tailed data yields a poorly-behaved global model.
In this work, we integrate the local real data with the global gradient prototypes to form the local balanced datasets.
arXiv Detail & Related papers (2023-01-25T03:18:10Z)
- Multi-Level Branched Regularization for Federated Learning [46.771459325434535]
We propose a novel architectural regularization technique that constructs multiple auxiliary branches in each local model by grafting local and global subnetworks at several different levels.
We demonstrate remarkable performance gains in terms of accuracy and efficiency compared to existing methods.
arXiv Detail & Related papers (2022-07-14T13:59:26Z)
- Multi-Source Domain Adaptation Based on Federated Knowledge Alignment [0.0]
Federated Learning (FL) facilitates distributed model learning to protect users' privacy.
We propose Federated Knowledge Alignment (FedKA) that aligns features from different clients and those of the target task.
arXiv Detail & Related papers (2022-03-22T11:42:25Z)
- Novel Class Discovery in Semantic Segmentation [104.30729847367104]
We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS).
It aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes.
In NCDSS, we need to distinguish the objects from the background, and to handle the existence of multiple classes within an image.
We propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels.
arXiv Detail & Related papers (2021-12-03T13:31:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.