FedLoGe: Joint Local and Generic Federated Learning under Long-tailed
Data
- URL: http://arxiv.org/abs/2401.08977v2
- Date: Fri, 8 Mar 2024 13:37:55 GMT
- Title: FedLoGe: Joint Local and Generic Federated Learning under Long-tailed
Data
- Authors: Zikai Xiao, Zihan Chen, Liyinglan Liu, Yang Feng, Jian Wu, Wanlu Liu,
Joey Tianyi Zhou, Howard Hao Yang, Zuozhu Liu
- Abstract summary: Federated Long-Tailed Learning (Fed-LT) is a paradigm wherein data collected from decentralized local clients manifests a globally prevalent long-tailed distribution.
This paper introduces an approach termed Federated Local and Generic Model Training in Fed-LT (FedLoGe), which enhances both local and generic model performance.
- Score: 46.29190753993415
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Federated Long-Tailed Learning (Fed-LT), a paradigm wherein data collected
from decentralized local clients manifests a globally prevalent long-tailed
distribution, has recently garnered considerable attention. In the
context of Fed-LT, existing works have predominantly centered on addressing the
data imbalance issue to enhance the efficacy of the generic global model while
neglecting the performance at the local level. In contrast, conventional
Personalized Federated Learning (pFL) techniques are primarily devised to
optimize personalized local models under the presumption of a balanced global
data distribution. This paper introduces an approach termed Federated Local and
Generic Model Training in Fed-LT (FedLoGe), which enhances both local and
generic model performance through the integration of representation learning
and classifier alignment within a neural collapse framework. Our investigation
reveals the feasibility of employing a shared backbone as a foundational
framework for capturing overarching global trends, while concurrently employing
individualized classifiers to encapsulate distinct refinements stemming from
each client's local features. Building upon this discovery, we establish the
Static Sparse Equiangular Tight Frame Classifier (SSE-C), inspired by neural
collapse principles, which naturally prunes extraneous noisy features and fosters
the acquisition of potent data representations. Furthermore, leveraging
insights from the classifier norm patterns of neural collapse under imbalance, we develop
Global and Local Adaptive Feature Realignment (GLA-FR) via an auxiliary global
classifier and personalized Euclidean norm transfer to align global features
with client preferences. Extensive experimental results on CIFAR-10/100-LT,
ImageNet, and iNaturalist demonstrate the advantage of our method over
state-of-the-art pFL and Fed-LT approaches.
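
To make the two abstract-level ideas concrete, the sketch below illustrates (a) a fixed, randomly sparsified simplex equiangular tight frame (ETF) used as a frozen classifier head, in the spirit of SSE-C, and (b) a per-class Euclidean norm transfer from a personalized head onto a global auxiliary head, in the spirit of GLA-FR. This is a minimal illustration only, not the authors' implementation: the function names, the random sparsification scheme, and the way the two heads are combined are all assumptions.

```python
import torch


def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Build a (num_classes x feat_dim) simplex ETF: unit-norm class vectors
    with maximal, equal pairwise angles (inner product -1/(K-1))."""
    assert feat_dim >= num_classes, "ETF construction needs feat_dim >= num_classes"
    # Random partial orthogonal matrix U (feat_dim x num_classes) via QR.
    u, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    k = num_classes
    center = torch.eye(k) - torch.ones(k, k) / k          # I - (1/K) 11^T
    etf = (k / (k - 1)) ** 0.5 * (u @ center)              # feat_dim x num_classes
    return etf.T                                           # num_classes x feat_dim


def sparse_fixed_classifier(num_classes: int, feat_dim: int,
                            sparsity: float = 0.5) -> torch.nn.Linear:
    """Frozen linear head whose weights are a sparsified ETF.

    The random Bernoulli mask used here is a placeholder for whatever
    sparsification rule SSE-C actually prescribes (an assumption).
    """
    weights = simplex_etf(num_classes, feat_dim)
    mask = (torch.rand_like(weights) > sparsity).float()   # zero out ~sparsity of entries
    head = torch.nn.Linear(feat_dim, num_classes, bias=False)
    head.weight.data.copy_(weights * mask)
    head.weight.requires_grad_(False)                      # static: never updated in training
    return head


def norm_transfer(global_head: torch.Tensor, local_head: torch.Tensor) -> torch.Tensor:
    """Illustrative per-class Euclidean norm transfer (names are assumptions).

    Keeps the directions of the global auxiliary classifier's class vectors but
    rescales each to the norm seen in the client's personalized head, so the
    realigned head reflects that client's class preferences.
    """
    g_norm = global_head.norm(dim=1, keepdim=True).clamp_min(1e-12)
    l_norm = local_head.norm(dim=1, keepdim=True)
    return global_head / g_norm * l_norm
```

With the classifier frozen as above, only the shared backbone is trained to fit the fixed ETF targets, which is the usual rationale for neural-collapse-inspired fixed heads; the norm-transfer step would then be applied per client to adapt the aggregated head, under the assumptions stated above.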