Relaxed Contrastive Learning for Federated Learning
- URL: http://arxiv.org/abs/2401.04928v2
- Date: Fri, 31 May 2024 08:23:42 GMT
- Title: Relaxed Contrastive Learning for Federated Learning
- Authors: Seonguk Seo, Jinkyu Kim, Geeho Kim, Bohyung Han
- Abstract summary: We propose a novel contrastive learning framework to address the challenges of data heterogeneity in federated learning.
Our framework outperforms existing federated learning approaches by large margins on standard benchmarks.
- Score: 48.96253206661268
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a novel contrastive learning framework to effectively address the challenges of data heterogeneity in federated learning. We first analyze the inconsistency of gradient updates across clients during local training and establish its dependence on the distribution of feature representations, leading to the derivation of the supervised contrastive learning (SCL) objective to mitigate local deviations. In addition, we show that a naïve adoption of SCL in federated learning leads to representation collapse, resulting in slow convergence and limited performance gains. To address this issue, we introduce a relaxed contrastive learning loss that imposes a divergence penalty on excessively similar sample pairs within each class. This strategy prevents collapsed representations and enhances feature transferability, facilitating collaborative training and leading to significant performance improvements. Extensive experiments show that our framework outperforms existing federated learning approaches by large margins on standard benchmarks.
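The abstract describes an objective that combines a supervised contrastive term with a divergence penalty on same-class pairs that are already overly similar. Below is a minimal PyTorch-style sketch of such a loss, written from the abstract alone; the function name and the hyperparameters `tau`, `margin`, and `penalty_weight`, as well as the exact form of the penalty, are assumptions for illustration and not the authors' implementation.

```python
# Minimal sketch (not the authors' released code) of a supervised contrastive
# loss with a relaxation term that penalizes same-class pairs whose features
# are already excessively similar. Hyperparameters and the penalty form are
# illustrative assumptions.
import torch
import torch.nn.functional as F


def relaxed_scl_loss(features, labels, tau=0.1, margin=0.9, penalty_weight=1.0):
    """features: (N, D) embeddings from local training; labels: (N,) class ids."""
    z = F.normalize(features, dim=1)                  # unit-norm embeddings
    cos = z @ z.t()                                   # pairwise cosine similarity
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye

    # Standard supervised contrastive term: pull same-class pairs together,
    # push different-class pairs apart.
    logits = (cos / tau).masked_fill(eye, float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    scl = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_count

    # Relaxation: divergence penalty on same-class pairs whose cosine
    # similarity exceeds `margin`, discouraging representation collapse.
    penalty = ((cos - margin).clamp(min=0) * pos_mask).sum() / n

    return scl.mean() + penalty_weight * penalty
```

In a federated setting, each client would apply such a loss to its local mini-batches during local updates, before the usual server-side aggregation step (e.g., FedAvg).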
Related papers
- The Curse of Diversity in Ensemble-Based Exploration [7.209197316045156]
Training a diverse ensemble of data-sharing agents can significantly impair the performance of the individual ensemble members.
We name this phenomenon the curse of diversity.
We demonstrate the potential of representation learning to counteract the curse of diversity.
arXiv Detail & Related papers (2024-05-07T14:14:50Z) - Controllable Relation Disentanglement for Few-Shot Class-Incremental Learning [82.79371269942146]
We propose to tackle Few-Shot Class-Incremental Learning (FSCIL) from a new perspective, i.e., relation disentanglement.
The challenge of disentangling spurious correlations lies in the poor controllability of FSCIL.
We propose a new simple-yet-effective method, called ConTrollable Relation-disentangled Few-Shot Class-Incremental Learning (CTRL-FSCIL).
arXiv Detail & Related papers (2024-03-17T03:16:59Z) - ArCL: Enhancing Contrastive Learning with Augmentation-Robust Representations [30.745749133759304]
We develop a theoretical framework to analyze the transferability of self-supervised contrastive learning.
We show that contrastive learning fails to learn domain-invariant features, which limits its transferability.
Based on these theoretical insights, we propose a novel method called Augmentation-robust Contrastive Learning (ArCL)
arXiv Detail & Related papers (2023-03-02T09:26:20Z) - Combating Exacerbated Heterogeneity for Robust Models in Federated Learning [91.88122934924435]
The combination of adversarial training and federated learning can lead to undesired robustness deterioration.
We propose a novel framework called Slack Federated Adversarial Training (SFAT)
We verify the rationality and effectiveness of SFAT on various benchmarked and real-world datasets.
arXiv Detail & Related papers (2023-03-01T06:16:15Z) - Unbiased and Efficient Self-Supervised Incremental Contrastive Learning [31.763904668737304]
We propose a self-supervised Incremental Contrastive Learning (ICL) framework consisting of a novel Incremental InfoNCE (NCE-II) loss function.
ICL achieves up to 16.7x training speedup and 16.8x faster convergence with competitive results.
arXiv Detail & Related papers (2023-01-28T06:11:31Z) - When Do Curricula Work in Federated Learning? [56.88941905240137]
We find that curriculum learning largely alleviates non-IIDness.
The more disparate the data distributions across clients, the more they benefit from curriculum learning.
We propose a novel client selection technique that benefits from the real-world disparity in the clients.
arXiv Detail & Related papers (2022-12-24T11:02:35Z) - Semi-Discriminative Representation Loss for Online Continual Learning [16.414031859647874]
Gradient-based approaches have been developed to make more efficient use of compact episodic memory.
We propose a simple method -- Semi-Discriminative Representation Loss (SDRL) -- for continual learning.
arXiv Detail & Related papers (2020-06-19T17:13:42Z) - Learning Diverse Representations for Fast Adaptation to Distribution Shift [78.83747601814669]
We present a method for learning multiple models, incorporating an objective that pressures each to learn a distinct way to solve the task.
We demonstrate our framework's ability to facilitate rapid adaptation to distribution shift.
arXiv Detail & Related papers (2020-06-12T12:23:50Z) - Federated Residual Learning [53.77128418049985]
We study a new form of federated learning where the clients train personalized local models and make predictions jointly with the server-side shared model.
Using this new federated learning framework, the complexity of the central shared model can be minimized while still gaining all the performance benefits that joint training provides.
arXiv Detail & Related papers (2020-03-28T19:55:24Z)