Federated Learning with Only Positive Labels
- URL: http://arxiv.org/abs/2004.10342v1
- Date: Tue, 21 Apr 2020 23:35:02 GMT
- Title: Federated Learning with Only Positive Labels
- Authors: Felix X. Yu, Ankit Singh Rawat, Aditya Krishna Menon, Sanjiv Kumar
- Abstract summary: We propose a generic framework for training with only positive labels, namely Federated Averaging with Spreadout (FedAwS).
We show, both theoretically and empirically, that FedAwS can almost match the performance of conventional learning where users have access to negative labels.
- Score: 71.63836379169315
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider learning a multi-class classification model in the federated
setting, where each user has access to the positive data associated with only a
single class. As a result, during each federated learning round, the users need
to locally update the classifier without having access to the features and the
model parameters for the negative classes. Thus, naively employing conventional
decentralized learning such as distributed SGD or Federated Averaging may
lead to trivial or extremely poor classifiers. In particular, for
embedding-based classifiers, all the class embeddings might collapse to a single point.
To address this problem, we propose a generic framework for training with
only positive labels, namely Federated Averaging with Spreadout (FedAwS), where
the server imposes a geometric regularizer after each round to encourage
classes to be spread out in the embedding space. We show, both theoretically and
empirically, that FedAwS can almost match the performance of conventional
learning where users have access to negative labels. We further extend the
proposed method to the settings with large output spaces.
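To make the server-side step concrete, here is a minimal sketch of a spreadout update, assuming an embedding-based classifier whose class-embedding matrix the server holds after averaging; the hinge-squared form of the regularizer, the margin nu, and the step size lr are illustrative choices, not necessarily the paper's exact objective.

```python
import torch

def spreadout_regularizer(W: torch.Tensor, nu: float) -> torch.Tensor:
    """Penalize pairs of class embeddings closer than the margin nu
    (hinge-squared sketch of the geometric 'spreadout' regularizer)."""
    dists = torch.cdist(W, W)                            # (C, C) pairwise distances
    off_diag = ~torch.eye(W.size(0), dtype=torch.bool)   # exclude self-pairs
    violations = torch.clamp(nu - dists[off_diag], min=0.0)
    return (violations ** 2).sum()

def server_spreadout_step(W: torch.Tensor, nu: float = 0.9,
                          lr: float = 0.1) -> torch.Tensor:
    """One gradient step on the regularizer, applied by the server
    after each federated averaging round."""
    W = W.clone().requires_grad_(True)
    spreadout_regularizer(W, nu).backward()
    with torch.no_grad():
        W -= lr * W.grad
    return W.detach()

# Toy usage: 5 class embeddings that have nearly collapsed to one point.
W = 0.01 * torch.randn(5, 8)
W = server_spreadout_step(W)  # embeddings get pushed apart
```

Intuitively, clients holding only positive labels can only pull their own class embedding toward their data; the server-side step supplies the repulsive force that negative labels would otherwise provide.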
Related papers
- Embedding Space Allocation with Angle-Norm Joint Classifiers for Few-Shot Class-Incremental Learning [8.321592316231786]
Few-shot class-incremental learning aims to continually learn new classes from only a few samples.
Current classes occupy the entire feature space, which is detrimental to learning new classes.
The small number of samples available in incremental rounds is insufficient for full training.
arXiv Detail & Related papers (2024-11-14T07:31:12Z)
- Federated Learning with Only Positive Labels by Exploring Label Correlations [78.59613150221597]
Federated learning aims to collaboratively learn a model by using the data from multiple users under privacy constraints.
In this paper, we study the multi-label classification problem under the federated learning setting.
We propose a novel and generic method termed Federated Averaging by exploring Label Correlations (FedALC).
arXiv Detail & Related papers (2024-04-24T02:22:50Z)
- Exploring Vacant Classes in Label-Skewed Federated Learning [113.65301899666645]
Label skews, characterized by disparities in local label distribution across clients, pose a significant challenge in federated learning.
This paper introduces FedVLS, a novel approach to label-skewed federated learning that integrates vacant-class distillation and logit suppression simultaneously.
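As one plausible reading of the logit-suppression component (a hedged sketch under our own assumptions; the penalty form, the weight alpha, and all names here are ours, not the paper's objective), a client could penalize confident logits for classes absent from its local data:

```python
import torch
import torch.nn.functional as F

def loss_with_logit_suppression(logits: torch.Tensor, targets: torch.Tensor,
                                vacant: torch.Tensor,
                                alpha: float = 0.1) -> torch.Tensor:
    """Cross-entropy plus a penalty on positive logits of classes that
    have no examples on this client (a sketch, not FedVLS itself)."""
    ce = F.cross_entropy(logits, targets)      # targets: (B,) class indices
    # Discourage confident predictions for locally vacant classes.
    suppression = logits[:, vacant].clamp(min=0.0).mean()
    return ce + alpha * suppression
```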
arXiv Detail & Related papers (2024-01-04T16:06:31Z)
- Evolving Multi-Label Fuzzy Classifier [5.53329677986653]
Multi-label classification, which assigns a single sample to more than one class at the same time, has attracted much attention in the machine learning community.
We propose an evolving multi-label fuzzy classifier (EFC-ML) which is able to self-adapt and self-evolve its structure with new incoming multi-label samples in an incremental, single-pass manner.
arXiv Detail & Related papers (2022-03-29T08:01:03Z)
- Towards Unbiased Multi-label Zero-Shot Learning with Pyramid and Semantic Attention [14.855116554722489]
Multi-label zero-shot learning aims at recognizing multiple unseen labels of classes for each input sample.
We propose a novel framework of unbiased multi-label zero-shot learning, by considering various class-specific regions.
arXiv Detail & Related papers (2022-03-07T15:52:46Z)
- Prototypical Classifier for Robust Class-Imbalanced Learning [64.96088324684683]
We propose Prototypical, which does not require fitting additional parameters given the embedding network.
Prototypical produces balanced and comparable predictions for all classes even though the training set is class-imbalanced.
We test our method on the CIFAR-10LT, CIFAR-100LT and WebVision datasets, observing that Prototypical obtains substantial improvements compared with the state of the art.
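The core idea, a nearest-class-mean rule over a fixed embedding network, can be sketched as follows; the Euclidean distance rule and all names here are our assumptions.

```python
import torch

def class_prototypes(embeddings: torch.Tensor, labels: torch.Tensor,
                     num_classes: int) -> torch.Tensor:
    """Prototype = mean embedding of each class; nothing is fit beyond
    the embedding network (assumes every class appears in `labels`)."""
    return torch.stack([embeddings[labels == c].mean(dim=0)
                        for c in range(num_classes)])

def predict(embeddings: torch.Tensor, protos: torch.Tensor) -> torch.Tensor:
    """Assign each sample to its nearest prototype."""
    return torch.cdist(embeddings, protos).argmin(dim=1)
```

Because every class contributes exactly one prototype regardless of its sample count, the decision rule is not skewed toward head classes.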
arXiv Detail & Related papers (2021-10-22T01:55:01Z)
- PLM: Partial Label Masking for Imbalanced Multi-label Classification [59.68444804243782]
Neural networks trained on real-world datasets with long-tailed label distributions are biased towards frequent classes and perform poorly on infrequent classes.
We propose a method, Partial Label Masking (PLM), which utilizes each class's ratio of positive to negative labels during training.
Our method achieves strong performance when compared to existing methods on both multi-label (MultiMNIST and MSCOCO) and single-label (imbalanced CIFAR-10 and CIFAR-100) image classification datasets.
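A hedged sketch of the masking step in a multi-label setup: positive labels of over-represented classes are stochastically dropped from the loss. The per-class keep probabilities (here `keep_prob`, derived from the positive-to-negative ratios) and the Bernoulli rule are our simplification, not the paper's exact procedure.

```python
import torch
import torch.nn.functional as F

def plm_loss(logits: torch.Tensor, targets: torch.Tensor,
             keep_prob: torch.Tensor) -> torch.Tensor:
    """Binary cross-entropy in which each positive (sample, class) entry
    is kept with its class's probability; masked entries contribute no
    loss. A simplified sketch of partial label masking."""
    mask = torch.ones_like(logits)
    keep = torch.bernoulli(keep_prob.expand_as(logits).contiguous())
    pos = targets.bool()
    mask[pos] = keep[pos]
    per_entry = F.binary_cross_entropy_with_logits(
        logits, targets.float(), reduction="none")
    return (per_entry * mask).sum() / mask.sum().clamp(min=1.0)
```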
arXiv Detail & Related papers (2021-05-22T18:07:56Z)
- CLASTER: Clustering with Reinforcement Learning for Zero-Shot Action Recognition [52.66360172784038]
We propose a clustering-based model, which considers all training samples at once, instead of optimizing for each instance individually.
We call the proposed method CLASTER and observe that it consistently improves over the state of the art on all standard datasets.
arXiv Detail & Related papers (2021-01-18T12:46:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.