Addressing Class Variable Imbalance in Federated Semi-supervised
Learning
- URL: http://arxiv.org/abs/2303.11809v1
- Date: Tue, 21 Mar 2023 12:50:17 GMT
- Title: Addressing Class Variable Imbalance in Federated Semi-supervised
Learning
- Authors: Zehui Dong, Wenjing Liu, Siyuan Liu, Xingzhi Chen
- Abstract summary: We propose Federated Semi-supervised Learning for Class Variable Imbalance (FCVI) to address class variable imbalance.
FCVI mitigates the data imbalance caused by changes in the number of classes.
Our scheme is shown to significantly outperform baseline methods while maintaining client privacy.
- Score: 10.542178602467885
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Semi-supervised Learning (FSSL) combines techniques from
federated and semi-supervised learning to improve the accuracy and performance
of models in a distributed environment by using a small fraction of labeled
data and a large amount of unlabeled data. Rather than centralizing all data in
one place for training, it collects model updates after devices train locally,
and can thus protect the privacy of user data. However, during federated
training, some devices fail to collect enough data for local training, while
new devices join the training group. This leads to an imbalanced global data
distribution and thus degrades the performance of the global model. Most
current research focuses on class imbalance with a fixed number of classes,
while little attention is paid to data imbalance with a variable number of
classes. Therefore, in this paper, we propose Federated Semi-supervised
Learning for Class Variable Imbalance (FCVI), whose class-variable learning
algorithm mitigates the data imbalance caused by changes in the number of
classes. Our scheme is shown to significantly outperform baseline methods while
maintaining client privacy.
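
FCVI's class-variable balancing itself is not detailed in the abstract, but the surrounding FSSL protocol it builds on is standard. Below is a minimal sketch, assuming FixMatch-style confidence-thresholded pseudo-labeling on clients and FedAvg aggregation on the server; the optimizer settings, loaders, and threshold are illustrative, not the paper's.

```python
# Minimal FSSL round: clients train on a little labeled data plus
# confidence-filtered pseudo-labels, the server averages their weights.
import copy
import torch
import torch.nn.functional as F

def local_update(model, labeled, unlabeled, lr=0.01, threshold=0.95):
    """One client's round: supervised loss + pseudo-label loss on unlabeled data."""
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for (x_l, y_l), x_u in zip(labeled, unlabeled):
        logits_u = model(x_u)
        conf, pseudo = F.softmax(logits_u.detach(), dim=1).max(dim=1)
        mask = conf >= threshold                 # keep only confident pseudo-labels
        loss = F.cross_entropy(model(x_l), y_l)
        if mask.any():
            loss = loss + F.cross_entropy(logits_u[mask], pseudo[mask])
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model.state_dict()

def fedavg(global_model, client_states, weights):
    """Weighted average of client weights (weights sum to 1); raw data never moves."""
    avg = {k: sum(w * s[k] for w, s in zip(weights, client_states))
           for k in global_model.state_dict()}
    global_model.load_state_dict(avg)
    return global_model
```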
Related papers
- On Homomorphic Encryption Based Strategies for Class Imbalance in Federated Learning [4.322339935902437]
Class imbalance in training datasets can lead to bias and poor generalization in machine learning models.
We propose FLICKER, a privacy-preserving framework to address issues related to global class imbalance in federated learning.
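FLICKER's concrete protocol is not given in the summary above; as a hedged illustration of how additively homomorphic encryption can expose only the *global* class distribution, the sketch below uses the python-paillier (`phe`) library with toy per-class counts.

```python
# Clients encrypt their per-class sample counts; the server sums ciphertexts,
# so only the aggregate distribution is ever decrypted by the key holder.
from phe import paillier  # pip install phe

pub, priv = paillier.generate_paillier_keypair(n_length=2048)

# Toy per-class sample counts for three clients and three classes.
client_counts = [[120, 3, 0], [5, 80, 40], [0, 10, 200]]

enc_sum = None
for counts in client_counts:
    enc = [pub.encrypt(c) for c in counts]          # each client encrypts locally
    enc_sum = enc if enc_sum is None else [a + b for a, b in zip(enc_sum, enc)]

print([priv.decrypt(c) for c in enc_sum])           # [125, 93, 240]
```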
arXiv Detail & Related papers (2024-10-28T16:35:40Z)
- Exploring Vacant Classes in Label-Skewed Federated Learning [113.65301899666645]
Label skews, characterized by disparities in local label distribution across clients, pose a significant challenge in federated learning.
This paper introduces FedVLS, a novel approach to label-skewed federated learning that integrates vacant-class distillation and logit suppression simultaneously.
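The summary names logit suppression but not its exact loss; the sketch below is one plausible form, assuming a penalty on the probability mass a client assigns to its locally vacant classes (the vacant-class distillation branch is omitted).

```python
# Cross-entropy plus a penalty on softmax mass given to locally absent classes.
import torch
import torch.nn.functional as F

def suppressed_loss(logits, targets, present_classes, alpha=0.5):
    vacant = torch.ones(logits.size(1), dtype=torch.bool)
    vacant[present_classes] = False              # classes this client never sees
    ce = F.cross_entropy(logits, targets)
    suppress = F.softmax(logits, dim=1)[:, vacant].sum(dim=1).mean()
    return ce + alpha * suppress
```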
arXiv Detail & Related papers (2024-01-04T16:06:31Z)
- A Survey on Class Imbalance in Federated Learning [6.632451878730774]
Federated learning allows multiple client devices in a network to jointly train a machine learning model without direct exposure of clients' data.
It has been found that models trained with federated learning usually have worse performance than their counterparts trained in the standard centralized learning mode.
arXiv Detail & Related papers (2023-03-21T08:34:23Z)
- Integrating Local Real Data with Global Gradient Prototypes for Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
The data samples usually follow a long-tailed distribution in the real world, and FL on the decentralized and long-tailed data yields a poorly-behaved global model.
In this work, we integrate local real data with global gradient prototypes to form locally balanced datasets.
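How the local data and prototypes are combined is not spelled out above; the following is a loose sketch under the assumption that the server ships per-class average classifier-head gradients ("prototypes") that stand in for classes a client lacks. Names and shapes are hypothetical.

```python
# Classifier-head re-balancing: real local gradients where data exists,
# global gradient prototypes for the classes this client is missing.
import torch

def rebalance_head(head_weight, local_grads, global_protos, present, lr=0.1):
    """head_weight: [C, d]; local_grads/global_protos: dict class -> [d] gradient;
    present: set of class ids the client actually holds."""
    for c in range(head_weight.size(0)):
        g = local_grads[c] if c in present else global_protos[c]
        head_weight.data[c] -= lr * g        # missing classes still get an update
    return head_weight
```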
arXiv Detail & Related papers (2023-01-25T03:18:10Z)
- Federated Zero-Shot Learning for Visual Recognition [55.65879596326147]
We propose a novel Federated Zero-Shot Learning (FedZSL) framework.
FedZSL learns a central model from the decentralized data residing on edge devices.
The effectiveness and robustness of FedZSL are demonstrated by extensive experiments conducted on three zero-shot benchmark datasets.
arXiv Detail & Related papers (2022-09-05T14:49:34Z)
- Efficient Augmentation for Imbalanced Deep Learning [8.38844520504124]
We study a convolutional neural network's internal representation of imbalanced image data.
We measure the generalization gap between a model's feature embeddings in the training and test sets, showing that the gap is wider for minority classes.
This insight enables us to design an efficient three-phase CNN training framework for imbalanced data.
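The gap measurement lends itself to a short sketch: assuming the gap is taken as the distance between per-class train and test embedding centroids (the paper's precise metric may differ), it can be computed as follows.

```python
# Per-class generalization gap in feature space: distance between a class's
# train-set and test-set embedding centroids.
import torch

def per_class_centroid_gap(feat_train, y_train, feat_test, y_test, num_classes):
    gaps = []
    for c in range(num_classes):
        mu_tr = feat_train[y_train == c].mean(dim=0)
        mu_te = feat_test[y_test == c].mean(dim=0)
        gaps.append(torch.norm(mu_tr - mu_te).item())
    return gaps  # per the paper, expect larger values for minority classes
```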
arXiv Detail & Related papers (2022-07-13T09:43:17Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
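FedReg's mechanism is not described in the summary; as a plainly swapped-in stand-in, the sketch below shows a FedProx-style proximal term, a common way to keep local training from drifting away from, and thereby forgetting, the global model.

```python
# FedProx-style proximal regularizer on the local task loss (a substitute
# illustration, not FedReg's actual algorithm).
import torch

def proximal_loss(task_loss, model, global_params, mu=0.01):
    prox = sum(((p - g.detach()) ** 2).sum()
               for p, g in zip(model.parameters(), global_params))
    return task_loss + 0.5 * mu * prox
```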
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- Federated learning with class imbalance reduction [24.044750119251308]
Federated learning (FL) is a technique that enables a large number of edge computing devices to collaboratively train a global learning model.
Due to privacy concerns, the raw data on devices cannot be made available to a centralized server.
In this paper, an estimation scheme is designed to reveal the class distribution without access to the raw data.
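One way such an estimation can work, offered purely as an illustrative assumption rather than this paper's scheme: under softmax cross-entropy, the output-bias gradient for class c summed over a batch equals the summed predicted probability for c minus the true count of c, so near-uniform early-training predictions let the server back out approximate counts from a bias update it already receives.

```python
# Estimate per-class counts from the reported output-bias gradient.
import torch

def estimate_counts(bias_grad_sum, n_samples, n_classes):
    """bias_grad_sum: [C] summed output-bias gradient from one client.

    Summed dL/db_c = sum_i(p_ic) - count_c; early in training sum_i(p_ic) is
    roughly n/C, hence count_c is approximately n/C - grad_c.
    """
    return (n_samples / n_classes - bias_grad_sum).clamp(min=0)
```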
arXiv Detail & Related papers (2020-11-23T08:13:43Z)
- WAFFLe: Weight Anonymized Factorization for Federated Learning [88.44939168851721]
In domains where data are sensitive or private, there is great value in methods that can learn in a distributed manner without the data ever leaving the local devices.
We propose Weight Anonymized Factorization for Federated Learning (WAFFLe), an approach that combines the Indian Buffet Process with a shared dictionary of weight factors for neural networks.
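A minimal sketch of the weight-factorization side of WAFFLe, assuming each layer's weight is a masked sum over a shared dictionary of factors; the Indian Buffet Process that would govern the (here fixed) binary mask is omitted.

```python
# A linear layer whose weight is assembled from a shared factor dictionary
# and a client-specific binary selection mask.
import torch

class FactorizedLinear(torch.nn.Module):
    def __init__(self, dictionary, mask):
        super().__init__()
        self.dictionary = dictionary      # shared across clients: [K, out, in]
        self.mask = mask                  # client-specific binary vector: [K]

    def forward(self, x):
        w = (self.mask.view(-1, 1, 1) * self.dictionary).sum(dim=0)
        return x @ w.t()

K, out_f, in_f = 8, 4, 6
shared_dict = torch.randn(K, out_f, in_f)
client_mask = (torch.rand(K) < 0.5).float()   # which factors this client uses
layer = FactorizedLinear(shared_dict, client_mask)
y = layer(torch.randn(2, in_f))               # -> [2, 4]
```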
arXiv Detail & Related papers (2020-08-13T04:26:31Z)
- Think Locally, Act Globally: Federated Learning with Local and Global Representations [92.68484710504666]
Federated learning is a method of training models on private data distributed over multiple devices.
We propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices.
We also evaluate on the task of personalized mood prediction from real-world mobile data where privacy is key.
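A minimal sketch of the local/global split, assuming each device keeps a private encoder for its compact local representation while only a shared head is aggregated; layer sizes are illustrative.

```python
# Per-device private encoder + globally shared head: only global_head's
# parameters would be sent to the server for aggregation.
import torch.nn as nn

class LocalGlobalModel(nn.Module):
    def __init__(self, in_dim=32, rep_dim=16, n_classes=10):
        super().__init__()
        self.local_encoder = nn.Sequential(nn.Linear(in_dim, rep_dim), nn.ReLU())
        self.global_head = nn.Linear(rep_dim, n_classes)  # the only shared part

    def forward(self, x):
        return self.global_head(self.local_encoder(x))
```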
arXiv Detail & Related papers (2020-01-06T12:40:21Z)