FedACK: Federated Adversarial Contrastive Knowledge Distillation for
Cross-Lingual and Cross-Model Social Bot Detection
- URL: http://arxiv.org/abs/2303.07113v1
- Date: Fri, 10 Mar 2023 03:10:08 GMT
- Title: FedACK: Federated Adversarial Contrastive Knowledge Distillation for
Cross-Lingual and Cross-Model Social Bot Detection
- Authors: Yingguang Yang, Renyu Yang, Hao Peng, Yangyang Li, Tong Li, Yong Liao,
Pengyuan Zhou
- Abstract summary: FedACK is a new adversarial contrastive knowledge distillation framework for social bot detection.
A global generator is used to extract the knowledge of global data distribution and distill it into each client's local model.
Experiments demonstrate that FedACK outperforms the state-of-the-art approaches in terms of accuracy, communication efficiency, and feature space consistency.
- Score: 22.979415040695557
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Social bot detection is of paramount importance to the resilience and
security of online social platforms. The state-of-the-art detection models are
siloed and have largely overlooked a variety of data characteristics from
multiple cross-lingual platforms. Meanwhile, the heterogeneity of data
distribution and model architecture makes it intricate to devise an efficient
cross-platform and cross-model detection framework. In this paper, we propose
FedACK, a new federated adversarial contrastive knowledge distillation
framework for social bot detection. We devise a GAN-based federated knowledge
distillation mechanism for efficiently transferring knowledge of data
distribution among clients. In particular, a global generator is used to
extract the knowledge of global data distribution and distill it into each
client's local model. We leverage a local discriminator to enable customized
model design and use a local generator for data augmentation with
hard-to-decide samples. Local training is conducted as multi-stage adversarial
and contrastive learning to enable consistent feature spaces among clients and
to constrain the optimization direction of local models, reducing the
divergence between local
and global models. Experiments demonstrate that FedACK outperforms the
state-of-the-art approaches in terms of accuracy, communication efficiency, and
feature space consistency.
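The mechanism sketched in the abstract can be pictured with a small code example: a label-conditioned global generator produces pseudo-features that carry the global data distribution, and each client distills from them while a crude alignment term keeps its feature space consistent. This is a minimal sketch, not the authors' implementation; all dimensions, model shapes, loss weights, and the alignment term itself are assumptions for illustration.

```python
# Minimal, illustrative sketch of the GAN-based federated distillation idea
# described in the abstract (not the authors' code). Dimensions, model shapes,
# loss weights, and the alignment term are assumptions for demonstration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM, NUM_CLASSES, NOISE_DIM = 64, 2, 32   # assumed sizes

class GlobalGenerator(nn.Module):
    """Maps (noise, label) to a pseudo feature that carries knowledge of the
    global data distribution."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(NUM_CLASSES, NOISE_DIM)
        self.net = nn.Sequential(nn.Linear(NOISE_DIM * 2, 128), nn.ReLU(),
                                 nn.Linear(128, FEAT_DIM))
    def forward(self, z, y):
        return self.net(torch.cat([z, self.embed(y)], dim=1))

class LocalModel(nn.Module):
    """A client model: any feature extractor plus a classifier head."""
    def __init__(self, in_dim=100):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                     nn.Linear(128, FEAT_DIM))
        self.head = nn.Linear(FEAT_DIM, NUM_CLASSES)
    def forward(self, x):
        feat = self.encoder(x)
        return feat, self.head(feat)

def local_step(model, generator, x, y, opt, tau=0.5):
    """One client update: supervised loss on real data, distillation from the
    global generator, and a crude term pulling real features toward same-class
    generated features (a stand-in for the paper's contrastive alignment)."""
    opt.zero_grad()
    feat, logits = model(x)
    z = torch.randn(x.size(0), NOISE_DIM)
    gen_feat = generator(z, y).detach()
    ce = F.cross_entropy(logits, y)                       # local supervision
    distill = F.cross_entropy(model.head(gen_feat), y)    # global -> local knowledge
    align = -(F.cosine_similarity(feat, gen_feat) / tau).mean()
    loss = ce + distill + 0.1 * align
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage for a single client
model, gen = LocalModel(), GlobalGenerator()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(8, 100), torch.randint(0, NUM_CLASSES, (8,))
print(local_step(model, gen, x, y, opt))
```

The roles the abstract assigns to the local discriminator (customized model design) and the local generator (augmentation with hard-to-decide samples) are left out of this toy sketch.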
Related papers
- Proximity-based Self-Federated Learning [1.0066310107046081]
This paper introduces a novel, fully-distributed federated learning strategy called proximity-based self-federated learning.
Unlike traditional algorithms, our approach encourages clients to share and adjust their models with neighbouring nodes based on geographic proximity and model accuracy.
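As a rough illustration of that idea (not the paper's algorithm), a client could mix its parameters with those of nearby peers, weighting each neighbour by geographic distance and reported accuracy. The weighting function and all names below are assumptions.

```python
# Minimal sketch of proximity- and accuracy-weighted model mixing between
# neighbouring clients; not the paper's algorithm, all weights are assumptions.
import numpy as np

def proximity_weight(pos_a, pos_b, acc_b, sigma=1.0):
    """Assumed weighting: closer and more accurate neighbours count more."""
    dist = np.linalg.norm(np.asarray(pos_a) - np.asarray(pos_b))
    return np.exp(-dist / sigma) * acc_b

def self_federate(own_params, own_pos, neighbours):
    """neighbours: list of (params, position, accuracy) tuples."""
    weights = [proximity_weight(own_pos, pos, acc) for _, pos, acc in neighbours]
    total = 1.0 + sum(weights)                 # the client's own model gets weight 1.0
    mixed = own_params.copy()
    for w, (params, _, _) in zip(weights, neighbours):
        mixed += w * params
    return mixed / total

# Toy usage: a client with two neighbours and 4-dimensional "models"
own = np.ones(4)
nbrs = [(np.zeros(4), (0.1, 0.2), 0.9), (np.full(4, 2.0), (3.0, 3.0), 0.5)]
print(self_federate(own, (0.0, 0.0), nbrs))
```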
arXiv Detail & Related papers (2024-07-17T08:44:45Z) - An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
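For contrast, the conventional aggregate-then-adapt round mentioned above looks roughly like the sketch below (essentially FedAvg); FedAF's aggregation-free update is not detailed in the summary and is not reproduced here.

```python
# Sketch of the "aggregate-then-adapt" round the summary contrasts against:
# the server averages client models, then each client adapts that global model
# locally. Shown only as the baseline pattern, not FedAF itself.
import numpy as np

def aggregate(client_params, client_sizes):
    """Server step: dataset-size-weighted average of client models."""
    total = sum(client_sizes)
    return sum(p * (n / total) for p, n in zip(client_params, client_sizes))

def adapt(global_params, local_grad, lr=0.1, steps=5):
    """Client step: start from the aggregated model, run local SGD."""
    params = global_params.copy()
    for _ in range(steps):
        params -= lr * local_grad(params)
    return params

# Toy rounds with two clients pulling toward different optima (non-IID flavour)
targets = [np.zeros(3), np.ones(3)]
grads = [lambda p, t=t: p - t for t in targets]    # gradient of 0.5 * ||p - t||^2
clients = [np.random.randn(3) for _ in targets]
for _ in range(3):                                 # aggregate-then-adapt rounds
    global_model = aggregate(clients, [100, 100])
    clients = [adapt(global_model, g) for g in grads]
print(global_model)
```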
arXiv Detail & Related papers (2024-04-29T05:55:23Z) - Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling the data issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z) - pFedES: Model Heterogeneous Personalized Federated Learning with Feature
Extractor Sharing [19.403843478569303]
We propose a model-heterogeneous personalized federated learning approach based on feature extractor sharing.
It incorporates a small homogeneous feature extractor into each client's heterogeneous local model.
It achieves 1.61% higher test accuracy, while reducing communication and computation costs by 99.6% and 82.9%, respectively.
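The architectural idea reads roughly as follows: a small, shared (homogeneous) feature extractor rides alongside each client's own heterogeneous backbone, and only the small extractor would be exchanged. How pFedES actually trains and aggregates the two parts is not given in the summary, so the fusion below is an assumption.

```python
# Minimal sketch of "feature extractor sharing": every client combines a small
# homogeneous extractor (cheap to communicate) with its own heterogeneous
# backbone. The feature fusion and head are assumptions, not pFedES's design.
import torch
import torch.nn as nn

class SharedExtractor(nn.Module):          # homogeneous across clients
    def __init__(self, in_dim=32, out_dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
    def forward(self, x):
        return self.net(x)

class ClientModel(nn.Module):              # heterogeneous part stays local
    def __init__(self, local_backbone, local_out_dim, in_dim=32, num_classes=2):
        super().__init__()
        self.shared = SharedExtractor(in_dim)
        self.local = local_backbone
        self.head = nn.Linear(16 + local_out_dim, num_classes)
    def forward(self, x):
        feats = torch.cat([self.shared(x), self.local(x)], dim=1)
        return self.head(feats)

# Two clients with different local architectures but identical shared extractors
big = ClientModel(nn.Sequential(nn.Linear(32, 64), nn.ReLU()), 64)
small = ClientModel(nn.Sequential(nn.Linear(32, 8), nn.ReLU()), 8)
# Only the shared extractor's parameters would be communicated/aggregated:
shared_payload = big.shared.state_dict()
print(big(torch.randn(4, 32)).shape, len(shared_payload))
```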
arXiv Detail & Related papers (2023-11-12T15:43:39Z) - Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose a new algorithm, FedCSD, which applies class prototype similarity distillation in a federated framework to align the local and global models.
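A hedged stand-in for that alignment idea is plain logit distillation between the local and global models; FedCSD's actual class-prototype similarity weighting is not spelled out in the summary and is not reproduced here.

```python
# Generic logit-distillation stand-in for the alignment described above:
# penalise the local model when its softened logits drift away from the global
# model's on the same inputs. Not FedCSD's exact class-prototype loss.
import torch
import torch.nn.functional as F

def logit_alignment_loss(local_logits, global_logits, tau=2.0):
    """KL divergence between temperature-softened global and local predictions."""
    p_global = F.softmax(global_logits / tau, dim=1)
    log_p_local = F.log_softmax(local_logits / tau, dim=1)
    return F.kl_div(log_p_local, p_global, reduction="batchmean") * tau * tau

local_logits = torch.randn(4, 3)
global_logits = torch.randn(4, 3)
print(logit_alignment_loss(local_logits, global_logits).item())
```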
arXiv Detail & Related papers (2023-08-20T04:41:01Z) - FedSoup: Improving Generalization and Personalization in Federated
Learning via Selective Model Interpolation [32.36334319329364]
Cross-silo federated learning (FL) enables the development of machine learning models on datasets distributed across data centers.
Recent research has found that current FL algorithms face a trade-off between local and global performance when confronted with distribution shifts.
We propose a novel federated model soup method to optimize the trade-off between local and global performance.
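A minimal "model soup" style step consistent with the summary: greedily average a candidate checkpoint into the soup only if a held-out (local) metric does not drop. The selection rule and checkpoint pool below are assumptions, not FedSoup's exact procedure.

```python
# Greedy weight-interpolation ("soup") sketch: a candidate checkpoint is kept
# only if averaging it in does not hurt the evaluation metric. The metric and
# the candidates here are toy assumptions.
import torch

def try_add_to_soup(soup_state, count, candidate_state, eval_fn):
    """Tentatively average `candidate_state` into the soup; keep it only if
    eval_fn (e.g. local validation accuracy) does not decrease."""
    trial = {k: (soup_state[k] * count + candidate_state[k]) / (count + 1)
             for k in soup_state}
    if eval_fn(trial) >= eval_fn(soup_state):
        return trial, count + 1
    return soup_state, count

# Toy usage with 1-parameter "models" and an accuracy proxy peaking at w = 0.5
eval_fn = lambda s: -abs(s["w"].item() - 0.5)
soup, n = {"w": torch.tensor(0.0)}, 1
for cand in [{"w": torch.tensor(1.0)}, {"w": torch.tensor(-2.0)}]:
    soup, n = try_add_to_soup(soup, n, cand, eval_fn)
print(soup["w"].item(), n)   # the harmful candidate (-2.0) is rejected
```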
arXiv Detail & Related papers (2023-07-20T00:07:29Z) - CDKT-FL: Cross-Device Knowledge Transfer using Proxy Dataset in Federated Learning [27.84845136697669]
We develop a novel knowledge distillation-based approach to study the extent of knowledge transfer between the global model and local models.
We show the proposed method achieves significant speedups and high personalized performance of local models.
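The proxy-dataset idea can be sketched as models exchanging soft predictions on shared proxy inputs rather than raw data or full weights. The transfer direction, temperature, and the proxy data below are assumptions for illustration.

```python
# Sketch of knowledge transfer over a shared proxy dataset: a student model is
# fit so its softened predictions on proxy inputs match a teacher's. Shown for
# the global -> local direction; not CDKT-FL's exact objective.
import torch
import torch.nn as nn
import torch.nn.functional as F

proxy_x = torch.randn(16, 10)                      # assumed shared proxy inputs

def distill_on_proxy(student, teacher_soft, lr=0.05, tau=2.0, steps=10):
    """Fit `student` so its softened proxy predictions match `teacher_soft`."""
    opt = torch.optim.SGD(student.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.kl_div(F.log_softmax(student(proxy_x) / tau, dim=1),
                        teacher_soft, reduction="batchmean")
        loss.backward()
        opt.step()
    return loss.item()

global_model = nn.Linear(10, 3)
local_model = nn.Linear(10, 3)
with torch.no_grad():
    global_soft = F.softmax(global_model(proxy_x) / 2.0, dim=1)
print(distill_on_proxy(local_model, global_soft))   # global -> local transfer
```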
arXiv Detail & Related papers (2022-04-04T14:49:19Z) - FedDC: Federated Learning with Non-IID Data via Local Drift Decoupling
and Correction [48.85303253333453]
Federated learning (FL) allows multiple clients to collectively train a high-performance global model without sharing their private data.
We propose a novel federated learning algorithm with local drift decoupling and correction (FedDC).
Our FedDC only introduces lightweight modifications in the local training phase, in which each client utilizes an auxiliary local drift variable to track the gap between the local model parameters and the global model parameters.
Experimental results and analysis demonstrate that FedDC yields faster convergence and better performance on various image classification tasks.
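A simplified reading of that drift-tracking mechanism: each client keeps an auxiliary variable recording its accumulated gap to the global parameters and penalises further drift during local training. The update rules below are illustrative, not FedDC's exact equations.

```python
# Sketch of drift decoupling and correction: an auxiliary drift variable tracks
# how far each client's parameters have moved from the global model, and a
# penalty keeps (local params + drift) close to the global objective.
import numpy as np

def local_round(global_w, drift, grad_fn, lr=0.1, lam=0.1, steps=20):
    w = global_w.copy()
    for _ in range(steps):
        # task gradient plus a correction keeping (w + drift) near the global model
        g = grad_fn(w) + lam * (w + drift - global_w)
        w -= lr * g
    drift = drift + (w - global_w)      # accumulate the parameter gap
    return w, drift

# Toy non-IID setup: two clients pull toward different optima
grads = [lambda w: w - np.array([1.0, 0.0]), lambda w: w - np.array([0.0, 1.0])]
drifts = [np.zeros(2), np.zeros(2)]
global_w = np.zeros(2)
for _ in range(5):
    results = [local_round(global_w, d, g) for d, g in zip(drifts, grads)]
    # server aggregates drift-corrected models (simplified)
    global_w = np.mean([w + d for (w, d) in results], axis=0)
    drifts = [d for (_, d) in results]
print(global_w)
```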
arXiv Detail & Related papers (2022-03-22T14:06:26Z) - Fine-tuning Global Model via Data-Free Knowledge Distillation for
Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose a data-free knowledge distillation method, FedFTG, to fine-tune the global model on the server.
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
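Server-side, data-free fine-tuning can be sketched as: a generator synthesizes pseudo-inputs, the uploaded client models form an ensemble teacher, and the aggregated global model is distilled on that synthetic data. The generator below is left untrained for brevity (FedFTG also optimises it), and all architectures are assumptions.

```python
# Sketch of data-free, server-side distillation: pseudo-inputs from a generator,
# ensemble soft labels from client models, and the global model as student.
# The adversarial training of the generator used by FedFTG is omitted.
import torch
import torch.nn as nn
import torch.nn.functional as F

gen = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 10))   # noise -> pseudo-input
client_models = [nn.Linear(10, 3) for _ in range(3)]                  # uploaded teachers
global_model = nn.Linear(10, 3)                                       # student to fine-tune

opt = torch.optim.SGD(global_model.parameters(), lr=0.05)
for _ in range(20):
    z = torch.randn(16, 8)
    with torch.no_grad():
        x_fake = gen(z)
        teacher = torch.stack([F.softmax(m(x_fake), dim=1)
                               for m in client_models]).mean(0)        # ensemble soft labels
    opt.zero_grad()
    loss = F.kl_div(F.log_softmax(global_model(x_fake), dim=1),
                    teacher, reduction="batchmean")
    loss.backward()
    opt.step()
print(round(loss.item(), 4))
```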
arXiv Detail & Related papers (2022-03-17T11:18:17Z) - Towards Fair Federated Learning with Zero-Shot Data Augmentation [123.37082242750866]
Federated learning has emerged as an important distributed learning paradigm, where a server aggregates a global model from many client-trained models while having no access to the client data.
We propose a novel federated learning system that employs zero-shot data augmentation on under-represented data to mitigate statistical heterogeneity and encourage more uniform accuracy performance across clients in federated networks.
We study two variants of this scheme, Fed-ZDAC (federated learning with zero-shot data augmentation at the clients) and Fed-ZDAS (federated learning with zero-shot data augmentation at the server).
arXiv Detail & Related papers (2021-04-27T18:23:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.