BRFL: A Blockchain-based Byzantine-Robust Federated Learning Model
- URL: http://arxiv.org/abs/2310.13403v1
- Date: Fri, 20 Oct 2023 10:21:50 GMT
- Title: BRFL: A Blockchain-based Byzantine-Robust Federated Learning Model
- Authors: Yang Li, Chunhe Xia, Chang Li, Tianbo Wang
- Abstract summary: Federated learning, which stores data in distributed nodes and shares only model parameters, has gained significant attention for addressing this concern.
A challenge arises in federated learning due to the Byzantine Attack Problem, where malicious local models can compromise the global model's performance during aggregation.
This article proposes the Blockchain-based Byzantine-Robust Federated Learning (BRFL) model, which combines federated learning with blockchain technology.
- Score: 8.19957400564017
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the increasing importance of machine learning, the privacy and security
of training data have become critical. Federated learning, which stores data in
distributed nodes and shares only model parameters, has gained significant
attention for addressing this concern. However, a challenge arises in federated
learning due to the Byzantine Attack Problem, where malicious local models can
compromise the global model's performance during aggregation. This article
proposes the Blockchain-based Byzantine-Robust Federated Learning (BRFL) model
that combines federated learning with blockchain technology. This integration
enables traceability of malicious models and provides incentives for locally
trained clients. Our approach selects the aggregation node based on
Pearson's correlation coefficient, performs spectral clustering, and
calculates the average gradient within each cluster, validating its
accuracy on the local datasets of the aggregation nodes. Experimental
results on public datasets demonstrate the superior Byzantine robustness
of our secure aggregation algorithm compared to other baseline
Byzantine-robust aggregation methods, and the effectiveness of our
proposed model in addressing the resource-consumption problem.
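The aggregation pipeline described in the abstract can be sketched as follows. This is a minimal illustration only: the function names, the use of NumPy/scikit-learn, and the shifted-Pearson affinity are assumptions, not the authors' implementation, and the accuracy-validation step on the aggregation node's local dataset is only noted in a comment.

```python
# Hedged sketch of a BRFL-style secure aggregation step: pick an
# aggregation node by mean pairwise Pearson correlation, spectral-cluster
# the flattened gradient updates, and average within each cluster.
import numpy as np
from sklearn.cluster import SpectralClustering

def select_aggregation_node(updates):
    """Return the index of the client whose update has the highest mean
    Pearson correlation with all other updates (one plausible reading of
    the paper's Pearson-based selection)."""
    corr = np.corrcoef(updates)                      # pairwise Pearson matrix
    mean_corr = (corr.sum(axis=1) - 1) / (len(updates) - 1)
    return int(np.argmax(mean_corr))

def cluster_and_average(updates, n_clusters=2, seed=0):
    """Spectral-cluster the updates and average within each cluster.
    An honest aggregator would then validate each cluster mean on its
    local dataset and keep the best-performing one (omitted here)."""
    sim = (np.corrcoef(updates) + 1) / 2             # shift Pearson into [0, 1]
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed",
                                random_state=seed).fit_predict(sim)
    return {c: updates[labels == c].mean(axis=0) for c in set(labels)}
```

Under this sketch, a malicious update that is uncorrelated with the honest ones ends up in its own cluster, so its gradient never contaminates the honest cluster's average.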
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z) - Fake It Till Make It: Federated Learning with Consensus-Oriented Generation [52.82176415223988]
We propose federated learning with consensus-oriented generation (FedCOG)
FedCOG consists of two key components at the client side: complementary data generation and knowledge-distillation-based model training.
Experiments on classical and real-world FL datasets show that FedCOG consistently outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-12-10T18:49:59Z) - Federated cINN Clustering for Accurate Clustered Federated Learning [33.72494731516968]
Federated Learning (FL) presents an innovative approach to privacy-preserving distributed machine learning.
We propose the Federated cINN Clustering Algorithm (FCCA) to robustly cluster clients into different groups.
arXiv Detail & Related papers (2023-09-04T10:47:52Z) - Reinforcement Federated Learning Method Based on Adaptive OPTICS Clustering [19.73560248813166]
This paper proposes an adaptive OPTICS clustering algorithm for federated learning.
By perceiving the clustering environment as a Markov decision process, the goal is to find the best parameters of the OPTICS cluster.
The reliability and practicability of this method have been verified on the experimental data, and its effectiveness and superiority have been proved.
arXiv Detail & Related papers (2023-06-22T13:11:19Z) - Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning [112.69497636932955]
Federated learning aims to train models across different clients without the sharing of data for privacy considerations.
We study how data heterogeneity affects the representations of the globally aggregated models.
We propose FedDecorr, a novel method that can effectively mitigate dimensional collapse in federated learning.
arXiv Detail & Related papers (2022-10-01T09:04:17Z) - Fed-CBS: A Heterogeneity-Aware Client Sampling Mechanism for Federated Learning via Class-Imbalance Reduction [76.26710990597498]
We show that the class-imbalance of the grouped data from randomly selected clients can lead to significant performance degradation.
Based on our key observation, we design an efficient client sampling mechanism, i.e., Federated Class-balanced Sampling (Fed-CBS)
In particular, we propose a measure of class-imbalance and then employ homomorphic encryption to derive this measure in a privacy-preserving way.
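As a concrete (if simplified) illustration of such a measure, one natural choice is the squared distance of the grouped label distribution from uniform. This is a hypothetical stand-in: Fed-CBS's exact measure and its homomorphic-encryption evaluation are not reproduced here.

```python
# Toy class-imbalance measure in the spirit of Fed-CBS: squared L2
# distance between the empirical class distribution of the grouped
# client data and the uniform distribution (0 = perfectly balanced).
from collections import Counter

def class_imbalance(labels, num_classes):
    counts = Counter(labels)
    p = [counts.get(c, 0) / len(labels) for c in range(num_classes)]
    return sum((pi - 1.0 / num_classes) ** 2 for pi in p)
```

A client sampler could then prefer groups of clients whose pooled labels minimize this value; computing it under homomorphic encryption, as the paper proposes, keeps each client's label distribution private.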
arXiv Detail & Related papers (2022-09-30T05:42:56Z) - Robust Federated Learning via Over-The-Air Computation [48.47690125123958]
Simple averaging of model updates via over-the-air computation makes the learning task vulnerable to random or intended modifications of the local model updates of some malicious clients.
We propose a transmission and aggregation framework that is robust to such attacks while preserving the benefits of over-the-air computation for federated learning.
arXiv Detail & Related papers (2021-11-01T19:21:21Z) - RobustFed: A Truth Inference Approach for Robust Federated Learning [9.316565110931743]
Federated learning is a framework that enables clients to collaboratively train a global model under a central server's orchestration.
The aggregation step in federated learning is vulnerable to adversarial attacks as the central server cannot manage clients' behavior.
We propose a novel robust aggregation algorithm inspired by the truth inference methods in crowdsourcing.
arXiv Detail & Related papers (2021-07-18T09:34:57Z) - Blockchain Assisted Decentralized Federated Learning (BLADE-FL): Performance Analysis and Resource Allocation [119.19061102064497]
We propose a decentralized FL framework by integrating blockchain into FL, namely, blockchain assisted decentralized federated learning (BLADE-FL)
In a round of the proposed BLADE-FL, each client broadcasts its trained model to other clients, competes to generate a block based on the received models, and then aggregates the models from the generated block before its local training of the next round.
We explore the impact of lazy clients on the learning performance of BLADE-FL, and characterize the relationship among the optimal K, the learning parameters, and the proportion of lazy clients.
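A toy, scalar-weight rendering of one such round is sketched below. Both simplifications are assumptions, not BLADE-FL protocol details: block competition is reduced to picking a random winner, and a lazy client is modeled as broadcasting a zero (untrained) model.

```python
# Toy single-round BLADE-FL simulation with scalar "models".
import random

def blade_fl_round(local_models, lazy=frozenset()):
    # Each client broadcasts its trained model; lazy clients skip
    # local training and broadcast a stale (here: zero) model.
    broadcast = {i: (0.0 if i in lazy else m) for i, m in local_models.items()}
    # Clients compete to generate a block; modeled as a random winner.
    winner = random.choice(sorted(broadcast))
    # Every client aggregates the models recorded in the block before
    # starting its local training for the next round.
    global_model = sum(broadcast.values()) / len(broadcast)
    return winner, global_model
```

Even this toy version shows the effect the paper analyzes: lazy clients drag the aggregate toward a stale model, degrading learning performance as their proportion grows.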
arXiv Detail & Related papers (2021-01-18T07:19:08Z) - Byzantine-Resilient Secure Federated Learning [2.578242050187029]
This paper presents the first single-server Byzantine-resilient secure aggregation framework (BREA) for secure federated learning.
BREA is based on an integrated approach of verifiable detection and secure model aggregation to guarantee Byzantine resilience and convergence simultaneously.
Our experiments demonstrate convergence in the presence of Byzantine users, and comparable accuracy to conventional federated learning benchmarks.
arXiv Detail & Related papers (2020-07-21T22:15:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.