FAIR-BFL: Flexible and Incentive Redesign for Blockchain-based Federated
Learning
- URL: http://arxiv.org/abs/2206.12899v1
- Date: Sun, 26 Jun 2022 15:20:45 GMT
- Title: FAIR-BFL: Flexible and Incentive Redesign for Blockchain-based Federated
Learning
- Authors: Rongxin Xu, Shiva Raj Pokhrel, Qiujun Lan, and Gang Li
- Abstract summary: Vanilla Federated learning (FL) relies on the centralized global aggregation mechanism and assumes that all clients are honest.
This leaves FL vulnerable to a single point of failure and to dishonest clients.
We design and evaluate FAIR-BFL, a novel BFL framework that resolves the identified challenges in vanilla BFL with greater flexibility and a redesigned incentive mechanism.
- Score: 19.463891024499773
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Vanilla Federated learning (FL) relies on the centralized global aggregation
mechanism and assumes that all clients are honest. This leaves FL vulnerable
to a single point of failure and to dishonest clients. These
impending challenges in the design philosophy of FL call for blockchain-based
federated learning (BFL) due to the benefits of coupling FL and blockchain
(e.g., democracy, incentive, and immutability). However, one problem in vanilla
BFL is that its capabilities do not follow adopters' needs in a dynamic
fashion. Besides, vanilla BFL relies on clients' unverifiable self-reported
contributions, such as data size, because inspecting clients' raw data is not
allowed in FL due to privacy concerns. We design and evaluate FAIR-BFL, a novel
BFL framework that resolves the identified challenges in vanilla BFL with
greater flexibility and a redesigned incentive mechanism. In contrast to
existing works, FAIR-BFL
offers unprecedented flexibility via the modular design, allowing adopters to
adjust its capabilities following business demands in a dynamic fashion. Our
design accounts for BFL's ability to quantify each client's contribution to the
global learning process. Such quantification provides a rational metric for
distributing the rewards among federated clients and helps discover malicious
participants that may poison the global model.
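To make the reward-and-detection idea above concrete, here is a minimal sketch of contribution-weighted reward splitting with a simple statistical outlier check; the scoring rule, z-score cutoff, and reward pool are illustrative assumptions, not FAIR-BFL's actual mechanism.

```python
# Minimal sketch (illustrative, not FAIR-BFL's actual rule): split a reward
# pool proportionally to per-client contribution scores and flag clients
# whose scores fall far below the cohort average as potentially malicious.
from statistics import mean, pstdev

def split_rewards(contributions, reward_pool, z_cutoff=2.0):
    """Return (rewards per client, flagged clients) for one round."""
    scores = list(contributions.values())
    mu, sigma = mean(scores), pstdev(scores)
    flagged = [c for c, s in contributions.items()
               if sigma > 0 and (mu - s) / sigma > z_cutoff]
    eligible = {c: s for c, s in contributions.items() if c not in flagged}
    total = sum(eligible.values()) or 1.0
    rewards = {c: reward_pool * s / total for c, s in eligible.items()}
    return rewards, flagged

# Example with hypothetical per-round contribution scores.
rewards, flagged = split_rewards(
    {"client_a": 0.34, "client_b": 0.31, "client_c": 0.02}, reward_pool=100.0)
print(rewards, flagged)
```

In practice, the contribution scores would come from the framework's own quantification step rather than being supplied directly as plain numbers.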
Related papers
- Enhancing Trust and Privacy in Distributed Networks: A Comprehensive Survey on Blockchain-based Federated Learning [51.13534069758711]
Decentralized approaches like blockchain offer a compelling solution by implementing a consensus mechanism among multiple entities.
Federated Learning (FL) enables participants to collaboratively train models while safeguarding data privacy.
This paper investigates the synergy between blockchain's security features and FL's privacy-preserving model training capabilities.
arXiv Detail & Related papers (2024-03-28T07:08:26Z)
- Blockchain-empowered Federated Learning: Benefits, Challenges, and Solutions [31.18229828293164]
Federated learning (FL) is a distributed machine learning approach that protects user data privacy by training models locally on clients and aggregating them on a parameter server.
While effective at preserving privacy, FL systems face limitations such as single points of failure, lack of incentives, and inadequate security.
To address these challenges, blockchain technology is integrated into FL systems to provide stronger security, fairness, and scalability.
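The local-training-plus-aggregation loop described above is commonly realized as data-size-weighted federated averaging; the sketch below illustrates only that aggregation step, with placeholder weights and sample counts rather than any specific BFL system's values.

```python
# Minimal sketch of the server-side aggregation step mentioned above
# (FedAvg-style weighted averaging); weights and sample counts are
# placeholder values, not tied to any particular BFL system.
import numpy as np

def federated_average(client_weights, client_sample_counts):
    """Average client parameter vectors, weighted by local data size."""
    coeffs = np.array(client_sample_counts) / sum(client_sample_counts)
    return coeffs @ np.stack(client_weights)

global_params = federated_average(
    client_weights=[np.array([0.1, 0.5]), np.array([0.3, 0.7])],
    client_sample_counts=[200, 800])
print(global_params)  # -> [0.26 0.66]
```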
arXiv Detail & Related papers (2024-03-01T07:41:05Z)
- Bayesian Federated Learning: A Survey [54.40136267717288]
Federated learning (FL) demonstrates its advantages in integrating distributed infrastructure, communication, computing and learning in a privacy-preserving manner.
The robustness and capabilities of existing FL methods are challenged by limited and dynamic data and conditions.
Bayesian federated learning (BFL) has emerged as a promising approach to address these issues.
arXiv Detail & Related papers (2023-04-26T03:41:17Z)
- A Fast Blockchain-based Federated Learning Framework with Compressed Communications [14.344080339573278]
Recently, blockchain-based federated learning (BFL) has attracted intensive research attention.
In this paper, we propose a fast blockchain-based FL framework called BCFL to improve the training efficiency of BFL in practice.
arXiv Detail & Related papers (2022-08-12T03:04:55Z)
- A Systematic Survey of Blockchained Federated Learning [22.710611199826925]
Federated learning (FL) can prevent privacy leakage by assigning training tasks to multiple clients.
FL still suffers from shortcomings such as a single point of failure and malicious data.
The emergence of blockchain provides a secure and efficient solution for the deployment of FL.
arXiv Detail & Related papers (2021-10-05T17:21:52Z)
- Blockchain Assisted Decentralized Federated Learning (BLADE-FL): Performance Analysis and Resource Allocation [119.19061102064497]
We propose a decentralized FL framework by integrating blockchain into FL, namely, blockchain assisted decentralized federated learning (BLADE-FL).
In a round of the proposed BLADE-FL, each client broadcasts its trained model to other clients, competes to generate a block based on the received models, and then aggregates the models from the generated block before its local training of the next round.
We explore the impact of lazy clients on the learning performance of BLADE-FL, and characterize the relationship among the optimal K, the learning parameters, and the proportion of lazy clients.
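The round structure summarized above can be pictured with a minimal sketch; the random winner selection and scalar "models" below are illustrative simplifications, not the paper's actual block-generation rule.

```python
# Minimal sketch of one BLADE-FL round as summarized above: clients broadcast
# local models, one client wins block generation, and every client aggregates
# the models recorded in that block before its next local training round.
# The random winner and scalar models are illustrative simplifications.
import random

def blade_fl_round(local_models):
    received = dict(local_models)                      # 1) broadcast to all clients
    winner = random.choice(list(received))             # 2) competition for block generation
    block = {"generator": winner, "models": received}  #    winning client writes the block
    return sum(block["models"].values()) / len(block["models"])  # 3) aggregate from the block

print(blade_fl_round({"c1": 0.9, "c2": 1.1, "c3": 1.0}))  # -> 1.0
```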
arXiv Detail & Related papers (2021-01-18T07:19:08Z)
- Robust Blockchained Federated Learning with Model Validation and Proof-of-Stake Inspired Consensus [43.12040317316018]
Federated learning (FL) is a promising distributed learning solution that only exchanges model parameters without revealing raw data.
We propose a blockchain-based decentralized FL framework, termed VBFL, by exploiting two mechanisms in a blockchained architecture.
With 15% of malicious devices, VBFL achieves 87% accuracy, which is 7.4x higher than Vanilla FL.
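One way to picture the model-validation mechanism named in the title is validators scoring each submitted update on held-out data and discarding updates below a cutoff; the validate_fn and threshold here are hypothetical stand-ins, not VBFL's actual rule.

```python
# Hypothetical sketch of validation-based filtering: only client updates whose
# validation accuracy clears a cutoff enter aggregation. validate_fn and the
# cutoff are illustrative assumptions, not VBFL's exact mechanism.
def filter_updates(updates, validate_fn, min_accuracy=0.5):
    accepted, rejected = {}, {}
    for client_id, update in updates.items():
        acc = validate_fn(update)                 # score the update on held-out data
        (accepted if acc >= min_accuracy else rejected)[client_id] = acc
    return accepted, rejected

# Toy example: the validator treats the update value itself as its accuracy.
accepted, rejected = filter_updates({"c1": 0.82, "c2": 0.12}, validate_fn=lambda u: u)
print(accepted, rejected)  # -> {'c1': 0.82} {'c2': 0.12}
```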
arXiv Detail & Related papers (2021-01-09T06:30:38Z)
- Blockchain Assisted Decentralized Federated Learning (BLADE-FL) with Lazy Clients [124.48732110742623]
We propose a novel framework by integrating blockchain into Federated Learning (FL).
BLADE-FL performs well in terms of privacy preservation, tamper resistance, and effective cooperation of learning.
However, it gives rise to a new problem of training deficiency, caused by lazy clients who plagiarize others' trained models and add artificial noise to conceal their cheating behavior.
arXiv Detail & Related papers (2020-12-02T12:18:27Z)
- GFL: A Decentralized Federated Learning Framework Based On Blockchain [15.929643607462353]
We propose the Galaxy Federated Learning framework (GFL), a decentralized FL framework based on blockchain.
GFL introduces the consistent hashing algorithm to improve communication performance and proposes a novel ring decentralized FL algorithm (RDFL) to improve decentralized FL performance and bandwidth utilization.
Our experiments show that GFL improves communication performance and decentralized FL performance under data poisoning by malicious nodes and non-independent and identically distributed (Non-IID) datasets.
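Consistent hashing, mentioned above, typically places nodes on a hash ring and routes each key (e.g., a model or task identifier) to the first node clockwise from its hash; the sketch below is a generic illustration, not GFL's exact construction.

```python
# Generic consistent-hashing sketch (illustrative, not GFL's exact scheme):
# nodes sit on a hash ring and each key is routed to the first node whose
# hash is >= the key's hash, wrapping around the ring.
import hashlib
from bisect import bisect

def _h(key):
    return int(hashlib.md5(key.encode()).hexdigest(), 16) % (2 ** 32)

class HashRing:
    def __init__(self, nodes):
        self.ring = sorted((_h(n), n) for n in nodes)

    def route(self, key):
        idx = bisect(self.ring, (_h(key), "")) % len(self.ring)
        return self.ring[idx][1]

ring = HashRing(["node-1", "node-2", "node-3"])
print(ring.route("model-shard-7"))  # deterministic node assignment for this key
```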
arXiv Detail & Related papers (2020-10-21T13:36:59Z)
- Resource Management for Blockchain-enabled Federated Learning: A Deep Reinforcement Learning Approach [54.29213445674221]
Blockchain-enabled Federated Learning (BFL) enables mobile devices to collaboratively train neural network models required by a Machine Learning Model Owner (MLMO).
One issue with BFL is that mobile devices have energy and CPU constraints that may reduce the system lifetime and training efficiency.
We propose to use Deep Reinforcement Learning (DRL) to derive the optimal decisions for the MLMO.
arXiv Detail & Related papers (2020-04-08T16:29:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.