B^2SFL: A Bi-level Blockchained Architecture for Secure Federated
Learning-based Traffic Prediction
- URL: http://arxiv.org/abs/2310.14669v1
- Date: Mon, 23 Oct 2023 08:06:05 GMT
- Title: B^2SFL: A Bi-level Blockchained Architecture for Secure Federated
Learning-based Traffic Prediction
- Authors: Hao Guo, Collin Meese, Wanxin Li, Chien-Chung Shen, Mark Nejad
- Abstract summary: Federated Learning (FL) is a privacy-preserving machine learning technology.
Security and privacy guarantees could be compromised due to malicious participants and the centralized FL server.
This article proposes a bi-level blockchained architecture for secure federated learning-based traffic prediction.
- Score: 4.3030251749726345
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated Learning (FL) is a privacy-preserving machine learning (ML)
technology that enables collaborative training and learning of a global ML
model based on aggregating distributed local model updates. However, security
and privacy guarantees could be compromised due to malicious participants and
the centralized FL server. This article proposes a bi-level blockchained
architecture for secure federated learning-based traffic prediction. The
bottom- and top-layer blockchains store the local model parameters and the
global aggregated parameters, respectively, and the distributed
homomorphic-encrypted federated averaging
(DHFA) scheme addresses the secure computation problems. We propose the partial
private key distribution protocol and a partially homomorphic
encryption/decryption scheme to achieve the distributed privacy-preserving
federated averaging model. We conduct extensive experiments to measure the
running time of DHFA operations, quantify the read and write performance of the
blockchain network, and elucidate the impacts of varying regional group sizes
and model complexities on the resulting prediction accuracy for the online
traffic flow prediction task. The results indicate that the proposed system can
facilitate secure and decentralized federated learning for real-world traffic
prediction tasks.
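The DHFA scheme builds on a partially (additively) homomorphic cryptosystem, under which multiplying ciphertexts adds their plaintexts, so encrypted local updates can be summed without any party seeing individual models. A toy Paillier-style sketch of that property (hard-coded small primes and a single key holder, unlike the paper's partial private key distribution protocol; all names are illustrative):

```python
import random
from math import gcd

# Toy Paillier cryptosystem (additively homomorphic). Small primes for
# illustration only; real deployments use >= 2048-bit moduli and, as in
# the paper, distribute the private key across participants.

def keygen(p=293, q=433):
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^(-1) mod n, with L(x) = (x - 1) // n
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    x = pow(c, lam, n * n)
    return ((x - 1) // n) * mu % n

pk, sk = keygen()
# Three clients' local updates for one parameter, scaled to integers.
updates = [17, 42, 8]
ciphertexts = [encrypt(pk, u) for u in updates]

# Homomorphic addition: multiplying ciphertexts adds the plaintexts,
# so the aggregator never handles an individual update in the clear.
c_sum = 1
for c in ciphertexts:
    c_sum = (c_sum * c) % (pk[0] ** 2)

total = decrypt(pk, sk, c_sum)  # sum of the plaintext updates
avg = total / len(updates)      # federated average after decryption
print(total, avg)
```

In the paper's DHFA setting, the final decryption itself is distributed via partial private keys, so no single party can open an individual client's ciphertext.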
Related papers
- Enhancing Trust and Privacy in Distributed Networks: A Comprehensive Survey on Blockchain-based Federated Learning [51.13534069758711]
Decentralized approaches like blockchain offer a compelling solution by implementing a consensus mechanism among multiple entities.
Federated Learning (FL) enables participants to collaboratively train models while safeguarding data privacy.
This paper investigates the synergy between blockchain's security features and FL's privacy-preserving model training capabilities.
arXiv Detail & Related papers (2024-03-28T07:08:26Z)
- PPBFL: A Privacy Protected Blockchain-based Federated Learning Model [6.278098707317501]
We propose a Privacy Protected Blockchain-based Federated Learning Model (PPBFL) to enhance the security of federated learning.
We introduce a Proof of Training Work (PoTW) algorithm tailored for federated learning, aiming to incentivize training nodes.
We also propose a new mix transactions mechanism utilizing ring signature technology to better protect the identity privacy of local training clients.
arXiv Detail & Related papers (2024-01-02T13:13:28Z)
- The Implications of Decentralization in Blockchained Federated Learning: Evaluating the Impact of Model Staleness and Inconsistencies [2.6391879803618115]
We study the practical implications of outsourcing the orchestration of federated learning to a democratic setting such as in a blockchain.
Using simulation, we evaluate the blockchained FL operation by applying two different ML models on the well-known MNIST and CIFAR-10 datasets.
Our results show the high impact of model inconsistencies on the accuracy of the models (up to a 35% decrease in prediction accuracy).
arXiv Detail & Related papers (2023-10-11T13:18:23Z)
- Federated Nearest Neighbor Machine Translation [66.8765098651988]
In this paper, we propose a novel federated nearest neighbor (FedNN) machine translation framework.
FedNN leverages one-round memorization-based interaction to share knowledge across different clients.
Experiments show that FedNN significantly reduces computational and communication costs compared with FedAvg.
arXiv Detail & Related papers (2023-02-23T18:04:07Z)
- RoFL: Attestable Robustness for Secure Federated Learning [59.63865074749391]
Federated Learning allows a large number of clients to train a joint model without the need to share their private data.
To ensure the confidentiality of the client updates, Federated Learning systems employ secure aggregation.
We present RoFL, a secure Federated Learning system that improves robustness against malicious clients.
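Secure aggregation of the kind these systems employ is commonly built from pairwise random masks that cancel in the sum: the server sees only masked updates, yet their total equals the true aggregate. A minimal single-process sketch, assuming the pairwise masks have already been shared (no key agreement, dropout handling, or robustness checks):

```python
import random

# Mask-based secure aggregation sketch: for each client pair (i, j),
# a shared random mask r_ij is added to i's update and subtracted
# from j's, so the masks cancel when the server sums everything.

def masked_updates(updates, seed=0):
    rng = random.Random(seed)  # stands in for pairwise shared randomness
    n = len(updates)
    masked = list(updates)
    for i in range(n):
        for j in range(i + 1, n):
            r = rng.uniform(-10, 10)  # shared mask r_ij
            masked[i] += r            # client i adds the mask
            masked[j] -= r            # client j subtracts it
    return masked

updates = [1.0, 2.0, 3.0]
masked = masked_updates(updates)
# Individual masked values reveal nothing about the true updates,
# but the sums agree, so the server can still aggregate.
print(sum(masked), sum(updates))
```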
arXiv Detail & Related papers (2021-07-07T15:42:49Z)
- Privacy-Preserved Blockchain-Federated-Learning for Medical Image Analysis Towards Multiple Parties [5.296010468961924]
This article designs a privacy-preserving framework based on federated learning and blockchain.
In the first step, we train the local model by using the capsule network for the segmentation and classification of the COVID-19 images.
In the second step, we secure the local model through the homomorphic encryption scheme.
arXiv Detail & Related papers (2021-04-22T07:32:04Z)
- Blockchain Assisted Decentralized Federated Learning (BLADE-FL): Performance Analysis and Resource Allocation [119.19061102064497]
We propose a decentralized FL framework by integrating blockchain into FL, namely, blockchain assisted decentralized federated learning (BLADE-FL)
In a round of the proposed BLADE-FL, each client broadcasts its trained model to other clients, competes to generate a block based on the received models, and then aggregates the models from the generated block before its local training of the next round.
We explore the impact of lazy clients on the learning performance of BLADE-FL, and characterize the relationship among the optimal K, the learning parameters, and the proportion of lazy clients.
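The BLADE-FL round summarized above can be sketched with scalar stand-ins for models, assuming an idealized block that collects every broadcast update (no mining race and no lazy clients; all names are illustrative, not from the paper):

```python
# One idealized BLADE-FL round: train locally, broadcast, form a block
# from the received models, then aggregate from the block.

def local_train(model, data):
    # Placeholder for local SGD: nudge the model toward the data mean.
    target = sum(data) / len(data)
    return model + 0.5 * (target - model)

def blade_fl_round(models, datasets):
    # 1) Each client trains and broadcasts its model to the others.
    broadcast = [local_train(m, d) for m, d in zip(models, datasets)]
    # 2) Clients compete to generate a block holding the received
    #    models; here the block simply contains all broadcast updates.
    block = list(broadcast)
    # 3) Every client aggregates the models from the block before its
    #    local training in the next round.
    aggregate = sum(block) / len(block)
    return [aggregate] * len(models)

models = [0.0, 0.0, 0.0]
datasets = [[1.0, 3.0], [2.0, 4.0], [3.0, 5.0]]
models = blade_fl_round(models, datasets)
print(models)
```

Lazy clients, in the paper's analysis, would skip step 1 and plagiarize a received model, which is what degrades the learning performance being characterized.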
arXiv Detail & Related papers (2021-01-18T07:19:08Z)
- WAFFLe: Weight Anonymized Factorization for Federated Learning [88.44939168851721]
In domains where data are sensitive or private, there is great value in methods that can learn in a distributed manner without the data ever leaving the local devices.
We propose Weight Anonymized Factorization for Federated Learning (WAFFLe), an approach that combines the Indian Buffet Process with a shared dictionary of weight factors for neural networks.
arXiv Detail & Related papers (2020-08-13T04:26:31Z)
- Privacy-preserving Traffic Flow Prediction: A Federated Learning Approach [61.64006416975458]
We propose a privacy-preserving machine learning technique named Federated Learning-based Gated Recurrent Unit neural network algorithm (FedGRU) for traffic flow prediction.
FedGRU differs from current centralized learning methods and updates universal learning models through a secure parameter aggregation mechanism.
It is shown that FedGRU's prediction accuracy is 90.96% higher than that of the advanced deep learning models.
arXiv Detail & Related papers (2020-03-19T13:07:49Z)
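The federated averaging step underlying FedGRU-style training weights each client's parameters by its local sample count; a plain (non-secure) sketch, with flat lists standing in for GRU weight tensors and an illustrative function name:

```python
# FedAvg-style weighted parameter aggregation. FedGRU replaces the
# plain exchange shown here with a secure aggregation mechanism.

def fed_avg(client_weights, client_sizes):
    # Weight each client's parameter vector by its share of the data.
    total = sum(client_sizes)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            agg[i] += (n / total) * w[i]
    return agg

# Three clients with two parameters each; the third has twice the data.
weights = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [100, 100, 200]
print(fed_avg(weights, sizes))
```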
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.