Multi-dimensional Data Quick Query for Blockchain-based Federated Learning
- URL: http://arxiv.org/abs/2309.15348v1
- Date: Wed, 27 Sep 2023 01:35:11 GMT
- Title: Multi-dimensional Data Quick Query for Blockchain-based Federated Learning
- Authors: Jiaxi Yang, Sheng Cao, Peng Xiangli, Xiong Li, Xiaosong Zhang
- Abstract summary: We propose a novel data structure, named MerkleRB-Tree, to improve the query efficiency within each block.
In detail, we leverage Minimal Bounding Rectangles (MBRs) and Bloom filters for the query process of multi-dimensional continuous-valued attributes and discrete-valued attributes, respectively.
- Score: 6.499393722730449
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Due to the drawbacks of centralized Federated Learning (FL), such as the vulnerability of a single central server, FL is shifting from a centralized to a decentralized paradigm that takes advantage of blockchain. A key enabler for the adoption of blockchain-based federated learning is how to select suitable participants to train models collaboratively. Selecting participants by storing and querying the metadata of data owners on the blockchain can ensure the reliability of the selected data owners, which helps obtain high-quality models in FL. However, querying multi-dimensional metadata on the blockchain requires traversing every transaction in each block, making queries time-consuming. An efficient query method for multi-dimensional metadata on the blockchain for selecting participants in FL is still absent and challenging. In this paper, we propose a novel data structure, named MerkleRB-Tree, to improve the query efficiency within each block. In detail, we leverage Minimal Bounding Rectangles (MBRs) and Bloom filters for the query process of multi-dimensional continuous-valued attributes and discrete-valued attributes, respectively. Furthermore, we adapt the idea of the skip list, placing an MBR and a Bloom filter at the head of each block, to enhance the query efficiency across blocks. The performance analysis and extensive evaluation results on a benchmark dataset demonstrate the superiority of our method in blockchain-based FL.
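As a rough illustration of the query idea described in the abstract, the sketch below (plain Python, not the paper's implementation) shows how a tree node that carries a Minimal Bounding Rectangle over continuous-valued attributes and a Bloom filter over discrete-valued attributes can prune whole subtrees during a query, and how the same pair placed in a block header lets non-matching blocks be skipped. All class and function names, the hash-based Bloom filter, and the query interface are assumptions made for this sketch; the Merkle hashes that the MerkleRB-Tree presumably adds for verifiability are omitted, and the inter-block part scans block headers linearly instead of maintaining skip-list pointers.

```python
import hashlib


class BloomFilter:
    """Tiny Bloom filter for discrete-valued attributes (illustrative only)."""

    def __init__(self, m=256, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        return all((self.bits >> p) & 1 for p in self._positions(item))


class MBR:
    """Minimal Bounding Rectangle over continuous-valued attributes."""

    def __init__(self, lows, highs):
        self.lows, self.highs = list(lows), list(highs)

    def intersects(self, q_lows, q_highs):
        return all(lo <= qh and hi >= ql
                   for lo, hi, ql, qh in zip(self.lows, self.highs, q_lows, q_highs))


class Node:
    """A node of a simplified MBR + Bloom-filter tree inside one block."""

    def __init__(self, mbr, bloom, children=None, records=None):
        self.mbr, self.bloom = mbr, bloom
        self.children = children or []   # internal node: child Nodes
        self.records = records or []     # leaf node: (point, tags, payload) tuples


def query_node(node, q_lows, q_highs, q_tag):
    """Return payloads whose continuous attributes lie in the query box and whose
    discrete attributes contain q_tag, pruning subtrees via the MBR/Bloom summaries."""
    if not node.mbr.intersects(q_lows, q_highs) or not node.bloom.might_contain(q_tag):
        return []                        # the whole subtree is ruled out
    if node.records:                     # leaf: verify the candidates exactly
        return [payload for point, tags, payload in node.records
                if q_tag in tags
                and all(ql <= x <= qh for x, ql, qh in zip(point, q_lows, q_highs))]
    results = []
    for child in node.children:
        results.extend(query_node(child, q_lows, q_highs, q_tag))
    return results


def query_chain(blocks, q_lows, q_highs, q_tag):
    """Inter-block filtering: each block header carries an MBR and a Bloom filter
    summarising the block, so non-matching blocks are skipped without descending."""
    hits = []
    for (header_mbr, header_bloom), root in blocks:
        if header_mbr.intersects(q_lows, q_highs) and header_bloom.might_contain(q_tag):
            hits.extend(query_node(root, q_lows, q_highs, q_tag))
    return hits


# Toy usage: one block with a single leaf holding one data owner's metadata.
bf = BloomFilter(); bf.add("cifar10")
leaf = Node(MBR([0.8], [0.9]), bf, records=[([0.85], {"cifar10"}, "owner-42")])
header = (MBR([0.8], [0.9]), bf)
print(query_chain([(header, leaf)], [0.8], [0.9], "cifar10"))  # -> ['owner-42']
```

The point of the sketch is the pruning logic: a query only descends into subtrees (or blocks) whose MBR intersects the query box and whose Bloom filter reports a possible match for the queried discrete attribute.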
Related papers
- Enhancing Trust and Privacy in Distributed Networks: A Comprehensive Survey on Blockchain-based Federated Learning [51.13534069758711]
Decentralized approaches like blockchain offer a compelling solution by implementing a consensus mechanism among multiple entities.
Federated Learning (FL) enables participants to collaboratively train models while safeguarding data privacy.
This paper investigates the synergy between blockchain's security features and FL's privacy-preserving model training capabilities.
arXiv Detail & Related papers (2024-03-28T07:08:26Z) - BlockFUL: Enabling Unlearning in Blockchained Federated Learning [26.47424619448623]
Unlearning in Federated Learning (FL) presents significant challenges, as models grow and evolve with complex inheritance relationships.
In this paper, we introduce a novel framework with a dual-chain structure, comprising a live chain and an archive chain, for enabling unlearning capabilities within FL.
Two new unlearning paradigms, i.e., parallel and sequential paradigms, can be effectively implemented through gradient-ascent-based and re-training-based unlearning methods.
Our experiments validate that these methods effectively reduce data dependency and operational overhead, thereby boosting the overall performance of unlearning inherited models within BlockFUL.
arXiv Detail & Related papers (2024-02-26T04:31:53Z) - Robust softmax aggregation on blockchain based federated learning with convergence guarantee [11.955062839855334]
We propose a blockchain-based federated learning framework with softmax aggregation.
First, we propose a new blockchain-based federated learning architecture that utilizes the well-tested proof-of-stake consensus mechanism.
Second, to ensure the robustness of the aggregation process, we design a novel softmax aggregation method.
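For intuition about the softmax aggregation framework above, a generic softmax-weighted combination of client updates might look like the sketch below. This is an illustrative assumption, not the authors' aggregation rule, and the per-client scores are a placeholder for whatever robustness measure the framework actually computes.

```python
import numpy as np


def softmax_aggregate(client_updates, scores, temperature=1.0):
    """Combine client model updates with softmax weights derived from per-client
    scores (a stand-in for a robustness or similarity score); illustrative only."""
    s = np.asarray(scores, dtype=float) / temperature
    w = np.exp(s - s.max())              # numerically stable softmax
    w /= w.sum()
    stacked = np.stack(client_updates)   # shape: (n_clients, n_params)
    return (w[:, None] * stacked).sum(axis=0)


# Toy usage: three flattened parameter vectors; the outlier gets a low score
# and therefore a near-zero weight in the aggregate.
updates = [np.array([1.0, 2.0]), np.array([1.1, 1.9]), np.array([10.0, -5.0])]
print(softmax_aggregate(updates, scores=[2.0, 2.0, -3.0]))
```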
arXiv Detail & Related papers (2023-11-13T02:25:52Z) - The Implications of Decentralization in Blockchained Federated Learning: Evaluating the Impact of Model Staleness and Inconsistencies [2.6391879803618115]
We study the practical implications of outsourcing the orchestration of federated learning to a democratic setting such as in a blockchain.
Using simulation, we evaluate the blockchained FL operation by applying two different ML models on the well-known MNIST and CIFAR-10 datasets.
Our results show the high impact of model inconsistencies on the accuracy of the models (up to a 35% decrease in prediction accuracy).
arXiv Detail & Related papers (2023-10-11T13:18:23Z) - Incentive Mechanism Design for Joint Resource Allocation in
Blockchain-based Federated Learning [23.64441447666488]
We propose an incentive mechanism to assign each client appropriate rewards for training and mining.
We transform the Stackelberg game model into two optimization problems, which are sequentially solved to derive the optimal strategies for both the model owner and clients.
arXiv Detail & Related papers (2022-02-18T02:19:26Z) - Multi-Center Federated Learning [62.32725938999433]
Federated learning (FL) can protect data privacy in distributed learning.
It merely collects local gradients from users without access to their data.
We propose a novel multi-center aggregation mechanism.
arXiv Detail & Related papers (2021-08-19T12:20:31Z) - Blockchain Assisted Decentralized Federated Learning (BLADE-FL):
Performance Analysis and Resource Allocation [119.19061102064497]
We propose a decentralized FL framework by integrating blockchain into FL, namely blockchain-assisted decentralized federated learning (BLADE-FL).
In a round of the proposed BLADE-FL, each client broadcasts its trained model to other clients, competes to generate a block based on the received models, and then aggregates the models from the generated block before its local training of the next round.
We explore the impact of lazy clients on the learning performance of BLADE-FL, and characterize the relationship among the optimal K, the learning parameters, and the proportion of lazy clients.
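A heavily simplified sketch of one such round (local training, broadcast, block generation, aggregation from the block) is given below. It is illustrative only, not the BLADE-FL implementation: the random choice of block producer stands in for the actual mining competition, and lazy-client behavior is not modeled.

```python
import random
import numpy as np


def blade_fl_round(local_models, train_fn):
    """One heavily simplified round in the style described above: clients train and
    broadcast their models, one client generates the block, and every client then
    aggregates the models recorded in that block before the next round. Consensus,
    verification and lazy-client handling are all abstracted away."""
    # 1) Local training; the broadcast is modelled as simply collecting the results.
    broadcast = [train_fn(m) for m in local_models]
    # 2) Block generation: a random winner stands in for the mining competition.
    winner = random.randrange(len(broadcast))
    block = {"producer": winner, "models": broadcast}
    # 3) Each client aggregates the block's models (plain averaging here).
    aggregated = np.mean(np.stack(block["models"]), axis=0)
    return [aggregated.copy() for _ in local_models], block


# Toy usage: 'models' are flat parameter vectors, 'training' adds a small update.
models = [np.zeros(3) for _ in range(4)]
models, blk = blade_fl_round(models, train_fn=lambda m: m + 0.1 * np.random.randn(3))
```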
arXiv Detail & Related papers (2021-01-18T07:19:08Z) - Blockchain Assisted Decentralized Federated Learning (BLADE-FL) with
Lazy Clients [124.48732110742623]
We propose a novel framework by integrating blockchain into Federated Learning (FL).
BLADE-FL has a good performance in terms of privacy preservation, tamper resistance, and effective cooperation of learning.
However, it also gives rise to a new problem of training deficiency, caused by lazy clients who plagiarize others' trained models and add artificial noise to conceal their cheating behavior.
arXiv Detail & Related papers (2020-12-02T12:18:27Z) - Resource Management for Blockchain-enabled Federated Learning: A Deep
Reinforcement Learning Approach [54.29213445674221]
Blockchain-enabled Federated Learning (BFL) enables mobile devices to collaboratively train neural network models required by a Machine Learning Model Owner (MLMO).
The issue of BFL is that the mobile devices have energy and CPU constraints that may reduce the system lifetime and training efficiency.
We propose to use Deep Reinforcement Learning (DRL) to derive the optimal decisions for the MLMO.
arXiv Detail & Related papers (2020-04-08T16:29:19Z) - Distributed Optimization over Block-Cyclic Data [48.317899174302305]
We consider practical data characteristics underlying federated learning, where unbalanced and non-i.i.d. data from clients have a block-cyclic structure.
We propose two new distributed optimization algorithms called multi-model parallel SGD (MM-PSGD) and multi-chain parallel SGD (MC-PSGD).
Our algorithms significantly outperform the conventional federated averaging algorithm in terms of test accuracy, and also preserve robustness for the variance of critical parameters.
arXiv Detail & Related papers (2020-02-18T09:47:15Z)