DID-eFed: Facilitating Federated Learning as a Service with
Decentralized Identities
- URL: http://arxiv.org/abs/2105.08671v2
- Date: Wed, 19 May 2021 07:44:07 GMT
- Authors: Jiahui Geng, Neel Kanwal, Martin Gilje Jaatun, Chunming Rong
- Abstract summary: Federated learning (FL) emerges as a functional solution to build high-performance models shared among multiple parties.
We present DID-eFed, where FL is facilitated by decentralized identities (DID) and a smart contract.
We describe particularly the scenario where our DID-eFed enables the FL among hospitals and research institutions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We have entered the era of big data, and it is considered to be the "fuel"
for the flourishing of artificial intelligence applications. The enactment of
the EU General Data Protection Regulation (GDPR) raises concerns about
individuals' privacy in big data. Federated learning (FL) emerges as a
functional solution that can help build high-performance models shared among
multiple parties while still complying with user privacy and data
confidentiality requirements. Although FL has been intensively studied and used
in real applications, there is still limited research on its prospects and
applications as FLaaS (Federated Learning as a Service) for interested third
parties. In this paper, we present a FLaaS system, DID-eFed, in which FL is
facilitated by decentralized identities (DID) and a smart contract. DID enables
a more flexible and credible decentralized access management in our system,
while the smart contract offers a frictionless and less error-prone process. We
describe particularly the scenario where our DID-eFed enables the FLaaS among
hospitals and research institutions.
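The abstract describes DID-based access management for enrolling parties such as hospitals into an FL task. As a rough illustration of that idea, the sketch below checks a signed enrollment request against a registered DID. The `DIDRegistry` class, the HMAC-based signature scheme, and all names are hypothetical simplifications for illustration, not the DID-eFed protocol itself, which in the paper is mediated by a smart contract.

```python
# Illustrative sketch of DID-based access control for an FLaaS enrollment
# request. DIDRegistry and the HMAC scheme are hypothetical simplifications,
# not the actual DID-eFed protocol.
import hashlib
import hmac

class DIDRegistry:
    """Maps a decentralized identifier (DID) to its verification key."""
    def __init__(self):
        self._keys = {}

    def register(self, did: str, key: bytes) -> None:
        self._keys[did] = key

    def verify(self, did: str, message: bytes, signature: bytes) -> bool:
        key = self._keys.get(did)
        if key is None:
            return False  # unknown identity -> deny access
        expected = hmac.new(key, message, hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

# A hospital registers its DID, then signs a request to join an FL task.
registry = DIDRegistry()
hospital_key = b"hospital-secret-key"
registry.register("did:example:hospital-1", hospital_key)

request = b"join-fl-task"
signature = hmac.new(hospital_key, request, hashlib.sha256).digest()
assert registry.verify("did:example:hospital-1", request, signature)
assert not registry.verify("did:example:unknown", request, signature)
```

In a real DID system, verification would use public-key signatures resolved from a DID document rather than a shared secret; HMAC is used here only to keep the sketch self-contained.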
Related papers
- Advances in APPFL: A Comprehensive and Extensible Federated Learning Framework [1.4206132527980742]
Federated learning (FL) is a distributed machine learning paradigm enabling collaborative model training while preserving data privacy.
We present the recent advances in developing APPFL, a framework and benchmarking suite for federated learning.
We demonstrate the capabilities of APPFL through extensive experiments evaluating various aspects of FL, including communication efficiency, privacy preservation, computational performance, and resource utilization.
arXiv Detail & Related papers (2024-09-17T22:20:26Z)
- Federated Learning for 6G: Paradigms, Taxonomy, Recent Advances and Insights [52.024964564408]
This paper examines the added-value of implementing Federated Learning throughout all levels of the protocol stack.
It presents important FL applications, addresses hot topics, provides valuable insights, and offers explicit guidance for future research and development.
Our concluding remarks aim to leverage the synergy between FL and future 6G, while highlighting FL's potential to revolutionize the wireless industry.
arXiv Detail & Related papers (2023-12-07T20:39:57Z)
- A Survey of Federated Unlearning: A Taxonomy, Challenges and Future Directions [71.16718184611673]
The evolution of privacy-preserving Federated Learning (FL) has led to an increasing demand for implementing the right to be forgotten.
The implementation of selective forgetting is particularly challenging in FL due to its decentralized nature.
Federated Unlearning (FU) emerges as a strategic solution to address the increasing need for data privacy.
arXiv Detail & Related papers (2023-10-30T01:34:33Z)
- UFed-GAN: A Secure Federated Learning Framework with Constrained Computation and Unlabeled Data [50.13595312140533]
We propose a novel framework of UFed-GAN: Unsupervised Federated Generative Adversarial Network, which can capture user-side data distribution without local classification training.
Our experimental results demonstrate the strong potential of UFed-GAN in addressing limited computational resources and unlabeled data while preserving privacy.
arXiv Detail & Related papers (2023-08-10T22:52:13Z)
- A Survey on Decentralized Federated Learning [0.709016563801433]
In recent years, federated learning has become a popular paradigm for training distributed, large-scale, and privacy-preserving machine learning (ML) systems.
In a typical FL system, the central server acts only as an orchestrator; it iteratively gathers and aggregates all the local models trained by each client on its private data until convergence.
One of the most critical challenges is to overcome the centralized orchestration of the classical FL client-server architecture.
Decentralized FL solutions have emerged where all FL clients cooperate and communicate without a central server.
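The client-server round described above (server gathers local models and aggregates them) is commonly implemented as weighted averaging of client weights, as in FedAvg. The following is a minimal, self-contained sketch of that aggregation step using plain lists in place of model tensors; it is illustrative, not code from any of the surveyed papers.

```python
# Minimal sketch of the server-side aggregation in a classical FL round:
# weighted averaging of client weight vectors by dataset size (FedAvg-style).
# Plain Python lists stand in for model parameters.

def fedavg(client_weights, client_sizes):
    """Weighted average of per-client weight vectors by dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    aggregated = [0.0] * dim
    for weights, size in zip(client_weights, client_sizes):
        for i in range(dim):
            aggregated[i] += weights[i] * (size / total)
    return aggregated

# Three clients with different dataset sizes contribute local models.
clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 20, 70]
global_model = fedavg(clients, sizes)
# Clients with more data pull the average toward their local weights.
```

Decentralized FL replaces this single aggregation point with peer-to-peer exchange, e.g. gossip averaging among neighbors, which removes the central orchestrator the survey identifies as the critical bottleneck.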
arXiv Detail & Related papers (2023-08-08T22:07:15Z)
- Federated Learning with Privacy-Preserving Ensemble Attention Distillation [63.39442596910485]
Federated Learning (FL) is a machine learning paradigm where many local nodes collaboratively train a central model while keeping the training data decentralized.
We propose a privacy-preserving FL framework leveraging unlabeled public data for one-way offline knowledge distillation.
Our technique uses decentralized and heterogeneous local data like existing FL approaches, but more importantly, it significantly reduces the risk of privacy leakage.
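The one-way distillation idea summarized above can be pictured as follows: each client acts as a teacher that predicts on shared unlabeled public data, and the server averages those soft predictions into a distillation target for the central model. The sketch below shows only the ensemble-averaging step; the function name and the uniform average are illustrative assumptions, not the paper's attention-based method.

```python
# Sketch of ensemble distillation targets: average each teacher's soft
# predictions on shared unlabeled public data. A uniform average is an
# illustrative simplification of the paper's attention-weighted ensemble.

def ensemble_soft_labels(teacher_predictions):
    """Average per-teacher probability vectors for each public sample."""
    n_teachers = len(teacher_predictions)
    n_samples = len(teacher_predictions[0])
    targets = []
    for j in range(n_samples):
        n_classes = len(teacher_predictions[0][j])
        avg = [sum(t[j][k] for t in teacher_predictions) / n_teachers
               for k in range(n_classes)]
        targets.append(avg)
    return targets

# Two clients' soft predictions on one unlabeled public sample.
preds = [
    [[0.8, 0.2]],  # teacher 1
    [[0.6, 0.4]],  # teacher 2
]
targets = ensemble_soft_labels(preds)
# The averaged target is ~[0.7, 0.3]; only predictions, never raw training
# data, leave the clients, which is what limits the privacy leakage.
```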
arXiv Detail & Related papers (2022-10-16T06:44:46Z)
- Federated Learning: Applications, Challenges and Future Scopes [1.3190581566723918]
Federated learning (FL) is a system in which a central aggregator coordinates the efforts of multiple clients to solve machine learning problems.
FL has applications in wireless communication, service recommendation, intelligent medical diagnosis systems, and healthcare.
arXiv Detail & Related papers (2022-05-18T10:47:09Z)
- FedComm: Federated Learning as a Medium for Covert Communication [56.376997104843355]
Federated Learning (FL) is a solution to mitigate the privacy implications related to the adoption of deep learning.
This paper thoroughly investigates the communication capabilities of an FL scheme.
We introduce FedComm, a novel multi-system covert-communication technique.
arXiv Detail & Related papers (2022-01-21T17:05:56Z)
- FedLess: Secure and Scalable Federated Learning Using Serverless Computing [1.141832715860866]
Federated Learning (FL) enables remote clients to learn a shared ML model while keeping the data local.
We present a novel system and framework for serverless FL, called FedLess.
Our system supports multiple commercial and self-hosted FaaS providers and can be deployed in the cloud, on-premise in institutional data centers, and on edge devices.
arXiv Detail & Related papers (2021-11-05T11:14:07Z)
- FedNLP: A Research Platform for Federated Learning in Natural Language Processing [55.01246123092445]
We present the FedNLP, a research platform for federated learning in NLP.
FedNLP supports various popular task formulations in NLP such as text classification, sequence tagging, question answering, seq2seq generation, and language modeling.
Preliminary experiments with FedNLP reveal that there exists a large performance gap between learning on decentralized and centralized datasets.
arXiv Detail & Related papers (2021-04-18T11:04:49Z)
- Privacy Preservation in Federated Learning: An Insightful Survey from the GDPR Perspective [10.901568085406753]
This article surveys state-of-the-art privacy techniques that can be employed in federated learning.
Recent research has demonstrated that keeping data and computation local in FL is not sufficient to guarantee privacy.
This is because the ML model parameters exchanged between parties in an FL system can be exploited in privacy attacks.
arXiv Detail & Related papers (2020-11-10T21:41:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.