Federated Learning with Quantum Computing and Fully Homomorphic Encryption: A Novel Computing Paradigm Shift in Privacy-Preserving ML
- URL: http://arxiv.org/abs/2409.11430v3
- Date: Sat, 12 Oct 2024 10:51:52 GMT
- Title: Federated Learning with Quantum Computing and Fully Homomorphic Encryption: A Novel Computing Paradigm Shift in Privacy-Preserving ML
- Authors: Siddhant Dutta, Pavana P Karanth, Pedro Maciel Xavier, Iago Leal de Freitas, Nouhaila Innan, Sadok Ben Yahia, Muhammad Shafique, David E. Bernal Neira
- Abstract summary: Federated Learning is a privacy-preserving alternative to conventional methods, allowing multiple learning clients to share model knowledge without disclosing their private data.
This work applies the Fully Homomorphic Encryption scheme to a Federated Learning Neural Network architecture that integrates both classical and quantum layers.
- Score: 4.92218040320554
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The widespread deployment of products powered by machine learning models is raising concerns around data privacy and information security worldwide. To address this issue, Federated Learning was first proposed as a privacy-preserving alternative to conventional methods, allowing multiple learning clients to share model knowledge without disclosing private data. A complementary approach known as Fully Homomorphic Encryption (FHE) is a quantum-safe cryptographic system that enables operations to be performed on encrypted weights. However, implementing such mechanisms in practice often comes with significant computational overhead and can introduce potential security threats. Novel computing paradigms, such as analog, quantum, and specialized digital hardware, present opportunities for implementing privacy-preserving machine learning systems while enhancing security and mitigating performance loss. This work instantiates these ideas by applying the FHE scheme to a Federated Learning Neural Network architecture that integrates both classical and quantum layers.
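As a rough illustration of the encrypted federated averaging the abstract describes, here is a minimal sketch using the TenSEAL library's CKKS scheme. TenSEAL, the parameter choices, and the toy weight vectors are assumptions for demonstration only; they are not the paper's actual implementation or key-management setup (in practice the aggregation server would only hold a public copy of the context, with the secret key removed).

```python
# pip install tenseal  -- illustrative only; not the paper's actual code or parameters.
import tenseal as ts

# CKKS context shared by the federation (clients keep the secret key;
# the server would normally receive a copy with the secret key stripped).
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

# Each client encrypts its flattened model-weight vector before upload.
client_weights = [
    [0.10, -0.20, 0.30],  # client 1
    [0.12, -0.18, 0.28],  # client 2
    [0.08, -0.22, 0.32],  # client 3
]
encrypted_updates = [ts.ckks_vector(context, w) for w in client_weights]

# The server averages the updates while they remain encrypted.
aggregate = encrypted_updates[0]
for update in encrypted_updates[1:]:
    aggregate = aggregate + update
aggregate = aggregate * (1.0 / len(encrypted_updates))

# Only the key holders can decrypt the averaged global model.
print(aggregate.decrypt())  # roughly [0.10, -0.20, 0.30]
```

The abstract also mentions a neural network that integrates classical and quantum layers. The listing does not specify that architecture, so the following is only a generic hybrid-layer sketch built with PennyLane's TorchLayer; the qubit count, embedding, and ansatz are arbitrary illustrative choices.

```python
# pip install pennylane torch  -- a generic hybrid layer, not the paper's architecture.
import pennylane as qml
import torch

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def quantum_circuit(inputs, weights):
    # Encode classical activations as qubit rotations, then apply trainable entangling layers.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

hybrid_model = torch.nn.Sequential(
    torch.nn.Linear(8, n_qubits),                                      # classical layer
    qml.qnn.TorchLayer(quantum_circuit, {"weights": (2, n_qubits)}),   # quantum layer
    torch.nn.Linear(n_qubits, 2),                                      # classical readout
)

print(hybrid_model(torch.randn(5, 8)).shape)  # torch.Size([5, 2])
```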
Related papers
- Enhancing Quantum Security over Federated Learning via Post-Quantum Cryptography [38.77135346831741]
Federated learning (FL) has become one of the standard approaches for deploying machine learning models on edge devices.
Current digital signature algorithms can protect the communicated model updates, but they do not ensure quantum security in the era of large-scale quantum computing.
In this work, we empirically investigate the impact of three NIST-standardized PQC digital-signature algorithms within the FL procedure.
arXiv Detail & Related papers (2024-09-06T22:02:08Z)
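The entry above studies NIST-standardized post-quantum signatures for protecting FL model updates. As a hedged illustration of signing and verifying a serialized update, the sketch below assumes the liboqs-python bindings (the `oqs` package) with the Dilithium3 mechanism available in the underlying liboqs build; the algorithms, serialization, and FL integration used in the paper itself are not reproduced here.

```python
# pip install liboqs-python  (assumes liboqs was built with the Dilithium3 signature scheme)
import oqs

model_update = b"serialized-client-model-weights"  # placeholder for a client's update

# Client side: generate a post-quantum keypair and sign the update before sending it.
with oqs.Signature("Dilithium3") as client:
    public_key = client.generate_keypair()
    signature = client.sign(model_update)

# Server side: verify the signature before the update enters aggregation.
with oqs.Signature("Dilithium3") as verifier:
    assert verifier.verify(model_update, signature, public_key)
```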
- Federated Learning is Better with Non-Homomorphic Encryption [1.4110007887109783]
Federated Learning (FL) offers a paradigm that empowers distributed AI model training without collecting raw data.
One popular methodology employs Homomorphic Encryption (HE).
We propose an innovative framework that synergizes permutation-based compressors with Classical Cryptography.
arXiv Detail & Related papers (2023-12-04T17:37:41Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- Privacy-Preserving Chaotic Extreme Learning Machine with Fully Homomorphic Encryption [5.010425616264462]
We propose a Chaotic Extreme Learning Machine and its encrypted form using Fully Homomorphic Encryption.
Our proposed method performs comparably to or better than the traditional Extreme Learning Machine on most of the datasets.
arXiv Detail & Related papers (2022-08-04T11:29:52Z)
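For readers unfamiliar with the baseline behind the entry above, an extreme learning machine draws its hidden-layer weights at random, leaves them untrained, and solves only the output weights in closed form. The NumPy sketch below shows that plain, unencrypted idea; it is not the paper's chaotic or FHE-encrypted variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 200 samples, 5 features.
X = rng.normal(size=(200, 5))
y = np.sin(X.sum(axis=1, keepdims=True))

# Extreme Learning Machine: a random, untrained hidden layer...
n_hidden = 64
W = rng.normal(size=(5, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)

# ...and output weights solved in closed form via the pseudo-inverse.
beta = np.linalg.pinv(H) @ y

# Prediction reuses the same fixed random hidden layer.
y_hat = np.tanh(X @ W + b) @ beta
print(np.mean((y - y_hat) ** 2))  # small training error on the toy data
```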
- SoK: Privacy-preserving Deep Learning with Homomorphic Encryption [2.9069679115858755]
With homomorphic encryption (HE), computations can be performed on encrypted data without revealing its content.
We take an in-depth look at approaches that combine neural networks with HE for privacy preservation.
We find numerous challenges to HE-based privacy-preserving deep learning, such as computational overhead, usability, and limitations posed by the encryption schemes.
arXiv Detail & Related papers (2021-12-23T22:03:27Z)
- RoFL: Attestable Robustness for Secure Federated Learning [59.63865074749391]
Federated Learning allows a large number of clients to train a joint model without the need to share their private data.
To ensure the confidentiality of the client updates, Federated Learning systems employ secure aggregation.
We present RoFL, a secure Federated Learning system that improves robustness against malicious clients.
arXiv Detail & Related papers (2021-07-07T15:42:49Z)
- TenSEAL: A Library for Encrypted Tensor Operations Using Homomorphic Encryption [0.0]
We present TenSEAL, an open-source library for Privacy-Preserving Machine Learning using Homomorphic Encryption.
We show that an encrypted convolutional neural network can be evaluated in less than a second, using less than half a megabyte of communication.
arXiv Detail & Related papers (2021-04-07T14:32:38Z)
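To give a feel for the kind of encrypted evaluation TenSEAL supports (the encrypted CNN benchmarked above needs additional packing and convolution machinery not shown here), this sketch evaluates a single linear layer on an encrypted input vector; the parameters and values are illustrative assumptions.

```python
# pip install tenseal  -- illustrative only; not the benchmark setup from the paper.
import tenseal as ts

context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()  # rotations are needed for the encrypted dot product

# Plaintext linear layer held by the server: y = x . w + b
weights = [0.5, -1.0, 0.25, 2.0]
bias = 0.1

# The client encrypts its input features and sends only the ciphertext.
enc_x = ts.ckks_vector(context, [1.0, 2.0, 3.0, 4.0])

# The server computes on the ciphertext without ever seeing the data.
enc_y = enc_x.dot(weights) + bias

# Only the key holder can read the result.
print(enc_y.decrypt())  # approximately [7.35]
```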
- Dos and Don'ts of Machine Learning in Computer Security [74.1816306998445]
Despite great potential, machine learning in security is prone to subtle pitfalls that undermine its performance.
We identify common pitfalls in the design, implementation, and evaluation of learning-based security systems.
We propose actionable recommendations to support researchers in avoiding or mitigating the pitfalls where possible.
arXiv Detail & Related papers (2020-10-19T13:09:31Z)
- A Privacy-Preserving Distributed Architecture for Deep-Learning-as-a-Service [68.84245063902908]
This paper introduces a novel distributed architecture for deep-learning-as-a-service.
It preserves users' sensitive data while providing cloud-based machine learning and deep learning services.
arXiv Detail & Related papers (2020-03-30T15:12:03Z)
- CryptoSPN: Privacy-preserving Sum-Product Network Inference [84.88362774693914]
We present CryptoSPN, a framework for privacy-preserving inference of sum-product networks (SPNs).
CryptoSPN achieves highly efficient and accurate inference on the order of seconds for medium-sized SPNs.
arXiv Detail & Related papers (2020-02-03T14:49:18Z)