Effect of Homomorphic Encryption on the Performance of Training
Federated Learning Generative Adversarial Networks
- URL: http://arxiv.org/abs/2207.00263v1
- Date: Fri, 1 Jul 2022 08:35:10 GMT
- Title: Effect of Homomorphic Encryption on the Performance of Training
Federated Learning Generative Adversarial Networks
- Authors: Ignjat Pejic, Rui Wang, and Kaitai Liang
- Abstract summary: A Generative Adversarial Network (GAN) is a deep-learning generative model in the field of Machine Learning (ML).
In certain fields, such as medicine, the training data may be hospital patient records that are stored across different hospitals.
This paper will focus on the performance loss of training an FL-GAN with three different types of Homomorphic Encryption.
- Score: 10.030986278376567
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A Generative Adversarial Network (GAN) is a deep-learning generative model in
the field of Machine Learning (ML) that involves training two Neural Networks
(NN) using a sizable data set. In certain fields, such as medicine, the
training data may be hospital patient records that are stored across different
hospitals. The classic centralized approach would involve sending the data to a
centralized server where the model would be trained. However, that would
involve breaching the privacy and confidentiality of the patients and their
data, which would be unacceptable. Therefore, Federated Learning (FL), an ML
technique that trains ML models in a distributed setting without data ever
leaving the host device, would be a better alternative to the centralized
option. In this ML technique, only parameters and certain metadata would be
communicated. In spite of that, there still exist attacks that can infer user
data using the parameters and metadata. A fully privacy-preserving solution
involves homomorphically encrypting (HE) the data communicated. This paper will
focus on the performance loss of training an FL-GAN with three different types
of Homomorphic Encryption: Partial Homomorphic Encryption (PHE), Somewhat
Homomorphic Encryption (SHE), and Fully Homomorphic Encryption (FHE). We will
also test the performance loss of Multi-Party Computations (MPC), as it has
homomorphic properties. The performances will be compared to the performance of
training an FL-GAN without encryption as well. Our experiments show that the
more complex the encryption method, the longer training takes, and the extra
time required for HE is quite significant in comparison to the base case of FL.
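The property that makes even Partial HE useful for FL is additive homomorphism: a server can sum encrypted model updates without ever decrypting them. The following toy Python sketch of a Paillier-style scheme (not from the paper; the small primes are deliberately insecure and chosen purely for illustration) shows encrypted addition:

```python
import math
import random

def keygen(p=10007, q=10009):
    """Toy Paillier key generation; real deployments use >= 2048-bit moduli."""
    n = p * q
    n2 = n * n
    g = n + 1                      # standard choice of generator
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:     # r must be a unit mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

def add_cipher(pub, c1, c2):
    """Homomorphic addition: multiplying ciphertexts adds plaintexts."""
    n, _ = pub
    return (c1 * c2) % (n * n)

if __name__ == "__main__":
    pub, priv = keygen()
    # Each "client" encrypts its update; the "server" aggregates blindly.
    updates = [10, 20, 30]
    agg = encrypt(pub, 0)
    for u in updates:
        agg = add_cipher(pub, agg, encrypt(pub, u))
    print(decrypt(priv, agg))      # sum of updates, recovered by key holder
```

Because aggregation is just modular multiplication of ciphertexts, the server learns nothing about individual client updates; only the key holder can decrypt the sum. The repeated big-integer exponentiations are also a hint at where the training-time overhead measured in this paper comes from.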
Related papers
- FedGS: Federated Gradient Scaling for Heterogeneous Medical Image Segmentation [0.4499833362998489]
We propose FedGS, a novel FL aggregation method, to improve segmentation performance on small, under-represented targets.
FedGS demonstrates superior performance over FedAvg, particularly for small lesions, across PolypGen and LiTS datasets.
arXiv Detail & Related papers (2024-08-21T15:26:21Z)
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Robust Representation Learning for Privacy-Preserving Machine Learning: A Multi-Objective Autoencoder Approach [0.9831489366502302]
We propose a robust representation learning framework for privacy-preserving machine learning (ppML).
Our method centers on training autoencoders in a multi-objective manner and then concatenating the latent and learned features from the encoding part as the encoded form of our data.
With our proposed framework, we can share our data and use third party tools without being under the threat of revealing its original form.
arXiv Detail & Related papers (2023-09-08T16:41:25Z)
- PEOPL: Characterizing Privately Encoded Open Datasets with Public Labels [59.66777287810985]
We introduce information-theoretic scores for privacy and utility, which quantify the average performance of an unfaithful user.
We then theoretically characterize primitives in building families of encoding schemes that motivate the use of random deep neural networks.
arXiv Detail & Related papers (2023-03-31T18:03:53Z)
- Federated Nearest Neighbor Machine Translation [66.8765098651988]
In this paper, we propose a novel federated nearest neighbor (FedNN) machine translation framework.
FedNN leverages one-round memorization-based interaction to share knowledge across different clients.
Experiments show that FedNN significantly reduces computational and communication costs compared with FedAvg.
arXiv Detail & Related papers (2023-02-23T18:04:07Z)
- HE-MAN -- Homomorphically Encrypted MAchine learning with oNnx models [0.23624125155742057]
Fully homomorphic encryption (FHE) is a promising technique that enables individuals to use ML services without giving up privacy.
We introduce HE-MAN, an open-source machine learning toolset for privacy preserving inference with ONNX models and homomorphically encrypted data.
Compared to prior work, HE-MAN supports a broad range of ML models in ONNX format out of the box without sacrificing accuracy.
arXiv Detail & Related papers (2023-02-16T12:37:14Z)
- Federated Split GANs [12.007429155505767]
We propose an alternative approach that trains ML models on users' devices themselves.
We focus on GANs (generative adversarial networks) and leverage their inherent privacy-preserving attribute.
Our system preserves data privacy, keeps training time short, and yields the same accuracy as model training on unconstrained devices.
arXiv Detail & Related papers (2022-07-04T23:53:47Z) - THE-X: Privacy-Preserving Transformer Inference with Homomorphic
Encryption [112.02441503951297]
Privacy-preserving inference of transformer models is in demand among cloud service users.
We introduce THE-X, an approximation approach for transformers, which enables privacy-preserving inference of pre-trained models.
arXiv Detail & Related papers (2022-06-01T03:49:18Z)
- Homomorphic Encryption and Federated Learning based Privacy-Preserving CNN Training: COVID-19 Detection Use-Case [0.41998444721319217]
This paper proposes a privacy-preserving federated learning algorithm for medical data using homomorphic encryption.
The proposed algorithm uses a secure multi-party computation protocol to protect the deep learning model from adversaries.
arXiv Detail & Related papers (2022-04-16T08:38:35Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- Category-Learning with Context-Augmented Autoencoder [63.05016513788047]
Finding an interpretable non-redundant representation of real-world data is one of the key problems in Machine Learning.
We propose a novel method of using data augmentations when training autoencoders.
We train a Variational Autoencoder in such a way that the transformation outcome is predictable by an auxiliary network.
arXiv Detail & Related papers (2020-10-10T14:04:44Z)