Secure Machine Learning in the Cloud Using One Way Scrambling by
Deconvolution
- URL: http://arxiv.org/abs/2111.03125v1
- Date: Thu, 4 Nov 2021 19:46:41 GMT
- Title: Secure Machine Learning in the Cloud Using One Way Scrambling by
Deconvolution
- Authors: Yiftach Savransky, Roni Mateless, Gilad Katz
- Abstract summary: Cloud-based machine learning services (CMLS) enable organizations to take advantage of advanced models that are pre-trained on large quantities of data.
Asymmetric encryption requires the data to be decrypted in the cloud, while Homomorphic encryption is often too slow and difficult to implement.
We propose One Way Scrambling by Deconvolution (OWSD), a deconvolution-based scrambling framework that offers the advantages of Homomorphic encryption at a fraction of the computational overhead.
- Score: 2.9692754277987286
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cloud-based machine learning services (CMLS) enable organizations to take
advantage of advanced models that are pre-trained on large quantities of data.
The main shortcoming of using these services, however, is the difficulty of
keeping the transmitted data private and secure. Asymmetric encryption requires
the data to be decrypted in the cloud, while Homomorphic encryption is often
too slow and difficult to implement. We propose One Way Scrambling by
Deconvolution (OWSD), a deconvolution-based scrambling framework that offers
the advantages of Homomorphic encryption at a fraction of the computational
overhead. Extensive evaluation on multiple image datasets demonstrates OWSD's
ability to achieve near-perfect classification performance when the output
vector of the CMLS is sufficiently large. Additionally, we provide empirical
analysis of the robustness of our approach.
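The abstract gives no code, but the core idea, scrambling the client's input with a deconvolution (transposed convolution) whose random weights never leave the client before it is sent to a cloud classifier, can be sketched as below. This is a minimal PyTorch illustration: the layer shapes, kernel size, the names `OWSDScrambler`, `cloud_model`, and `internal_head`, and the use of a small client-side head on top of the CMLS output vector are assumptions for demonstration, not details taken from the paper.

```python
import torch
import torch.nn as nn

class OWSDScrambler(nn.Module):
    """One-way scrambler: a randomly initialized transposed convolution.

    The weights are generated on the client, never shared, and never
    trained, so the cloud only ever sees the scrambled image.
    (Kernel size / channel counts are illustrative, not from the paper.)
    """
    def __init__(self, channels: int = 3, kernel_size: int = 3):
        super().__init__()
        self.deconv = nn.ConvTranspose2d(
            channels, channels, kernel_size=kernel_size, stride=1, padding=1
        )
        # Freeze the random weights: the scrambling stays fixed per client.
        for p in self.parameters():
            p.requires_grad_(False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.deconv(x)


# --- Client side --------------------------------------------------------
scrambler = OWSDScrambler()
image = torch.rand(1, 3, 224, 224)           # private input
scrambled = scrambler(image)                 # only this leaves the client

# --- Cloud side (untrusted) ---------------------------------------------
# `cloud_model` stands in for the pre-trained CMLS; it returns a large
# output vector (logits), which the abstract says OWSD relies on.
cloud_model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 1000))
logits = cloud_model(scrambled)

# --- Client side again: one plausible use of the CMLS output vector is to
# feed it to a small classifier trained by the client on scrambled data.
internal_head = nn.Linear(1000, 10)
prediction = internal_head(logits).argmax(dim=1)
```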
Related papers
- Cryptanalysis via Machine Learning Based Information Theoretic Metrics [58.96805474751668]
We propose two novel applications of machine learning (ML) algorithms to perform cryptanalysis on any cryptosystem.
These algorithms can be readily applied in an audit setting to evaluate the robustness of a cryptosystem.
We show that our classification model correctly identifies the encryption schemes that are not IND-CPA secure, such as DES, RSA, and AES ECB, with high accuracy.
arXiv Detail & Related papers (2025-01-25T04:53:36Z)
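The entry above reports that an ML classifier can flag schemes that are not IND-CPA secure, such as AES in ECB mode, in an audit setting. A minimal sketch of that kind of audit, distinguishing AES-ECB from AES-CBC ciphertexts of structured plaintexts with a logistic-regression model, is shown below; PyCryptodome, scikit-learn, and the simple repeated-block feature are illustrative choices, not the method used in the paper.

```python
import os
from Crypto.Cipher import AES                       # PyCryptodome
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def encrypt(plaintext: bytes, mode: str) -> bytes:
    key = os.urandom(16)
    if mode == "ecb":
        return AES.new(key, AES.MODE_ECB).encrypt(plaintext)
    return AES.new(key, AES.MODE_CBC, os.urandom(16)).encrypt(plaintext)

def features(ciphertext: bytes) -> list:
    # Fraction of repeated 16-byte blocks: ECB is deterministic per block,
    # so structured plaintexts leak duplicate ciphertext blocks.
    blocks = [ciphertext[i:i + 16] for i in range(0, len(ciphertext), 16)]
    return [1.0 - len(set(blocks)) / len(blocks)]

# Highly structured plaintexts: one random byte repeated for 64 blocks.
X, y = [], []
for _ in range(500):
    plaintext = os.urandom(1) * (16 * 64)
    for label, mode in enumerate(["cbc", "ecb"]):   # 0 = CBC, 1 = ECB
        X.append(features(encrypt(plaintext, mode)))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
clf = LogisticRegression().fit(X_train, y_train)
print("audit accuracy:", clf.score(X_test, y_test))  # ~1.0: ECB is easily flagged
```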
- A Selective Homomorphic Encryption Approach for Faster Privacy-Preserving Federated Learning [2.942616054218564]
Federated learning is a machine learning method that supports training models on decentralized devices or servers.
We propose a new approach that employs selective encryption, homomorphic encryption, differential privacy, and bit-wise scrambling to minimize data leakage.
Our approach is up to 90% faster than applying fully homomorphic encryption on the model weights.
arXiv Detail & Related papers (2025-01-22T14:37:44Z)
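The selective approach above encrypts only the most sensitive part of a model update and cheaply obfuscates the rest. The sketch below illustrates that split on a flat weight vector, using a magnitude-based selection rule, an XOR bit-mask for the scrambled portion, and a placeholder `he_encrypt` stub standing in for a real homomorphic scheme; all of these, and the omission of the differential-privacy step, are assumptions for illustration rather than the paper's actual construction.

```python
import numpy as np

def he_encrypt(values: np.ndarray):
    """Placeholder for a real HE library call (e.g. CKKS); hypothetical."""
    return ("HE_CIPHERTEXT", values.tobytes())

def bitwise_scramble(values: np.ndarray, seed: int) -> bytes:
    # XOR the raw bytes with a keyed pseudo-random mask: cheap and reversible
    # by anyone holding the seed, unlike the homomorphically encrypted part.
    raw = values.tobytes()
    mask = np.random.default_rng(seed).integers(0, 256, len(raw), dtype=np.uint8)
    return bytes(b ^ m for b, m in zip(raw, mask))

def protect_update(weights: np.ndarray, fraction: float = 0.1, seed: int = 7):
    """Encrypt the top `fraction` of weights by magnitude, scramble the rest."""
    k = max(1, int(fraction * weights.size))
    top_idx = np.argsort(np.abs(weights))[-k:]              # "sensitive" weights
    rest_idx = np.setdiff1d(np.arange(weights.size), top_idx)
    return {
        "encrypted": he_encrypt(weights[top_idx]),          # expensive, strong
        "scrambled": bitwise_scramble(weights[rest_idx], seed),  # cheap, weak
        "top_idx": top_idx, "rest_idx": rest_idx,
    }

update = protect_update(np.random.randn(1_000).astype(np.float32))
print(len(update["top_idx"]), "weights homomorphically encrypted")
```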
- Secure Outsourced Decryption for FHE-based Privacy-preserving Cloud Computing [3.125865379632205]
Homomorphic encryption (HE) is one solution for safeguarding data privacy, enabling encrypted data to be processed securely in the cloud.
We propose an outsourced decryption protocol for the prevailing RLWE-based fully homomorphic encryption schemes.
Our experiments demonstrate that the proposed protocol achieves up to a 67% acceleration in the client's local decryption, accompanied by a 50% reduction in space usage.
arXiv Detail & Related papers (2024-06-28T14:51:36Z)
- HETAL: Efficient Privacy-preserving Transfer Learning with Homomorphic Encryption [4.164336621664897]
HETAL is an efficient Homomorphic Encryption based Transfer Learning algorithm.
We propose an encrypted matrix multiplication algorithm, which is 1.8 to 323 times faster than prior methods.
Experiments show total training times of 567-3442 seconds, all under an hour.
arXiv Detail & Related papers (2024-03-21T03:47:26Z)
- Robust Representation Learning for Privacy-Preserving Machine Learning: A Multi-Objective Autoencoder Approach [0.9831489366502302]
We propose a robust representation learning framework for privacy-preserving machine learning (ppML).
Our method centers on training autoencoders in a multi-objective manner and then concatenating the latent and learned features from the encoding part as the encoded form of our data.
With our proposed framework, we can share our data and use third-party tools without the threat of revealing its original form.
arXiv Detail & Related papers (2023-09-08T16:41:25Z)
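The representation-learning entry above shares data only in an encoded form: the latent code and the learned intermediate features of an autoencoder, concatenated. Below is a minimal PyTorch sketch of that encoding step; the network sizes, the single reconstruction loss (the paper trains with multiple objectives), and the exact feature split are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, in_dim: int = 64, hidden: int = 32, latent: int = 16):
        super().__init__()
        self.hidden_layer = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.to_latent = nn.Linear(hidden, latent)
        self.decoder = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(),
                                     nn.Linear(hidden, in_dim))

    def forward(self, x):
        h = self.hidden_layer(x)
        z = self.to_latent(h)
        return self.decoder(z), h, z

model = Autoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
data = torch.rand(256, 64)                         # private tabular data

# Simplified training loop: only reconstruction here; the paper optimizes
# several objectives (e.g. utility and privacy terms) jointly.
for _ in range(100):
    recon, _, _ = model(data)
    loss = nn.functional.mse_loss(recon, data)
    opt.zero_grad(); loss.backward(); opt.step()

# Encoded form that can be shared with third parties: concatenate the
# learned intermediate features with the latent code, as in the summary.
with torch.no_grad():
    _, learned, latent = model(data)
    shared = torch.cat([learned, latent], dim=1)   # shape: (256, 48)
print(shared.shape)
```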
- PEOPL: Characterizing Privately Encoded Open Datasets with Public Labels [59.66777287810985]
We introduce information-theoretic scores for privacy and utility, which quantify the average performance of an unfaithful user.
We then theoretically characterize primitives in building families of encoding schemes that motivate the use of random deep neural networks.
arXiv Detail & Related papers (2023-03-31T18:03:53Z)
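The PEOPL entry above motivates encoding private data with a randomly initialized deep network whose weights stay secret, then publishing the encoded samples alongside public labels. A small sketch of that encoding step follows; the MLP architecture and sizes are assumptions for illustration, not the encoding families analyzed in the paper.

```python
import torch
import torch.nn as nn

def random_encoder(in_dim: int = 784, out_dim: int = 128, seed: int = 0) -> nn.Module:
    """A frozen, randomly initialized network acting as a private encoding key."""
    torch.manual_seed(seed)                      # the seed/weights stay private
    net = nn.Sequential(
        nn.Linear(in_dim, 256), nn.ReLU(),
        nn.Linear(256, out_dim),
    )
    for p in net.parameters():
        p.requires_grad_(False)
    return net

encoder = random_encoder()
private_images = torch.rand(32, 784)             # e.g. flattened image data
public_labels = torch.randint(0, 10, (32,))      # labels are released in the clear

with torch.no_grad():
    encoded = encoder(private_images)            # only this is published

# A faithful user trains on (encoded, public_labels) without ever seeing the
# raw inputs; an unfaithful user must invert the unknown random network.
print(encoded.shape)                             # torch.Size([32, 128])
```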
- THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption [112.02441503951297]
Privacy-preserving inference of transformer models is in demand among cloud service users.
We introduce THE-X, an approximation approach for transformers, which enables privacy-preserving inference of pre-trained models.
arXiv Detail & Related papers (2022-06-01T03:49:18Z)
- Reinforcement Learning on Encrypted Data [58.39270571778521]
We present a preliminary, experimental study of how a DQN agent trained on encrypted states performs in environments with discrete and continuous state spaces.
Our results highlight that the agent is still capable of learning in small state spaces even in the presence of non-deterministic encryption, but performance collapses in more complex environments.
arXiv Detail & Related papers (2021-09-16T21:59:37Z)
- Efficient CNN Building Blocks for Encrypted Data [6.955451042536852]
Fully Homomorphic Encryption (FHE) is a promising technique for enabling machine learning and inference on encrypted data.
We show that operational parameters of the chosen FHE scheme have a major impact on the design of the machine learning model.
Our empirical study shows that the choice of these design parameters results in significant trade-offs between accuracy, security level, and computational time.
arXiv Detail & Related papers (2021-01-30T21:47:23Z)
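The CNN-building-blocks entry above notes that the FHE scheme's operational parameters drive the accuracy/security/latency trade-off. The snippet below shows where those parameters appear when a CKKS context is created with the TenSEAL library; TenSEAL and the specific parameter values are illustrative choices, not the setup used in the paper.

```python
import tenseal as ts

# Parameters of the CKKS scheme: a larger poly_modulus_degree and a longer
# coefficient-modulus chain allow deeper circuits and higher precision, but
# increase ciphertext size and computation time; smaller values run faster
# but limit multiplicative depth and can hurt accuracy or security.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,               # security / capacity knob
    coeff_mod_bit_sizes=[60, 40, 40, 60],   # multiplicative depth budget
)
context.global_scale = 2 ** 40              # encoding precision
context.generate_galois_keys()              # needed for rotations (e.g. conv/pooling)

# A single encrypted dot product, the kind of primitive a CNN block builds on.
x = ts.ckks_vector(context, [0.5, 1.0, -2.0])
w = [0.1, 0.2, 0.3]
y = x.dot(w)                                # ciphertext-plaintext dot product
print(y.decrypt())                          # approximately 0.5*0.1 + 1.0*0.2 - 2.0*0.3
```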
- Faster Secure Data Mining via Distributed Homomorphic Encryption [108.77460689459247]
Homomorphic Encryption (HE) has recently received increasing attention for its ability to perform computations in the encrypted domain.
We propose a novel general distributed HE-based data mining framework as a step towards solving the scaling problem.
We verify the efficiency and effectiveness of our new framework by testing it on various data mining algorithms and benchmark datasets.
arXiv Detail & Related papers (2020-06-17T18:14:30Z)
- A Privacy-Preserving Distributed Architecture for Deep-Learning-as-a-Service [68.84245063902908]
This paper introduces a novel distributed architecture for deep-learning-as-a-service.
It preserves users' sensitive data while providing cloud-based machine learning and deep learning services.
arXiv Detail & Related papers (2020-03-30T15:12:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.