A brief history on Homomorphic learning: A privacy-focused approach to
machine learning
- URL: http://arxiv.org/abs/2009.04587v2
- Date: Fri, 11 Sep 2020 02:14:16 GMT
- Title: A brief history on Homomorphic learning: A privacy-focused approach to
machine learning
- Authors: Aadesh Neupane
- Abstract summary: Homomorphic encryption allows running arbitrary operations on encrypted data.
It enables us to run any sophisticated machine learning algorithm without access to the underlying raw data.
It took more than 30 years of collective effort to finally find the answer "yes".
- Score: 2.055949720959582
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Cryptography and data science research grew exponentially with the internet
boom. Legacy encryption techniques force users to make a trade-off between
usability, convenience, and security. Encryption makes valuable data
inaccessible, as it needs to be decrypted each time to perform any operation.
Billions of dollars could be saved, and millions of people could benefit from
cryptography methods that don't compromise between usability, convenience, and
security. Homomorphic encryption is one such paradigm that allows running
arbitrary operations on encrypted data. It enables us to run any sophisticated
machine learning algorithm without access to the underlying raw data. Thus,
homomorphic learning provides the ability to gain insights from sensitive data
that has been neglected due to various governmental and organizational privacy
rules.
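The "compute upon encrypted data" property can be sketched with textbook (unpadded) RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. This is a toy illustration with tiny primes, not the scheme any paper here uses, and it is insecure by design.

```python
# Toy multiplicative homomorphism via textbook RSA (insecure; illustration only).
p, q = 61, 53
n = p * q                # public modulus
phi = (p - 1) * (q - 1)
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 6
# Multiplying ciphertexts multiplies the underlying plaintexts:
c_prod = (enc(m1) * enc(m2)) % n
assert dec(c_prod) == (m1 * m2) % n  # decrypts to 42 without ever exposing m1, m2
```

Fully homomorphic schemes, Gentry's breakthrough discussed in this paper, extend this idea from a single operation to arbitrary circuits over encrypted data.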
In this paper, we trace back the ideas of homomorphic learning formally posed
by Ronald L. Rivest, Len Adleman, and Michael L. Dertouzos as "Can we compute upon encrypted data?"
in their 1978 paper. Then we gradually follow the ideas sprouting in the
brilliant minds of Shafi Goldwasser, Kristin Lauter, Dan Boneh, Tomas Sander,
Donald Beaver, and Craig Gentry to address that vital question. It took more
than 30 years of collective effort to finally find the answer "yes" to that
important question.
Related papers
- Simultaneous Haar Indistinguishability with Applications to Unclonable Cryptography [5.360892674012226]
We present a new approach to unclonable encryption via a reduction to a novel question about nonlocal quantum state discrimination.
Our main technical result is showing that the players cannot distinguish between each player receiving independently-chosen Haar random states versus all players receiving the same Haar random state.
We also show other implications to single-decryptor encryption and leakage-resilient secret sharing.
arXiv Detail & Related papers (2024-05-16T17:30:55Z)
- Lightweight Public Key Encryption in Post-Quantum Computing Era [0.0]
Confidentiality in our digital world is based on the security of cryptographic algorithms.
As quantum computers advance, the protective function of common encryption algorithms is threatened.
Our concept describes the transformation of a classical asymmetric encryption method to a modern complexity class.
arXiv Detail & Related papers (2023-11-24T21:06:42Z)
- Learning in the Dark: Privacy-Preserving Machine Learning using Function Approximation [1.8907108368038215]
Learning in the Dark is a privacy-preserving machine learning model that can classify encrypted images with high accuracy.
It achieves high-accuracy predictions by performing computations directly on encrypted data.
arXiv Detail & Related papers (2023-09-15T06:45:58Z)
- Human-imperceptible, Machine-recognizable Images [76.01951148048603]
A major conflict faced by software engineers is exposed: developing better AI systems while distancing themselves from sensitive training data.
This paper proposes an efficient privacy-preserving learning paradigm in which images are encrypted to become "human-imperceptible, machine-recognizable".
We show that the proposed paradigm can ensure the encrypted images have become human-imperceptible while preserving machine-recognizable information.
arXiv Detail & Related papers (2023-06-06T13:41:37Z)
- RiDDLE: Reversible and Diversified De-identification with Latent Encryptor [57.66174700276893]
This work presents RiDDLE, short for Reversible and Diversified De-identification with Latent Encryptor.
Built upon a pre-learned StyleGAN2 generator, RiDDLE manages to encrypt and decrypt the facial identity within the latent space.
arXiv Detail & Related papers (2023-03-09T11:03:52Z)
- Revocable Cryptography from Learning with Errors [61.470151825577034]
We build on the no-cloning principle of quantum mechanics and design cryptographic schemes with key-revocation capabilities.
We consider schemes where secret keys are represented as quantum states with the guarantee that, once the secret key is successfully revoked from a user, they no longer have the ability to perform the same functionality as before.
arXiv Detail & Related papers (2023-02-28T18:58:11Z)
- Privacy-Preserving Chaotic Extreme Learning Machine with Fully Homomorphic Encryption [5.010425616264462]
We propose a Chaotic Extreme Learning Machine and its encrypted form using Fully Homomorphic Encryption.
Our proposed method has performed either better or similar to the Traditional Extreme Learning Machine on most of the datasets.
arXiv Detail & Related papers (2022-08-04T11:29:52Z)
- THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption [112.02441503951297]
Privacy-preserving inference of transformer models is in demand among cloud service users.
We introduce THE-X, an approximation approach for transformers, which enables privacy-preserving inference of pre-trained models.
arXiv Detail & Related papers (2022-06-01T03:49:18Z)
- NeuraCrypt: Hiding Private Health Data via Random Neural Networks for Public Training [64.54200987493573]
We propose NeuraCrypt, a private encoding scheme based on random deep neural networks.
NeuraCrypt encodes raw patient data using a randomly constructed neural network known only to the data-owner.
We show that NeuraCrypt achieves competitive accuracy to non-private baselines on a variety of x-ray tasks.
arXiv Detail & Related papers (2021-06-04T13:42:21Z)
- Homomorphically Encrypted Linear Contextual Bandit [39.5858373448478]
Contextual bandit is a framework for online learning in sequential decision-making problems.
We introduce a privacy-preserving bandit framework based on asymmetric encryption.
We show that despite the complexity of the setting, it is possible to learn over encrypted data.
arXiv Detail & Related papers (2021-03-17T21:49:21Z)
- Faster Secure Data Mining via Distributed Homomorphic Encryption [108.77460689459247]
Homomorphic Encryption (HE) is receiving increasing attention for its capability to perform computations over encrypted data.
We propose a novel general distributed HE-based data mining framework as one step toward solving the scaling problem.
We verify the efficiency and effectiveness of our new framework by testing over various data mining algorithms and benchmark data-sets.
arXiv Detail & Related papers (2020-06-17T18:14:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.