SoK: Privacy Preserving Machine Learning using Functional Encryption:
Opportunities and Challenges
- URL: http://arxiv.org/abs/2204.05136v1
- Date: Mon, 11 Apr 2022 14:15:36 GMT
- Authors: Prajwal Panzade and Daniel Takabi
- Abstract summary: We focus on Inner-product-FE and Quadratic-FE-based machine learning models for the privacy-preserving machine learning (PPML) applications.
To the best of our knowledge, this is the first work to systematize FE-based PPML approaches.
- Score: 1.2183405753834562
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the advent of functional encryption, new possibilities for computation
on encrypted data have arisen. Functional Encryption enables data owners to
grant third-party access to perform specified computations without disclosing
their inputs. It also provides computation results in plain, unlike Fully
Homomorphic Encryption. The ubiquitousness of machine learning has led to the
collection of massive private data in the cloud computing environment. This
raises potential privacy issues and the need for more private and secure
computing solutions. Numerous efforts have been made in privacy-preserving
machine learning (PPML) to address security and privacy concerns. There are
approaches based on fully homomorphic encryption (FHE), secure multiparty
computation (SMC), and, more recently, functional encryption (FE). However,
FE-based PPML is still in its infancy and has not yet gotten much attention
compared to FHE-based PPML approaches. In this paper, we provide a
systematization of PPML works based on FE summarizing state-of-the-art in the
literature. We focus on Inner-product-FE and Quadratic-FE-based machine
learning models for the PPML applications. We analyze the performance and
usability of the available FE libraries and their applications to PPML. We also
discuss potential directions for FE-based PPML approaches. To the best of our
knowledge, this is the first work to systematize FE-based PPML approaches.
Related papers
- Wildest Dreams: Reproducible Research in Privacy-preserving Neural Network Training [2.853180143237022]
This work focuses on the ML model's training phase, where maintaining user data privacy is of utmost importance.
We provide a solid theoretical background that eases the understanding of current approaches.
We reproduce results for some of the papers and examine at what level existing works in the field provide support for open science.
arXiv Detail & Related papers (2024-03-06T10:25:36Z)
- GuardML: Efficient Privacy-Preserving Machine Learning Services Through Hybrid Homomorphic Encryption [2.611778281107039]
Privacy-Preserving Machine Learning (PPML) methods have been introduced to safeguard the privacy and security of Machine Learning models.
A modern cryptographic scheme, Hybrid Homomorphic Encryption (HHE), has recently emerged.
We develop and evaluate an HHE-based PPML application for classifying heart disease based on sensitive ECG data.
arXiv Detail & Related papers (2024-01-26T13:12:52Z)
- HE-MAN -- Homomorphically Encrypted MAchine learning with oNnx models [0.23624125155742057]
Fully homomorphic encryption (FHE) is a promising technique that enables individuals to use ML services without giving up privacy.
We introduce HE-MAN, an open-source machine learning toolset for privacy preserving inference with ONNX models and homomorphically encrypted data.
Compared to prior work, HE-MAN supports a broad range of ML models in ONNX format out of the box without sacrificing accuracy.
arXiv Detail & Related papers (2023-02-16T12:37:14Z)
- Differentially Private Deep Q-Learning for Pattern Privacy Preservation in MEC Offloading [76.0572817182483]
Attackers may eavesdrop on the offloading decisions to infer the edge server's (ES's) queue information and users' usage patterns.
We propose an offloading strategy which jointly minimizes the latency, ES's energy consumption, and task dropping rate, while preserving pattern privacy (PP).
We develop a Differential Privacy Deep Q-learning based Offloading (DP-DQO) algorithm to solve this problem while addressing the PP issue by injecting noise into the generated offloading decisions.
arXiv Detail & Related papers (2023-02-09T12:50:18Z)
- Is Vertical Logistic Regression Privacy-Preserving? A Comprehensive Privacy Analysis and Beyond [57.10914865054868]
We consider vertical logistic regression (VLR) trained with mini-batch gradient descent.
We provide a comprehensive and rigorous privacy analysis of VLR in a class of open-source Federated Learning frameworks.
arXiv Detail & Related papers (2022-07-19T05:47:30Z)
- THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption [112.02441503951297]
Privacy-preserving inference of transformer models is on the demand of cloud service users.
We introduce THE-X, an approximation approach for transformers, which enables privacy-preserving inference of pre-trained models.
arXiv Detail & Related papers (2022-06-01T03:49:18Z)
- Reinforcement Learning on Encrypted Data [58.39270571778521]
We present a preliminary, experimental study of how a DQN agent trained on encrypted states performs in environments with discrete and continuous state spaces.
Our results highlight that the agent is still capable of learning in small state spaces even in the presence of non-deterministic encryption, but performance collapses in more complex environments.
arXiv Detail & Related papers (2021-09-16T21:59:37Z)
- Privacy-Preserving Machine Learning: Methods, Challenges and Directions [4.711430413139393]
Well-designed privacy-preserving machine learning (PPML) solutions have attracted increasing research interest from academia and industry.
This paper systematically reviews existing privacy-preserving approaches and proposes a PGU model to guide evaluation for various PPML solutions.
arXiv Detail & Related papers (2021-08-10T02:58:31Z)
- Privacy-Preserving XGBoost Inference [0.6345523830122165]
A major barrier to adoption is the sensitive nature of predictive queries.
One central goal of privacy-preserving machine learning (PPML) is to enable users to submit encrypted queries to a remote ML service.
We propose a privacy-preserving XGBoost prediction algorithm, which we have implemented and evaluated empirically on AWS SageMaker.
arXiv Detail & Related papers (2020-11-09T21:46:07Z)
- Machine Learning Force Fields [54.48599172620472]
Machine Learning (ML) has enabled numerous advances in computational chemistry.
One of the most promising applications is the construction of ML-based force fields (FFs).
This review gives an overview of applications of ML-FFs and the chemical insights that can be obtained from them.
arXiv Detail & Related papers (2020-10-14T13:14:14Z)
- CryptoSPN: Privacy-preserving Sum-Product Network Inference [84.88362774693914]
We present a framework for privacy-preserving inference of sum-product networks (SPNs).
CryptoSPN achieves highly efficient and accurate inference in the order of seconds for medium-sized SPNs.
arXiv Detail & Related papers (2020-02-03T14:49:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.