Additively Homomorphical Encryption based Deep Neural Network for
Asymmetrically Collaborative Machine Learning
- URL: http://arxiv.org/abs/2007.06849v1
- Date: Tue, 14 Jul 2020 06:43:25 GMT
- Title: Additively Homomorphical Encryption based Deep Neural Network for
Asymmetrically Collaborative Machine Learning
- Authors: Yifei Zhang and Hao Zhu
- Abstract summary: Centralized machine learning creates a constraint that limits further applications in the finance sector.
We propose a new practical scheme of collaborative machine learning in which one party owns the data, while the other party owns only the labels.
Our experiments on different datasets demonstrate not only stable training without accuracy loss, but also a more than 100 times speedup.
- Score: 12.689643742151516
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The financial sector presents many opportunities to apply various
machine learning techniques, but centralized machine learning creates a
constraint that limits further applications in the finance sector. Data privacy
is a fundamental challenge for a variety of finance and insurance applications
that rely on learning a model across different sectors. In this paper, we
define a new practical scheme of collaborative machine learning in which one
party owns the data while the other party owns only the labels, and term this
\textbf{Asymmetrically Collaborative Machine Learning}. For this scheme, we
propose a novel privacy-preserving architecture where two parties can
collaboratively train a deep learning model efficiently while preserving the
privacy of each party's data. More specifically, we decompose the forward
propagation and backpropagation of the neural network into four different steps
and propose a novel protocol to handle information leakage in these steps. Our
extensive experiments on different datasets demonstrate not only stable
training without accuracy loss, but also a more than 100 times speedup compared
with the state-of-the-art system.
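The protocol above relies on an additively homomorphic cryptosystem: ciphertexts can be combined so that decryption yields the sum of the plaintexts, which is what lets one party operate on the other party's encrypted activations or gradients. The following is a minimal Paillier sketch illustrating only that additive property, not the authors' implementation; the primes are toy-sized and the helper names are our own.

```python
import random
from math import gcd


def paillier_keygen(p, q):
    # Public key (n, g) with g = n + 1; private key (lam, mu).
    n = p * q
    g = n + 1
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    # L(x) = (x - 1) // n; mu = L(g^lam mod n^2)^(-1) mod n.
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu)


def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    while True:  # random r coprime to n blinds the ciphertext
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2


def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n


# Additive homomorphism: multiplying ciphertexts adds plaintexts,
# and raising a ciphertext to k multiplies its plaintext by k.
pub, priv = paillier_keygen(2003, 2011)
n2 = pub[0] ** 2
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
total = decrypt(pub, priv, (c1 * c2) % n2)   # 42 + 58
scaled = decrypt(pub, priv, pow(c1, 3, n2))  # 3 * 42
```

In a scheme like the one above, the label-owning party can add encrypted quantities (for example, per-sample gradient contributions) without ever decrypting the data-owning party's individual values.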
Related papers
- Privacy-Preserving Graph Machine Learning from Data to Computation: A
Survey [67.7834898542701]
We focus on reviewing privacy-preserving techniques of graph machine learning.
We first review methods for generating privacy-preserving graph data.
Then we describe methods for transmitting privacy-preserved information.
arXiv Detail & Related papers (2023-07-10T04:30:23Z) - Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device)
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and back propagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
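The cut-layer exchange described for Split Learning can be sketched in a few lines: the client runs the network up to the cut and releases only the "smashed" activations, while the server runs the remaining layers and returns gradients at the cut. This is a generic illustration with made-up weights, not the cited paper's distillation method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Client-side layers up to the cut; server-side head after the cut.
# Weights are illustrative placeholders.
W_client = rng.normal(size=(4, 8))
W_server = rng.normal(size=(8, 1))


def client_forward(x):
    # "Smashed data": only these activations cross the network boundary.
    return np.maximum(x @ W_client, 0.0)


def server_forward(smashed):
    return smashed @ W_server


x = rng.normal(size=(2, 4))      # raw data never leaves the client
smashed = client_forward(x)      # sent to the server
y_hat = server_forward(smashed)  # server completes the forward pass

# Backward: the server returns d(loss)/d(smashed) at the cut layer;
# the client continues backpropagation locally (here with a dummy
# unit upstream gradient).
grad_smashed = np.ones_like(y_hat) @ W_server.T
```

The privacy concern the entry alludes to is that `smashed` and `grad_smashed` are exactly the quantities an honest-but-curious server could try to invert.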
arXiv Detail & Related papers (2022-11-20T10:49:22Z) - Privacy-Preserving Machine Learning for Collaborative Data Sharing via
Auto-encoder Latent Space Embeddings [57.45332961252628]
Privacy-preserving machine learning in data-sharing processes is an ever-critical task.
This paper presents an innovative framework that uses Representation Learning via autoencoders to generate privacy-preserving embedded data.
arXiv Detail & Related papers (2022-11-10T17:36:58Z) - Practical Vertical Federated Learning with Unsupervised Representation
Learning [47.77625754666018]
Federated learning enables multiple parties to collaboratively train a machine learning model without sharing their raw data.
We propose a novel communication-efficient vertical federated learning algorithm named FedOnce, which requires only one-shot communication among parties.
Our privacy-preserving technique significantly outperforms the state-of-the-art approaches under the same privacy budget.
arXiv Detail & Related papers (2022-08-13T08:41:32Z) - Privacy-Preserving Chaotic Extreme Learning Machine with Fully
Homomorphic Encryption [5.010425616264462]
We propose a Chaotic Extreme Learning Machine and its encrypted form using Fully Homomorphic Encryption.
Our proposed method has performed either better or similar to the Traditional Extreme Learning Machine on most of the datasets.
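For context, an Extreme Learning Machine trains only the output layer: the hidden weights are random and fixed, so fitting reduces to a single least-squares solve, which is what makes the model amenable to homomorphic evaluation. A plaintext sketch (our own minimal version, not the paper's chaotic or encrypted variant):

```python
import numpy as np


def elm_train(X, y, hidden=32, seed=0):
    # Random fixed hidden layer; output weights solved in closed form.
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                       # hidden features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # least squares
    return W, b, beta


def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because inference is one matrix product plus a fixed nonlinearity, encrypted variants replace these plaintext operations with their homomorphic counterparts.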
arXiv Detail & Related papers (2022-08-04T11:29:52Z) - Efficient Differentially Private Secure Aggregation for Federated
Learning via Hardness of Learning with Errors [1.4680035572775534]
Federated machine learning leverages edge computing to develop models from network user data.
Privacy in federated learning remains a major challenge.
Recent advances in secure aggregation using multiparty computation eliminate the need for a third party.
We present a new federated learning protocol that leverages a novel differentially private, malicious secure aggregation protocol.
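The differential-privacy side of such a protocol can be sketched independently of the cryptography: clip each client update to bound its sensitivity, sum, and add calibrated Gaussian noise. This shows only the noise mechanism, not the paper's LWE-based secure aggregation; the parameter names are our own.

```python
import numpy as np


def dp_aggregate(updates, clip_norm=1.0, noise_mult=1.1, seed=0):
    # Gaussian-mechanism sketch: clip each client update so its L2 norm
    # is at most clip_norm, sum, add noise scaled to that bound, average.
    rng = np.random.default_rng(seed)
    clipped = []
    for u in updates:
        norm = np.linalg.norm(u)
        clipped.append(u * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(scale=noise_mult * clip_norm, size=total.shape)
    return (total + noise) / len(updates)
```

In the secure-aggregation setting, the server would only ever see the already-noised sum, never any individual `u`.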
arXiv Detail & Related papers (2021-12-13T18:31:08Z) - Non-IID data and Continual Learning processes in Federated Learning: A
long road ahead [58.720142291102135]
Federated Learning is a novel framework that allows multiple devices or institutions to train a machine learning model collaboratively while preserving their data private.
In this work, we formally classify data statistical heterogeneity and review the most remarkable learning strategies that are able to face it.
At the same time, we introduce approaches from other machine learning frameworks, such as Continual Learning, that also deal with data heterogeneity and could be easily adapted to the Federated Learning settings.
arXiv Detail & Related papers (2021-11-26T09:57:11Z) - MORSE-STF: A Privacy Preserving Computation System [12.875477499515158]
We present Secure-TF, a privacy-preserving machine learning framework based on MPC.
Our framework is able to support widely-used machine learning models such as logistic regression, fully-connected neural network, and convolutional neural network.
arXiv Detail & Related papers (2021-09-24T03:42:46Z) - Quasi-Global Momentum: Accelerating Decentralized Deep Learning on
Heterogeneous Data [77.88594632644347]
Decentralized training of deep learning models is a key element for enabling data privacy and on-device learning over networks.
In realistic learning scenarios, the presence of heterogeneity across different clients' local datasets poses an optimization challenge.
We propose a novel momentum-based method to mitigate this decentralized training difficulty.
arXiv Detail & Related papers (2021-02-09T11:27:14Z) - Concentrated Differentially Private and Utility Preserving Federated
Learning [24.239992194656164]
Federated learning is a machine learning setting where a set of edge devices collaboratively train a model under the orchestration of a central server.
In this paper, we develop a federated learning approach that addresses the privacy challenge without much degradation on model utility.
We provide a tight end-to-end privacy guarantee of our approach and analyze its theoretical convergence rates.
arXiv Detail & Related papers (2020-03-30T19:20:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.