High Performance Logistic Regression for Privacy-Preserving Genome
Analysis
- URL: http://arxiv.org/abs/2002.05377v2
- Date: Tue, 3 Mar 2020 11:00:01 GMT
- Title: High Performance Logistic Regression for Privacy-Preserving Genome
Analysis
- Authors: Martine De Cock and Rafael Dowsley and Anderson C. A. Nascimento and
Davis Railsback and Jianwei Shen and Ariel Todoki
- Abstract summary: We present a secure logistic regression training protocol and its implementation, with a new subprotocol to securely compute the activation function.
We present the fastest existing secure Multi-Party Computation implementation for training logistic regression models on high dimensional genome data distributed across a local area network.
- Score: 15.078027648304117
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present a secure logistic regression training protocol and
its implementation, with a new subprotocol to securely compute the activation
function. To the best of our knowledge, we present the fastest existing secure
Multi-Party Computation implementation for training logistic regression models
on high dimensional genome data distributed across a local area network.
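The secure Multi-Party Computation setting above typically builds on additive secret sharing: each party holds a random-looking share of every value, and the linear steps of training can be computed on shares locally. A minimal sketch of that building block, with an illustrative field size and party count that are not the paper's actual parameters:

```python
import random

# Illustrative choices, not the paper's: a Mersenne-prime field and 3 parties.
P = 2**61 - 1

def share(value, n_parties=3):
    """Split `value` into additive shares modulo P; any n-1 shares look random."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares by summing modulo P."""
    return sum(shares) % P

# Linear steps (sums, scalings by public constants) can be performed
# locally on shares, with no communication between parties.
x_shares = share(42)
y_shares = share(100)
z_shares = [(a + b) % P for a, b in zip(x_shares, y_shares)]
```

Multiplications, comparisons, and nonlinear steps like the sigmoid require interactive subprotocols; securely evaluating the activation function is the part for which the paper contributes a new subprotocol.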
Related papers
- LFFR: Logistic Function For (multi-output) Regression [0.0]
We build upon previous work on privacy-preserving regression to address multi-output regression problems.
We adapt our novel LFFR algorithm, initially designed for single-output logistic regression, to handle multiple outputs.
Evaluations on multiple real-world datasets demonstrate the effectiveness of our multi-output LFFR algorithm.
arXiv Detail & Related papers (2024-07-30T20:52:38Z)
- Q-value Regularized Transformer for Offline Reinforcement Learning [70.13643741130899]
We propose a Q-value regularized Transformer (QT) to enhance the state-of-the-art in offline reinforcement learning (RL).
QT learns an action-value function and integrates a term maximizing action-values into the training loss of Conditional Sequence Modeling (CSM).
Empirical evaluations on D4RL benchmark datasets demonstrate the superiority of QT over traditional DP and CSM methods.
arXiv Detail & Related papers (2024-05-27T12:12:39Z)
- Prevalidated ridge regression is a highly-efficient drop-in replacement for logistic regression for high-dimensional data [7.532661545437305]
We present a prevalidated ridge regression model that matches logistic regression in terms of classification error and log-loss.
We scale the coefficients of the model so as to minimise log-loss for a set of prevalidated predictions.
This exploits quantities already computed in the course of fitting the ridge regression model in order to find the scaling parameter with nominal additional computational expense.
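The scaling step can be pictured as a one-dimensional optimisation: given prevalidated ridge scores and binary labels, find the scalar that minimises the log-loss of the sigmoid-mapped scores. A toy sketch under that reading; the golden-section search and the dataset are illustrative, since the paper reuses quantities already computed during the ridge fit rather than searching:

```python
import math

def log_loss(c, scores, labels):
    """Binary cross-entropy of sigmoid(c * score) against 0/1 labels."""
    eps = 1e-12
    total = 0.0
    for s, y in zip(scores, labels):
        p = 1.0 / (1.0 + math.exp(-c * s))
        p = min(max(p, eps), 1.0 - eps)
        total -= y * math.log(p) + (1 - y) * math.log(1 - p)
    return total / len(scores)

def fit_scale(scores, labels, lo=0.01, hi=100.0, iters=80):
    """Golden-section search for the scaling parameter; log-loss is convex in c."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    for _ in range(iters):
        c1, c2 = b - phi * (b - a), a + phi * (b - a)
        if log_loss(c1, scores, labels) < log_loss(c2, scores, labels):
            b = c2
        else:
            a = c1
    return (a + b) / 2.0

# Toy "prevalidated" scores and labels, made up for illustration.
scores = [0.9, 0.4, -0.5, -1.2, 0.3, -0.2]
labels = [1, 1, 0, 0, 0, 1]
c = fit_scale(scores, labels)
```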
arXiv Detail & Related papers (2024-01-28T09:38:14Z)
- TRIAGE: Characterizing and auditing training data for improved regression [80.11415390605215]
We introduce TRIAGE, a novel data characterization framework tailored to regression tasks and compatible with a broad class of regressors.
TRIAGE utilizes conformal predictive distributions to provide a model-agnostic scoring method, the TRIAGE score.
We show that TRIAGE's characterization is consistent and highlight its utility to improve performance via data sculpting/filtering, in multiple regression settings.
arXiv Detail & Related papers (2023-10-29T10:31:59Z)
- Online Efficient Secure Logistic Regression based on Function Secret Sharing [15.764294489590041]
We propose an online efficient protocol for privacy-preserving logistic regression based on Function Secret Sharing (FSS).
Our protocols are designed in the two non-colluding servers setting and assume the existence of a third-party dealer.
We propose accurate and MPC-friendly alternatives to the sigmoid function and encapsulate the logistic regression training process into a function secret sharing gate.
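As an example of what "MPC-friendly" means here, a common stand-in for the sigmoid is a piecewise-linear function that needs only comparisons and additions (this is the well-known three-piece approximation from the SecureML line of work, not necessarily the alternative this paper proposes):

```python
import math

def sigmoid(x):
    """Reference sigmoid; exponentiation is expensive to evaluate on shares."""
    return 1.0 / (1.0 + math.exp(-x))

def mpc_friendly_sigmoid(x):
    """Three-piece linear approximation: only comparisons and additions,
    both of which have cheap secure subprotocols."""
    if x < -0.5:
        return 0.0
    if x > 0.5:
        return 1.0
    return x + 0.5
```

This crude replacement illustrates why the plain sigmoid is avoided; the paper's contribution is alternatives that remain accurate while staying MPC-friendly, which is the harder part.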
arXiv Detail & Related papers (2023-09-18T04:50:54Z)
- Tackling Computational Heterogeneity in FL: A Few Theoretical Insights [68.8204255655161]
We introduce and analyse a novel aggregation framework that allows for formalizing and tackling computationally heterogeneous data.
The proposed aggregation algorithms are extensively analyzed from both a theoretical and an experimental perspective.
arXiv Detail & Related papers (2023-07-12T16:28:21Z)
- Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z)
- Efficient Bayesian Updates for Deep Learning via Laplace Approximations [1.5996841879821277]
We propose a novel Bayesian update method for deep neural networks.
We leverage second-order optimization techniques on the Gaussian posterior distribution of a Laplace approximation.
A large-scale evaluation study confirms that our updates are a fast and competitive alternative to costly retraining.
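The idea can be sketched for a one-parameter logistic model: find the MAP estimate, take the curvature at the mode as a Gaussian (Laplace) posterior, and use that posterior as the prior when a new batch arrives, so the old data never needs to be revisited. Gradient descent below is a stand-in for the second-order optimization the paper uses, and all data is illustrative:

```python
import math

def laplace_fit(data, prior_mu=0.0, prior_prec=1.0, lr=0.1, steps=2000):
    """MAP estimate of a 1-parameter logistic model by gradient descent;
    the curvature at the mode then gives a Gaussian (Laplace) posterior.
    `data` is a list of (x, y) pairs with y in {-1, +1}."""
    w = prior_mu
    for _ in range(steps):
        grad = prior_prec * (w - prior_mu)
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-w * x))   # P(y = +1 | x)
            grad += (p - (1.0 if y == 1 else 0.0)) * x
        w -= lr * grad
    prec = prior_prec                             # Hessian of -log posterior
    for x, _ in data:
        p = 1.0 / (1.0 + math.exp(-w * x))
        prec += p * (1.0 - p) * x * x
    return w, prec

batch1 = [(1.0, 1), (2.0, 1), (-1.0, -1)]
batch2 = [(1.5, 1), (-2.0, -1)]
mu1, prec1 = laplace_fit(batch1)
# Update: yesterday's posterior becomes today's prior -- no retraining on batch1.
mu2, prec2 = laplace_fit(batch2, prior_mu=mu1, prior_prec=prec1)
```

The second fit touches only the new batch, which is what makes the update cheap relative to full retraining.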
arXiv Detail & Related papers (2022-10-12T12:16:46Z)
- APS: Active Pretraining with Successor Features [96.24533716878055]
We show that by reinterpreting and combining successor features with nonparametric entropy maximization, the intractable mutual information can be efficiently optimized.
The proposed method, Active Pretraining with Successor Features (APS), explores the environment via nonparametric entropy maximization, and the explored data can be efficiently leveraged to learn behavior.
arXiv Detail & Related papers (2021-08-31T16:30:35Z)
- Privacy-preserving Logistic Regression with Secret Sharing [0.0]
We propose secret sharing-based privacy-preserving logistic regression protocols using the Newton-Raphson method.
Our implementation results show that our improved method can handle large datasets used in securely training a logistic regression from multiple sources.
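For reference, the Newton-Raphson iteration that such protocols evaluate on secret shares looks as follows in the clear, for a single feature plus intercept. This is illustrative plaintext code, not the secure protocol itself:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def newton_logreg(xs, ys, iters=10):
    """Newton-Raphson for logistic regression with one feature plus intercept;
    the 2x2 Hessian is inverted in closed form."""
    w0, w1 = 0.0, 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(w0 + w1 * x)
            g0 += p - y                 # gradient w.r.t. intercept
            g1 += (p - y) * x           # gradient w.r.t. weight
            s = p * (1.0 - p)           # Hessian weight
            h00 += s
            h01 += s * x
            h11 += s * x * x
        det = h00 * h11 - h01 * h01
        if abs(det) < 1e-12:
            break
        w0 -= (h11 * g0 - h01 * g1) / det   # w <- w - H^{-1} g
        w1 -= (h00 * g1 - h01 * g0) / det
    return w0, w1

xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]   # toy, non-separable data
ys = [0, 0, 1, 0, 1, 1]
w0, w1 = newton_logreg(xs, ys)
```

The divisions and sigmoid evaluations in this loop are exactly the operations that are expensive under secret sharing, which is why protocol design focuses on them.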
arXiv Detail & Related papers (2021-05-14T14:53:50Z)
- Real-Time Regression with Dividing Local Gaussian Processes [62.01822866877782]
Local Gaussian processes are a novel, computationally efficient modeling approach based on Gaussian process regression.
Due to an iterative, data-driven division of the input space, they achieve a sublinear computational complexity in the total number of training points in practice.
A numerical evaluation on real-world data sets shows their advantages over other state-of-the-art methods in terms of accuracy as well as prediction and update speed.
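A toy illustration of the "local" idea: restrict each prediction to a small neighbourhood so the kernel system stays tiny. Here, k-nearest neighbours stands in for the paper's iterative, data-driven division of the input space, and all names and parameters are illustrative:

```python
import math

def rbf(a, b, length_scale=1.0):
    """Squared-exponential kernel on scalars."""
    return math.exp(-0.5 * ((a - b) / length_scale) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting; fine for tiny systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_mean(train_x, train_y, x_star, noise=1e-6):
    """Posterior mean of GP regression at a single test point."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(train_x)] for i, a in enumerate(train_x)]
    alpha = solve(K, list(train_y))
    return sum(rbf(x_star, a) * w for a, w in zip(train_x, alpha))

def local_gp_predict(points, x_star, k=3):
    """Use only the k nearest points, so each prediction solves a
    k-by-k linear system instead of an n-by-n one."""
    nearest = sorted(points, key=lambda p: abs(p[0] - x_star))[:k]
    xs, ys = zip(*nearest)
    return gp_mean(xs, ys, x_star)

data = [(x / 2.0, math.sin(x / 2.0)) for x in range(-8, 9)]
pred = local_gp_predict(data, 0.3)
```

Keeping each linear system small is what buys the sublinear scaling in practice, at the cost of managing how the input space is divided.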
arXiv Detail & Related papers (2020-06-16T18:43:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.