Single-Server Private Linear Transformation: The Joint Privacy Case
- URL: http://arxiv.org/abs/2106.05220v2
- Date: Thu, 10 Jun 2021 01:04:07 GMT
- Title: Single-Server Private Linear Transformation: The Joint Privacy Case
- Authors: Anoosheh Heidarzadeh, Nahid Esmati, and Alex Sprintson
- Abstract summary: This paper introduces the problem of Private Linear Transformation (PLT) which generalizes the problems of private information retrieval and private linear computation.
The problem includes one or more remote server(s) storing (identical copies of) $K$ messages and a user who wants to compute $L$ independent linear combinations of a $D$-subset of messages.
- Score: 10.072633952908456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces the problem of Private Linear Transformation (PLT)
which generalizes the problems of private information retrieval and private
linear computation. The PLT problem includes one or more remote server(s)
storing (identical copies of) $K$ messages and a user who wants to compute $L$
independent linear combinations of a $D$-subset of messages. The objective of
the user is to perform the computation by downloading the minimum possible amount
of information from the server(s), while protecting the identities of the $D$
messages required for the computation. In this work, we focus on the
single-server setting of the PLT problem when the identities of the $D$
messages required for the computation must be protected jointly. We consider
two different models, depending on whether the coefficient matrix of the
required $L$ linear combinations generates a Maximum Distance Separable (MDS)
code. We prove that the capacity for both models is given by $L/(K-D+L)$, where
the capacity is defined as the supremum of all achievable download rates. Our
converse proofs are based on linear-algebraic and information-theoretic
arguments that establish connections between PLT schemes and linear codes. We
also present an achievability scheme for each of the models being considered.
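
As a quick illustration of the capacity expression (not part of the paper's text): since the rate is the ratio of desired information ($L$ message sizes) to downloaded information, a capacity-achieving scheme downloads roughly $K-D+L$ message sizes, and the special cases $L=D=1$ and $L=1$ reduce to the single-server PIR rate $1/K$ and the private linear computation rate $1/(K-D+1)$, respectively. The Python sketch below only evaluates these quantities; the function names are illustrative, not from the paper.

```python
from fractions import Fraction

def plt_capacity(K: int, D: int, L: int) -> Fraction:
    """Capacity of single-server PLT with joint privacy, per the abstract: L / (K - D + L)."""
    assert 1 <= L <= D <= K
    return Fraction(L, K - D + L)

def plt_download_cost(K: int, D: int, L: int) -> Fraction:
    """Minimum download in units of one message size: L desired combinations / capacity = K - D + L."""
    return L / plt_capacity(K, D, L)

# Example: K = 10 stored messages, the user wants L = 2 independent linear
# combinations of some D = 4 subset, with the subset's identity hidden jointly.
print(plt_capacity(10, 4, 2))       # 1/4
print(plt_download_cost(10, 4, 2))  # 8 message sizes

# Consistency checks against the problems PLT generalizes:
print(plt_capacity(10, 1, 1))       # 1/10 -> single-server PIR rate 1/K
print(plt_capacity(10, 4, 1))       # 1/7  -> private linear computation rate 1/(K-D+1)
```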
Related papers
- Fast and scalable Wasserstein-1 neural optimal transport solver for single-cell perturbation prediction [55.89763969583124]
Optimal transport theory provides a principled framework for constructing such mappings.
We propose a novel optimal transport solver based on Wasserstein-1.
Our experiments demonstrate that the proposed solver can mimic the $W_2$ OT solvers in finding a unique and "monotonic" map on 2D datasets.
arXiv Detail & Related papers (2024-11-01T14:23:19Z) - Differential Privacy of Cross-Attention with Provable Guarantee [18.331374727331077]
We design a novel differential privacy (DP) data structure to address the privacy security of cross-attention with a theoretical guarantee.
Our result is robust to adaptive queries in which users can intentionally attack the cross-attention system.
arXiv Detail & Related papers (2024-07-20T01:02:27Z) - On Computing Pairwise Statistics with Local Differential Privacy [55.81991984375959]
We study the problem of computing pairwise statistics, i.e., ones of the form $\binom{n}{2}^{-1} \sum_{i \ne j} f(x_i, x_j)$, where $x_i$ denotes the input to the $i$th user, with differential privacy (DP) in the local model.
This formulation captures important metrics such as Kendall's $\tau$ coefficient, Area Under Curve, Gini's mean difference, Gini's entropy, etc.; a non-private sketch of this pairwise form appears after this list.
arXiv Detail & Related papers (2024-06-24T04:06:09Z) - Federated Combinatorial Multi-Agent Multi-Armed Bandits [79.1700188160944]
This paper introduces a federated learning framework tailored for online optimization with bandit feedback.
In this setting, agents select subsets of arms, observe noisy rewards for these subsets without accessing individual arm information, and can cooperate and share information at specific intervals.
arXiv Detail & Related papers (2024-05-09T17:40:09Z) - Private Vector Mean Estimation in the Shuffle Model: Optimal Rates Require Many Messages [63.366380571397]
We study the problem of private vector mean estimation in the shuffle model of privacy where $n$ users each have a unit vector $v^{(i)} \in \mathbb{R}^d$.
We propose a new multi-message protocol that achieves the optimal error using $\tilde{\mathcal{O}}\left(\min(n\varepsilon^2, d)\right)$ messages per user.
arXiv Detail & Related papers (2024-04-16T00:56:36Z) - Private Multiple Linear Computation: A Flexible Communication-Computation Tradeoff [27.511481199887978]
We consider the problem of private multiple linear computation (PMLC) over a replicated storage system with colluding and unresponsive constraints.
We propose a novel PMLC scheme to establish a flexible tradeoff between communication costs and computational complexities.
arXiv Detail & Related papers (2024-04-14T07:07:08Z) - Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) frameworks, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Distributed Optimization with Feasible Set Privacy [35.16231062731263]
Two agents learn the optimal solution set while keeping their feasible sets $\mathcal{P}_1$ and $\mathcal{P}_2$ private from each other.
We adopt a sequential private information retrieval (SPIR) framework where one of the agents privately checks candidate solutions for membership in the other agent's feasible set.
We show that, compared to privately acquiring the feasible set $\mathcal{P}_1$ using an SPIR-based private set intersection (PSI) protocol and then finding the optimum, our scheme is better as it incurs less information leakage and a lower download cost.
arXiv Detail & Related papers (2023-12-04T18:45:04Z) - A Characterization of Optimal-Rate Linear Homomorphic Secret Sharing Schemes, and Applications [13.566464121222225]
A Homomorphic Secret Sharing (HSS) scheme is a secret-sharing scheme that shares a secret $x$ among $s$ servers.
A key parameter in HSS schemes is the download rate, which quantifies how much information the output client needs to download from each server.
We show that -- perhaps surprisingly -- the set of $\ell$'s for which their construction works is in fact nearly optimal.
arXiv Detail & Related papers (2023-11-24T20:40:50Z) - Locally Differentially Private Reinforcement Learning for Linear Mixture
Markov Decision Processes [78.27542864367821]
Reinforcement learning (RL) algorithms can be used to provide personalized services, which rely on users' private and sensitive data.
To protect the users' privacy, privacy-preserving RL algorithms are in demand.
We propose a novel $(\varepsilon, \delta)$-LDP algorithm for learning a class of Markov decision processes (MDPs) dubbed linear mixture MDPs.
arXiv Detail & Related papers (2021-10-19T17:44:09Z) - Single-Server Private Linear Transformation: The Individual Privacy Case [10.072633952908456]
This paper considers the single-server Private Linear Transformation (PLT) problem with individual privacy guarantees.
The goal is to minimize the download cost while keeping the identity of each message required for the computation individually private.
arXiv Detail & Related papers (2021-06-09T17:12:04Z)
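
Referenced from the pairwise-statistics entry above: an illustrative, non-private sketch (not from that paper) that simply evaluates the pairwise form $\binom{n}{2}^{-1} \sum_{i \ne j} f(x_i, x_j)$; the paper's local-DP protocols are not reproduced here.

```python
import math

def pairwise_statistic(xs, f):
    """binom(n, 2)^{-1} * sum over ordered pairs i != j of f(x_i, x_j),
    i.e., the quantity quoted in the pairwise-statistics entry (no privacy here)."""
    n = len(xs)
    total = sum(f(xs[i], xs[j]) for i in range(n) for j in range(n) if i != j)
    return total / math.comb(n, 2)

# With f(a, b) = |a - b| this gives (up to the ordered/unordered-pair convention)
# Gini's mean difference of the sample.
data = [1.0, 3.0, 4.0, 8.0]
print(pairwise_statistic(data, lambda a, b: abs(a - b)))
```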
This list is automatically generated from the titles and abstracts of the papers on this site.