Private and Robust States for Distributed Quantum Sensing
- URL: http://arxiv.org/abs/2407.21701v1
- Date: Wed, 31 Jul 2024 15:46:50 GMT
- Title: Private and Robust States for Distributed Quantum Sensing
- Authors: Luís Bugalho, Majid Hassani, Yasser Omar, Damian Markham
- Abstract summary: Distributed quantum sensing enables the estimation of multiple parameters encoded in spatially separated probes.
In such settings, it is natural not to give away more information than is necessary.
We use the concept of privacy with respect to a function, ensuring that only information about the target function is available to all the parties.
- Score: 1.2499537119440245
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Distributed quantum sensing enables the estimation of multiple parameters encoded in spatially separated probes. While traditional quantum sensing is often focused on estimating a single parameter with maximum precision, distributed quantum sensing seeks to estimate some function of multiple parameters that are only locally accessible to each party involved. In such settings, it is natural not to give away more information than is necessary. To address this, we use the concept of privacy with respect to a function, ensuring that only information about the target function is available to all the parties, and no other information. We define a measure of privacy (essentially how close we are to this condition being satisfied), and show it satisfies a set of naturally desirable properties of such a measure. Using this privacy measure, we identify and construct entangled resource states that ensure privacy for a given function under different resource distributions and encoding dynamics, characterized by Hamiltonian evolution. For separable and parallel Hamiltonians, we prove that the GHZ state is the only private state for certain linear functions, with the minimum amount of required resources, up to SLOCC. Recognizing the vulnerability of this state to particle loss, we construct families of private states that, by incorporating additional resources, remain robust even against the loss of qubits. We then extend our findings to different resource distribution scenarios and Hamiltonians, resulting in a comprehensive set of private and robust states for distributed quantum estimation. These results advance the understanding of privacy and robustness in multi-parameter quantum sensing.
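As a quick illustration of the linear-function case (a standard textbook example; the specific encoding below is an assumption, not notation fixed by the paper), separable local phase evolutions imprint only the sum of the parameters on a GHZ state:

```latex
% N-qubit GHZ state under separable local encodings e^{-i\theta_k Z_k/2}:
\[
  \frac{|0\rangle^{\otimes N} + |1\rangle^{\otimes N}}{\sqrt{2}}
  \;\longmapsto\;
  \frac{|0\rangle^{\otimes N} + e^{\,i\sum_k \theta_k}\,|1\rangle^{\otimes N}}{\sqrt{2}}
  \quad\text{(up to a global phase).}
\]
% The state depends on the \theta_k only through f(\theta) = \sum_k \theta_k,
% so the parties can estimate the target linear function but learn nothing
% about the individual parameters -- the intuition behind privacy w.r.t. f.
```

This also makes the fragility concrete: losing any one qubit erases the coherence carrying the encoded phase, which is what motivates the robust families constructed in the paper.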
Related papers
- Privacy in networks of quantum sensors [1.2499537119440245]
We develop an analysis of privacy in terms of a manipulation of the quantum Fisher information matrix (sketched below).
We find the optimal state achieving maximum privacy in the estimation of a linear combination of the unknown parameters in a network of quantum sensors.
arXiv Detail & Related papers (2024-08-03T08:39:44Z)
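A rough sketch of the Fisher-information viewpoint (generic multiparameter-estimation notation; the paper's exact formulation may differ):

```latex
% Quantum Cramér-Rao bound for any unbiased estimator over \nu rounds:
\[
  \mathrm{Cov}(\hat{\theta}) \;\succeq\; \frac{1}{\nu}\,F(\theta)^{-1},
\]
% with F(\theta) the quantum Fisher information matrix. For a target
% f(\theta) = \alpha^{\top}\theta, a maximally private probe has (roughly)
% F(\theta) \propto \alpha\,\alpha^{\top}: the information is confined to
% the direction \alpha, so only f, and no other combination, is estimable.
```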
- Unified Mechanism-Specific Amplification by Subsampling and Group Privacy Amplification [54.1447806347273]
Amplification by subsampling is one of the main primitives in machine learning with differential privacy (the classical guarantee is recalled below).
We propose the first general framework for deriving mechanism-specific guarantees.
We analyze how subsampling affects the privacy of groups of multiple users.
arXiv Detail & Related papers (2024-03-07T19:36:05Z)
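For orientation, the classical amplification result that this line of work refines: running an ε-DP mechanism on a Poisson subsample with inclusion probability q yields ln(1 + q(e^ε − 1))-DP. A minimal sketch (function name is illustrative):

```python
import math

def amplified_epsilon(eps: float, q: float) -> float:
    """Privacy parameter after Poisson subsampling with rate q:
    eps' = ln(1 + q * (exp(eps) - 1))."""
    return math.log(1.0 + q * (math.exp(eps) - 1.0))

# Subsampling 1% of the data turns eps = 2 into roughly 0.062.
print(amplified_epsilon(eps=2.0, q=0.01))
```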
- A unifying framework for differentially private quantum algorithms [0.0]
We propose a novel and general definition of neighbouring quantum states.
We demonstrate that this definition captures the underlying structure of quantum encodings.
We also investigate an alternative setting where we are provided with multiple copies of the input state.
arXiv Detail & Related papers (2023-07-10T17:44:03Z)
- Quantum Pufferfish Privacy: A Flexible Privacy Framework for Quantum Systems [19.332726520752846]
We propose a versatile privacy framework for quantum systems, termed quantum pufferfish privacy (QPP).
Inspired by classical pufferfish privacy, our formulation generalizes and addresses limitations of quantum differential privacy.
We show that QPP can be equivalently formulated in terms of the Datta-Leditzky information spectrum divergence.
arXiv Detail & Related papers (2023-06-22T17:21:17Z)
- Breaking the Communication-Privacy-Accuracy Tradeoff with $f$-Differential Privacy [51.11280118806893]
We consider a federated data analytics problem in which a server coordinates the collaborative data analysis of multiple users with privacy concerns and limited communication capability.
We study the local differential privacy guarantees of discrete-valued mechanisms with finite output space through the lens of $f$-differential privacy (DP), recalled below.
More specifically, we advance the existing literature by deriving tight $f$-DP guarantees for a variety of discrete-valued mechanisms.
arXiv Detail & Related papers (2023-02-19T16:58:53Z)
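A brief reminder of the f-DP formalism (Dong, Roth, and Su) used in that paper:

```latex
% Trade-off function between output distributions P and Q, over all tests
% \phi with type-I error \alpha_\phi and type-II error \beta_\phi:
\[
  T(P,Q)(\alpha) \;=\; \inf\,\{\,\beta_\phi \;:\; \alpha_\phi \le \alpha\,\}.
\]
% A mechanism M is f-DP if no test distinguishes neighbouring datasets
% S \sim S' better than the function f allows:
\[
  T\big(M(S),\,M(S')\big) \;\ge\; f.
\]
```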
- Algorithms with More Granular Differential Privacy Guarantees [65.3684804101664]
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person.
arXiv Detail & Related papers (2022-09-08T22:43:50Z)
- Private measures, random walks, and synthetic data [7.5764890276775665]
Differential privacy is a mathematical concept that provides an information-theoretic security guarantee.
We develop a private measure from a data set that allows us to efficiently construct private synthetic data.
A key ingredient in our construction is a new superregular random walk, whose joint distribution of steps is as regular as that of independent random variables.
arXiv Detail & Related papers (2022-04-20T00:06:52Z)
- Quantum Differential Privacy: An Information Theory Perspective [2.9005223064604073]
We discuss differential privacy in an information-theoretic framework by casting it as a quantum divergence (recalled below).
A main advantage of this approach is that differential privacy becomes a property solely based on the output states of the computation.
arXiv Detail & Related papers (2022-02-22T08:12:50Z)
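The divergence in question is the hockey-stick divergence; classically, (ε, δ)-DP is exactly a bound on it, and the quantum version replaces output distributions with output states. A sketch in standard notation:

```latex
% Quantum hockey-stick divergence with \gamma = e^{\varepsilon}
% ((X)_+ denotes the positive part of the operator X):
\[
  E_\gamma(\rho\,\|\,\sigma) \;=\; \mathrm{Tr}\big[(\rho - \gamma\sigma)_+\big].
\]
% A quantum algorithm \mathcal{A} is (\varepsilon,\delta)-DP when, for all
% neighbouring input states \rho \sim \sigma,
\[
  E_{e^{\varepsilon}}\big(\mathcal{A}(\rho)\,\|\,\mathcal{A}(\sigma)\big) \;\le\; \delta.
\]
```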
- Robustness Threats of Differential Privacy [70.818129585404]
We experimentally demonstrate that networks trained with differential privacy can, in some settings, be even more vulnerable than their non-private counterparts.
We study how the main ingredients of differentially private neural network training, such as gradient clipping and noise addition (sketched below), affect the robustness of the model.
arXiv Detail & Related papers (2020-12-14T18:59:24Z)
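Gradient clipping and noise addition, as studied there, are the two core ingredients of DP-SGD; a minimal numpy sketch (illustrative, not the paper's experimental setup):

```python
import numpy as np

def private_gradient(per_example_grads: np.ndarray, clip_norm: float,
                     noise_multiplier: float,
                     rng: np.random.Generator) -> np.ndarray:
    """DP-SGD step ingredients: clip each per-example gradient to
    L2 norm clip_norm, average, then add calibrated Gaussian noise."""
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    # Scale each gradient down so its norm is at most clip_norm.
    clipped = per_example_grads * np.minimum(1.0, clip_norm / (norms + 1e-12))
    batch_size, dim = per_example_grads.shape
    noise = rng.normal(0.0, noise_multiplier * clip_norm / batch_size, size=dim)
    return clipped.mean(axis=0) + noise

rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 10))  # stand-in batch of per-example gradients
print(private_gradient(grads, clip_norm=1.0, noise_multiplier=1.1, rng=rng))
```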
- Graph-Homomorphic Perturbations for Private Decentralized Learning [64.26238893241322]
The local exchange of estimates allows inference about the agents' private data.
Perturbations chosen independently at every agent protect against this, but result in a significant performance loss.
We propose an alternative scheme that constructs perturbations according to a particular nullspace condition, allowing them to remain invisible in the aggregate (see the sketch below).
arXiv Detail & Related papers (2020-10-23T10:35:35Z)
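One way to realize such a nullspace condition (a hedged illustration, not necessarily the paper's exact construction): draw correlated noise across agents that sums to zero, so it cancels from the network average while masking each individual exchange.

```python
import numpy as np

def nullspace_perturbations(num_agents: int, dim: int, scale: float,
                            rng: np.random.Generator) -> np.ndarray:
    """One noise vector per agent, constrained to the nullspace of the
    averaging map: the vectors cancel exactly in the network average."""
    noise = rng.normal(0.0, scale, size=(num_agents, dim))
    # Remove the common component so the columns sum to zero.
    return noise - noise.mean(axis=0, keepdims=True)

rng = np.random.default_rng(1)
p = nullspace_perturbations(num_agents=5, dim=3, scale=2.0, rng=rng)
print(p.sum(axis=0))  # ~[0. 0. 0.]: invisible once estimates are averaged
```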
- Differential Privacy of Hierarchical Census Data: An Optimization Approach [53.29035917495491]
Census Bureaus are interested in releasing aggregate socio-economic data about a large population without revealing sensitive information about any individual.
Recent events have identified some of the privacy challenges faced by these organizations.
This paper presents a novel differential-privacy mechanism for releasing hierarchical counts of individuals.
arXiv Detail & Related papers (2020-06-28T18:19:55Z)
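As a baseline for contrast (the paper's mechanism is optimization-based and more refined), hierarchical counts can be released by adding Laplace noise at each level and post-processing for parent/children consistency; a minimal sketch with an evenly split budget:

```python
import numpy as np

def noisy_hierarchy(child_counts: np.ndarray, epsilon: float,
                    rng: np.random.Generator):
    """Laplace mechanism on a two-level count hierarchy. Each level gets
    half the budget (scale 2/epsilon); the consistency step afterwards
    is pure post-processing and costs no extra privacy."""
    scale = 2.0 / epsilon
    noisy_children = child_counts + rng.laplace(0.0, scale, child_counts.shape)
    noisy_parent = child_counts.sum() + rng.laplace(0.0, scale)
    # Spread the mismatch evenly so the parent equals the children's sum.
    gap = noisy_parent - noisy_children.sum()
    children = noisy_children + gap / len(noisy_children)
    return noisy_parent, children

rng = np.random.default_rng(2)
parent, children = noisy_hierarchy(np.array([120.0, 80.0, 50.0]), 0.5, rng)
print(parent, children.sum())  # equal after the consistency step
```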