Privacy in networks of quantum sensors
- URL: http://arxiv.org/abs/2408.01711v1
- Date: Sat, 3 Aug 2024 08:39:44 GMT
- Title: Privacy in networks of quantum sensors
- Authors: Majid Hassani, Santiago Scheiner, Matteo G. A. Paris, Damian Markham
- Abstract summary: We develop an analysis of privacy in terms of a manipulation of the quantum Fisher information matrix.
We find the optimal state achieving maximum privacy in the estimation of a linear combination of the unknown parameters in a network of quantum sensors.
- Score: 1.2499537119440245
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: We treat privacy in a network of quantum sensors where accessible information is limited to specific functions of the network parameters, and all other information remains private. We develop an analysis of privacy in terms of a manipulation of the quantum Fisher information matrix, and find the optimal state achieving maximum privacy in the estimation of a linear combination of the unknown parameters in a network of quantum sensors. We also discuss the effect of uncorrelated noise on the privacy of the network. Moreover, we illustrate our results with an example where the goal is to estimate the average value of the unknown parameters in the network. In this example, we also introduce the notion of quasi-privacy ($\epsilon$-privacy), quantifying how close the state is to being private.
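As a minimal numerical illustration of the QFI-matrix viewpoint (a sketch, not the paper's general construction), the snippet below computes the quantum Fisher information matrix $F_{ij} = 4\,\mathrm{Cov}(H_i, H_j)$ for a GHZ probe with commuting local phase generators $H_j = |1\rangle\langle 1|_j$; the choice of probe and generators is an assumption made for illustration. The matrix comes out rank one, supported on the direction $(1,\dots,1)$, so the state carries information only about the average of the parameters, which is the kind of private state the abstract's example discusses.

```python
import numpy as np

n = 3                      # number of sensors/qubits (toy size)
dim = 2 ** n

# GHZ probe state (|00...0> + |11...1>)/sqrt(2)
psi = np.zeros(dim)
psi[0] = psi[-1] = 1 / np.sqrt(2)

def generator(j):
    """Local phase generator H_j = |1><1| acting on qubit j (diagonal)."""
    H = np.zeros((dim, dim))
    for b in range(dim):
        if (b >> (n - 1 - j)) & 1:
            H[b, b] = 1.0
    return H

Hs = [generator(j) for j in range(n)]

# For a pure state with commuting generators, F_ij = 4 Cov(H_i, H_j)
F = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        mean_i = psi @ Hs[i] @ psi
        mean_j = psi @ Hs[j] @ psi
        F[i, j] = 4 * (psi @ Hs[i] @ Hs[j] @ psi - mean_i * mean_j)

print(F)                             # all-ones matrix
print(np.linalg.matrix_rank(F))      # 1: only the sum/average is estimable
```

The rank-one structure is exactly the privacy condition at work: any linear combination of parameters orthogonal to $(1,\dots,1)$ has zero Fisher information and is therefore inaccessible.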
Related papers
- Collaborative Inference over Wireless Channels with Feature Differential Privacy [57.68286389879283]
Collaborative inference among multiple wireless edge devices has the potential to significantly enhance Artificial Intelligence (AI) applications.
However, transmitting extracted features poses a significant privacy risk, as sensitive personal data can be exposed during the process.
We propose a novel privacy-preserving collaborative inference mechanism, wherein each edge device in the network secures the privacy of extracted features before transmitting them to a central server for inference.
arXiv Detail & Related papers (2024-10-25T18:11:02Z) - Private and Robust States for Distributed Quantum Sensing [1.2499537119440245]
Distributed quantum sensing enables the estimation of multiple parameters encoded in spatially separated probes.
In such settings it is natural not to want to give away more information than necessary.
We use the concept of privacy with respect to a function, ensuring that only information about the target function is available to all the parties.
arXiv Detail & Related papers (2024-07-31T15:46:50Z) - The Effect of Quantization in Federated Learning: A Rényi Differential Privacy Perspective [15.349042342071439]
Federated Learning (FL) is an emerging paradigm that holds great promise for privacy-preserving machine learning using distributed data.
To enhance privacy, FL can be combined with Differential Privacy (DP), which involves adding Gaussian noise to the model weights.
This research paper investigates the impact of quantization on privacy in FL systems.
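The clipping-plus-Gaussian-noise step mentioned above can be sketched as follows. This is a hedged illustration, not this paper's method: the function name and parameter values are hypothetical, and the noise scale uses the standard $(\epsilon,\delta)$ Gaussian-mechanism calibration $\sigma = C\sqrt{2\ln(1.25/\delta)}/\epsilon$ (valid for $\epsilon \le 1$).

```python
import numpy as np

def dp_gaussian_update(update, clip_norm, epsilon, delta, rng):
    """Clip a model update to bound its L2 sensitivity, then add
    Gaussian noise calibrated for (epsilon, delta)-DP (epsilon <= 1)."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(0.0, sigma, size=update.shape)

rng = np.random.default_rng(0)
update = np.array([3.0, 4.0])        # toy model update, L2 norm 5.0
noisy = dp_gaussian_update(update, clip_norm=1.0,
                           epsilon=0.5, delta=1e-5, rng=rng)
```

Clipping bounds the L2 sensitivity of each update, so the noise scale can be calibrated independently of the raw gradient magnitude; quantization, the subject of this paper, adds a further perturbation on top of this pipeline.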
arXiv Detail & Related papers (2024-05-16T13:50:46Z) - Differential Privacy Preserving Quantum Computing via Projection Operator Measurements [15.024190374248088]
In classical computing, we can incorporate the concept of differential privacy (DP) to meet the standard of privacy preservation.
In the quantum computing scenario, researchers have extended classical DP to quantum differential privacy (QDP) by considering the quantum noise.
We show that shot noise can effectively provide privacy protection in quantum computing.
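The shot-noise point can be illustrated with a toy simulation (illustrative only, not the paper's QDP analysis): estimating a measurement-outcome probability from a finite number of shots introduces binomial noise of standard deviation $\sqrt{p(1-p)/\text{shots}}$, which shrinks as the shot count grows. The probability value below is an assumed toy setting.

```python
import numpy as np

rng = np.random.default_rng(42)
p = 0.3                       # true measurement-outcome probability (toy value)
results = {}
for shots in (10, 100, 10_000):
    # 5000 repeated experiments, each estimating p from `shots` measurements
    estimates = rng.binomial(shots, p, size=5000) / shots
    results[shots] = estimates.std()
# Noise scales as sqrt(p*(1-p)/shots): more shots, less inherent noise,
# and hence (per the paper's argument) weaker built-in privacy protection.
```

The trade-off visible here is the one the paper formalizes: the same sampling noise that limits estimation accuracy also masks fine-grained information about the encoded data.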
arXiv Detail & Related papers (2023-12-13T15:27:26Z) - Initialization Matters: Privacy-Utility Analysis of Overparameterized Neural Networks [72.51255282371805]
We prove a privacy bound for the KL divergence between model distributions on worst-case neighboring datasets.
We find that this KL privacy bound is largely determined by the expected squared gradient norm relative to model parameters during training.
arXiv Detail & Related papers (2023-10-31T16:13:22Z) - A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns is subject to stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z) - A unifying framework for differentially private quantum algorithms [0.0]
We propose a novel and general definition of neighbouring quantum states.
We demonstrate that this definition captures the underlying structure of quantum encodings.
We also investigate an alternative setting where we are provided with multiple copies of the input state.
arXiv Detail & Related papers (2023-07-10T17:44:03Z) - Private network parameter estimation with quantum sensors [0.0]
We introduce a protocol to securely evaluate linear functions of parameters over a network of quantum sensors.
This has applications to secure networks of clocks and opens the door to more general applications of secure multiparty computing.
arXiv Detail & Related papers (2022-07-29T03:07:17Z) - Partial sensitivity analysis in differential privacy [58.730520380312676]
We investigate the impact of each input feature on the individual's privacy loss.
We experimentally evaluate our approach on queries over private databases.
We also explore our findings in the context of neural network training on synthetic data.
arXiv Detail & Related papers (2021-09-22T08:29:16Z) - Robustness Threats of Differential Privacy [70.818129585404]
We experimentally demonstrate that networks trained with differential privacy can, in some settings, be even more vulnerable than their non-private counterparts.
We study how the main ingredients of differentially private neural networks training, such as gradient clipping and noise addition, affect the robustness of the model.
arXiv Detail & Related papers (2020-12-14T18:59:24Z) - Applications of Differential Privacy in Social Network Analysis: A Survey [60.696428840516724]
Differential privacy is effective for sharing information while preserving privacy with strong guarantees.
Social network analysis has been extensively adopted in many applications, opening a new arena for the application of differential privacy.
arXiv Detail & Related papers (2020-10-06T19:06:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.