Privacy-Aware Time-Series Data Sharing with Deep Reinforcement Learning
- URL: http://arxiv.org/abs/2003.02685v2
- Date: Tue, 23 Jun 2020 11:52:41 GMT
- Title: Privacy-Aware Time-Series Data Sharing with Deep Reinforcement Learning
- Authors: Ecenaz Erdemir, Pier Luigi Dragotti and Deniz Gunduz
- Abstract summary: We study the privacy-utility trade-off (PUT) in time-series data sharing.
Methods that preserve the privacy for the current time may leak a significant amount of information at the trace level.
We consider sharing the distorted version of a user's true data sequence with an untrusted third party.
- Score: 33.42328078385098
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Internet of things (IoT) devices are becoming increasingly popular thanks to
many new services and applications they offer. However, in addition to their
many benefits, they raise privacy concerns since they share fine-grained
time-series user data with untrusted third parties. In this work, we study the
privacy-utility trade-off (PUT) in time-series data sharing. Existing
approaches to PUT mainly focus on a single data point; however, temporal
correlations in time-series data introduce new challenges. Methods that
preserve the privacy for the current time may leak a significant amount of
information at the trace level as the adversary can exploit temporal
correlations in a trace. We consider sharing the distorted version of a user's
true data sequence with an untrusted third party. We measure the privacy
leakage by the mutual information between the user's true data sequence and
the shared version. We consider both the instantaneous and average distortion
between the two sequences, under a given distortion measure, as the utility
loss metric. To tackle the history-dependent mutual information minimization,
we reformulate the problem as a Markov decision process (MDP), and solve it
using asynchronous actor-critic deep reinforcement learning (RL). We evaluate
the performance of the proposed solution in location trace privacy on both
synthetic and GeoLife GPS trajectory datasets. For the latter, we show the
validity of our solution by testing the privacy of the released location
trajectory against an adversary network.
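To make the trace-level leakage concrete, the following minimal sketch (not the paper's method; the binary-state model, stay probability, and flip probability are illustrative assumptions) compares an adversary that guesses each true sample from the released value alone against one that runs Viterbi decoding over the entire released trace of a correlated Markov source passed through a memoryless flipping mechanism.

```python
# Illustrative sketch (not the paper's method): why per-sample obfuscation can
# fail at the trace level when the adversary exploits temporal correlations.
# We simulate a binary Markov "location" trace, release each sample through a
# memoryless flipping mechanism, and compare a per-sample adversary against a
# trace-level Viterbi (HMM MAP) adversary. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
T = 2000          # trace length
p_stay = 0.95     # temporal correlation: probability the user stays in place
p_flip = 0.35     # obfuscation: probability a released sample is flipped

# --- generate a correlated true trace x and a distorted released trace y ---
x = np.zeros(T, dtype=int)
for t in range(1, T):
    x[t] = x[t - 1] if rng.random() < p_stay else 1 - x[t - 1]
y = np.where(rng.random(T) < p_flip, 1 - x, x)

# --- adversary 1: per-sample MAP guess (ignores temporal correlations) ---
# With a symmetric flip probability below 0.5, the best single-sample guess
# is simply the released value y_t.
acc_single = np.mean(y == x)

# --- adversary 2: trace-level MAP via Viterbi over the hidden Markov chain ---
A = np.array([[p_stay, 1 - p_stay], [1 - p_stay, p_stay]])   # transitions
B = np.array([[1 - p_flip, p_flip], [p_flip, 1 - p_flip]])   # emissions
log_A, log_B = np.log(A), np.log(B)

delta = np.full((T, 2), -np.inf)
psi = np.zeros((T, 2), dtype=int)
delta[0] = np.log([0.5, 0.5]) + log_B[:, y[0]]
for t in range(1, T):
    scores = delta[t - 1][:, None] + log_A          # scores[i, j]: i -> j
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) + log_B[:, y[t]]

x_hat = np.zeros(T, dtype=int)
x_hat[-1] = delta[-1].argmax()
for t in range(T - 2, -1, -1):
    x_hat[t] = psi[t + 1][x_hat[t + 1]]
acc_trace = np.mean(x_hat == x)

print(f"per-sample adversary accuracy: {acc_single:.3f}")
print(f"trace-level (Viterbi) adversary accuracy: {acc_trace:.3f}")
```

With strong temporal correlation, the trace-level adversary typically recovers substantially more of the true sequence than the per-sample accuracy suggests, which is why the release policy is optimized over the whole history via the MDP formulation above.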
Related papers
- Differentially Private Data Release on Graphs: Inefficiencies and Unfairness [48.96399034594329]
This paper characterizes the impact of Differential Privacy on bias and unfairness in the context of releasing information about networks.
We consider a network release problem where the network structure is known to all, but the weights on edges must be released privately.
Our work provides theoretical foundations and empirical evidence into the bias and unfairness arising due to privacy in these networked decision problems.
arXiv Detail & Related papers (2024-08-08T08:37:37Z)
- PeFAD: A Parameter-Efficient Federated Framework for Time Series Anomaly Detection [51.20479454379662]
We propose a Parameter-Efficient Federated Anomaly Detection framework named PeFAD, motivated by increasing privacy concerns.
We conduct extensive evaluations on four real datasets, where PeFAD outperforms existing state-of-the-art baselines by up to 28.74%.
arXiv Detail & Related papers (2024-06-04T13:51:08Z)
- RASE: Efficient Privacy-preserving Data Aggregation against Disclosure Attacks for IoTs [2.1765174838950494]
We study a new paradigm for collecting and protecting the data produced by the ever-increasing number of sensor devices.
Most previous studies on the co-design of data aggregation and privacy preservation assume that a trusted fusion center adheres to privacy regimes.
We propose a novel paradigm (called RASE), which can be generalized into a 3-step sequential procedure: noise addition, followed by random permutation, and then parameter estimation.
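The sketch below illustrates only that generic noise-add / permute / estimate structure; the Laplace noise scale and the simple mean estimator are illustrative assumptions, not RASE's actual design.

```python
# Generic illustration of a 3-step noise-add / permute / estimate pipeline,
# loosely following the structure described for RASE (not its actual design).
import numpy as np

rng = np.random.default_rng(1)
true_readings = rng.normal(loc=21.0, scale=0.5, size=1000)  # e.g. temperatures

# Step 1: each device adds local noise before reporting (scale is illustrative).
noisy = true_readings + rng.laplace(loc=0.0, scale=1.0, size=true_readings.size)

# Step 2: reports are randomly permuted, breaking the link between a report
# and the device that produced it.
shuffled = rng.permutation(noisy)

# Step 3: the aggregator estimates the population parameter from the shuffled,
# noisy reports; zero-mean noise and permutation leave the mean unbiased.
estimate = shuffled.mean()
print(f"true mean {true_readings.mean():.3f}, estimate {estimate:.3f}")
```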
arXiv Detail & Related papers (2024-05-31T15:21:38Z)
- A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z)
- TeD-SPAD: Temporal Distinctiveness for Self-supervised Privacy-preservation for video Anomaly Detection [59.04634695294402]
Video anomaly detection (VAD) without human monitoring is a complex computer vision task.
Privacy leakage in VAD allows models to pick up and amplify unnecessary biases related to people's personal information.
We propose TeD-SPAD, a privacy-aware video anomaly detection framework that destroys visual private information in a self-supervised manner.
arXiv Detail & Related papers (2023-08-21T22:42:55Z)
- Shuffled Differentially Private Federated Learning for Time Series Data Analytics [10.198481976376717]
We develop a privacy-preserving federated learning algorithm for time series data.
Specifically, we employ local differential privacy to extend the privacy protection trust boundary to the clients.
We also incorporate shuffle techniques to achieve privacy amplification, mitigating the accuracy decline caused by leveraging local differential privacy.
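A rough sketch of the local-DP-plus-shuffle idea (not the paper's algorithm): each client applies binary randomized response before reporting, the reports are shuffled, and the analyzer debiases the aggregate. The epsilon value and the binary client data below are illustrative assumptions.

```python
# Rough sketch of local DP (binary randomized response) followed by shuffling;
# the epsilon value and debiasing formula are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(2)
eps = 1.0
p_truth = np.exp(eps) / (np.exp(eps) + 1.0)    # prob. of reporting truthfully

clients = rng.integers(0, 2, size=10_000)       # each client holds one bit
reports = np.where(rng.random(clients.size) < p_truth, clients, 1 - clients)

# Shuffling removes the mapping from report to client; the analyzer only sees
# an anonymized multiset, which is what enables privacy-amplification results.
shuffled = rng.permutation(reports)

# Unbiased estimate of the true fraction of ones from randomized-response data.
raw = shuffled.mean()
est = (raw - (1 - p_truth)) / (2 * p_truth - 1)
print(f"true fraction {clients.mean():.3f}, estimate {est:.3f}")
```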
arXiv Detail & Related papers (2023-07-30T10:30:38Z)
- PMP: Privacy-Aware Matrix Profile against Sensitive Pattern Inference for Time Series [12.855499575586753]
We propose a new privacy-preserving problem: preventing malicious inference on long shape-based patterns.
We find that while the Matrix Profile (MP) can prevent concrete shape leakage, the canonical correlation in the MP index can still reveal the location of sensitive long patterns.
We propose a Privacy-Aware Matrix Profile (PMP) by perturbing the local correlation and breaking the canonical correlation in the MP index vector.
arXiv Detail & Related papers (2023-01-04T22:11:38Z)
- DP2-Pub: Differentially Private High-Dimensional Data Publication with Invariant Post Randomization [58.155151571362914]
We propose a differentially private high-dimensional data publication mechanism (DP2-Pub) that runs in two phases.
Splitting attributes into several low-dimensional clusters with high intra-cluster cohesion and low inter-cluster coupling helps obtain a reasonable privacy budget.
We also extend our DP2-Pub mechanism to the scenario with a semi-honest server which satisfies local differential privacy.
arXiv Detail & Related papers (2022-08-24T17:52:43Z)
- Active Privacy-Utility Trade-off Against Inference in Time-Series Data Sharing [29.738666406095074]
We consider a user releasing her data containing personal information in return for a service from an honest-but-curious service provider (SP).
We formulate both problems as partially observable Markov decision processes (POMDPs) and numerically solve them by advantage actor-critic (A2C) deep reinforcement learning (DRL).
We evaluate the privacy-utility trade-off (PUT) of the proposed policies on both the synthetic data and smoking activity dataset, and show their validity by testing the activity detection accuracy of the SP modeled by a long short-term memory (LSTM) neural network.
arXiv Detail & Related papers (2022-02-11T18:57:31Z)
- Deep Directed Information-Based Learning for Privacy-Preserving Smart Meter Data Release [30.409342804445306]
We study the problem in the context of time-series data and smart meter (SM) power consumption measurements.
We introduce Directed Information (DI) as a more meaningful measure of privacy in the considered setting.
Our empirical studies on real-world data sets of SM measurements show the existing trade-offs between privacy and utility in the worst-case scenario.
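For reference, the standard (Massey) definition of directed information from the user's sequence X^n to the released sequence Y^n is shown below; it is an assumption here that the paper adopts exactly this causal-conditioning convention.

```latex
% Standard (Massey) definition of directed information; it is an assumption
% here that the paper uses exactly this causal-conditioning convention.
I(X^n \to Y^n) = \sum_{i=1}^{n} I\big(X^i; Y_i \,\big|\, Y^{i-1}\big)
```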
arXiv Detail & Related papers (2020-11-20T13:41:11Z)
- Hide-and-Seek Privacy Challenge [88.49671206936259]
The NeurIPS 2020 Hide-and-Seek Privacy Challenge is a novel two-tracked competition to accelerate progress in tackling both problems.
In our head-to-head format, participants in the synthetic data generation track (i.e. "hiders") and the patient re-identification track (i.e. "seekers") are directly pitted against each other by way of a new, high-quality intensive care time-series dataset.
arXiv Detail & Related papers (2020-07-23T15:50:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.