Differential Privacy for Eye Tracking with Temporal Correlations
- URL: http://arxiv.org/abs/2002.08972v3
- Date: Mon, 20 Dec 2021 09:18:03 GMT
- Title: Differential Privacy for Eye Tracking with Temporal Correlations
- Authors: Efe Bozkir, Onur Günlü, Wolfgang Fuhl, Rafael F. Schaefer, and
Enkelejda Kasneci
- Abstract summary: New generation head-mounted displays, such as VR and AR glasses, are coming into the market with already integrated eye tracking.
Since eye movement properties contain biometric information, privacy concerns have to be handled properly.
We propose a novel transform-coding based differential privacy mechanism to further adapt it to the statistics of eye movement feature data.
- Score: 30.44437258959343
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: New generation head-mounted displays, such as VR and AR glasses, are coming
into the market with already integrated eye tracking and are expected to enable
novel ways of human-computer interaction in numerous applications. However,
since eye movement properties contain biometric information, privacy concerns
have to be handled properly. Privacy-preservation techniques such as
differential privacy mechanisms have recently been applied to eye movement data
obtained from such displays. Standard differential privacy mechanisms, however,
are vulnerable to temporal correlations between the eye movement
observations. In this work, we propose a novel transform-coding based
differential privacy mechanism to further adapt it to the statistics of eye
movement feature data and compare various low-complexity methods. We extend the
Fourier perturbation algorithm, which is a differential privacy mechanism, and
correct a scaling mistake in its proof. Furthermore, we illustrate significant
reductions in sample correlations in addition to query sensitivities, which
provide the best utility-privacy trade-off in the eye tracking literature. Our
results achieve substantially higher privacy without any significant loss in
classification accuracy while hiding personal identifiers.
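The abstract builds on the Fourier perturbation algorithm (FPA), which releases a time series under differential privacy by perturbing a truncated set of its DFT coefficients. The sketch below illustrates that basic idea only; the noise scale used here is a simplified illustrative choice and is not the corrected scaling derived in the paper, and the function name and parameters are assumptions for illustration.

```python
import numpy as np

def fourier_perturbation(x, k, epsilon, sensitivity=1.0, rng=None):
    """Illustrative sketch of a Fourier perturbation mechanism:
    keep the first k DFT coefficients of the series, add Laplace
    noise to them, and reconstruct via the inverse transform.
    The noise scale below is a simplified assumption, not the
    paper's corrected scaling.
    """
    rng = rng or np.random.default_rng()
    n = len(x)
    coeffs = np.fft.rfft(x)            # DFT of the real-valued series
    retained = coeffs[:k]              # low-frequency coefficients only
    # Illustrative Laplace scale; a rigorous mechanism derives this
    # from the L1 sensitivity of the retained coefficients.
    scale = sensitivity * np.sqrt(k * n) / epsilon
    noise = (rng.laplace(0.0, scale, size=k)
             + 1j * rng.laplace(0.0, scale, size=k))
    noisy = np.zeros_like(coeffs)      # discarded coefficients stay zero
    noisy[:k] = retained + noise
    return np.fft.irfft(noisy, n=n)    # noisy, low-pass reconstruction
```

Keeping only the first k coefficients reduces the sensitivity of the released query, which is why transform coding can improve the utility-privacy trade-off for correlated series such as eye movement features.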
Related papers
- Privacy-Preserving Gaze Data Streaming in Immersive Interactive Virtual Reality: Robustness and User Experience [11.130411904676095]
Eye tracking data, if exposed, can be used for re-identification attacks.
We develop a methodology to evaluate real-time privacy mechanisms for interactive VR applications.
arXiv Detail & Related papers (2024-02-12T14:53:12Z)
- A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Privacy-sensitive data is subject to stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
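Sanitized release of this kind typically rests on the Laplace mechanism, the canonical building block of differentially private publishing. The snippet below is a minimal sketch of that standard mechanism, not code from the paper; the function name and example data are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release a statistic under epsilon-differential privacy by
    adding Laplace noise with scale sensitivity / epsilon."""
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(0.0, sensitivity / epsilon)

# Example: privately publish a count query (sensitivity 1, since
# adding or removing one record changes the count by at most 1).
ages = np.array([23, 35, 41, 29, 52])
noisy_count = laplace_mechanism(np.sum(ages > 30), sensitivity=1, epsilon=0.5)
```

Smaller epsilon means stronger privacy but larger noise, which is the trade-off the papers in this list navigate.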
arXiv Detail & Related papers (2023-09-27T14:38:16Z)
- Diff-Privacy: Diffusion-based Face Privacy Protection [58.1021066224765]
In this paper, we propose a novel face privacy protection method based on diffusion models, dubbed Diff-Privacy.
Specifically, we train our proposed multi-scale image inversion module (MSI) to obtain a set of SDM format conditional embeddings of the original image.
Based on the conditional embeddings, we design corresponding embedding scheduling strategies and construct different energy functions during the denoising process to achieve anonymization and visual identity information hiding.
arXiv Detail & Related papers (2023-09-11T09:26:07Z)
- TeD-SPAD: Temporal Distinctiveness for Self-supervised Privacy-preservation for Video Anomaly Detection [59.04634695294402]
Video anomaly detection (VAD) without human monitoring is a complex computer vision task.
Privacy leakage in VAD allows models to pick up and amplify unnecessary biases related to people's personal information.
We propose TeD-SPAD, a privacy-aware video anomaly detection framework that destroys visual private information in a self-supervised manner.
arXiv Detail & Related papers (2023-08-21T22:42:55Z)
- Multimodal Adaptive Fusion of Face and Gait Features using Keyless Attention based Deep Neural Networks for Human Identification [67.64124512185087]
Soft biometrics such as gait are widely used with face in surveillance tasks like person recognition and re-identification.
We propose a novel adaptive multi-biometric fusion strategy for the dynamic incorporation of gait and face biometric cues by leveraging keyless attention deep neural networks.
arXiv Detail & Related papers (2023-03-24T05:28:35Z)
- Disguise without Disruption: Utility-Preserving Face De-Identification [40.484745636190034]
We introduce Disguise, a novel algorithm that seamlessly de-identifies facial images while ensuring the usability of the modified data.
Our method involves extracting and substituting depicted identities with synthetic ones, generated using variational mechanisms to maximize obfuscation and non-invertibility.
We extensively evaluate our method using multiple datasets, demonstrating a higher de-identification rate and superior consistency compared to prior approaches in various downstream tasks.
arXiv Detail & Related papers (2023-03-23T13:50:46Z)
- On the Privacy Effect of Data Enhancement via the Lens of Memorization [20.63044895680223]
We propose to investigate privacy from a new perspective called memorization.
Through the lens of memorization, we find that previously deployed MIAs produce misleading results as they are less likely to identify samples with higher privacy risks.
We demonstrate that the generalization gap and privacy leakage are less correlated than those of the previous results.
arXiv Detail & Related papers (2022-08-17T13:02:17Z)
- Privacy-Preserving Face Recognition with Learnable Privacy Budgets in Frequency Domain [77.8858706250075]
This paper proposes a privacy-preserving face recognition method using differential privacy in the frequency domain.
Our method performs very well with several classical face recognition test sets.
arXiv Detail & Related papers (2022-07-15T07:15:36Z)
- Assessing Differentially Private Variational Autoencoders under Membership Inference [26.480694390462617]
We quantify and compare the privacy-accuracy trade-off for differentially private Variational Autoencoders.
We rarely observe a favorable privacy-accuracy trade-off for Variational Autoencoders, and identify a case where LDP outperforms CDP.
arXiv Detail & Related papers (2022-04-16T21:53:09Z)
- Robustness Threats of Differential Privacy [70.818129585404]
We experimentally demonstrate that networks, trained with differential privacy, in some settings might be even more vulnerable in comparison to non-private versions.
We study how the main ingredients of differentially private neural networks training, such as gradient clipping and noise addition, affect the robustness of the model.
arXiv Detail & Related papers (2020-12-14T18:59:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences.