Privacy-Preserving Gaze Data Streaming in Immersive Interactive Virtual Reality: Robustness and User Experience
- URL: http://arxiv.org/abs/2402.07687v2
- Date: Wed, 21 Feb 2024 15:46:16 GMT
- Title: Privacy-Preserving Gaze Data Streaming in Immersive Interactive Virtual Reality: Robustness and User Experience
- Authors: Ethan Wilson, Azim Ibragimov, Michael J. Proulx, Sai Deep Tetali, Kevin Butler, Eakta Jain
- Abstract summary: Eye tracking data, if exposed, can be used for re-identification attacks.
We develop a methodology to evaluate real-time privacy mechanisms for interactive VR applications.
- Score: 11.130411904676095
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Eye tracking is routinely being incorporated into virtual reality (VR) systems. Prior research has shown that eye tracking data, if exposed, can be used for re-identification attacks. The state of our knowledge about currently existing privacy mechanisms is limited to privacy-utility trade-off curves based on data-centric metrics of utility, such as prediction error, and black-box threat models. We propose that for interactive VR applications, it is essential to consider user-centric notions of utility and a variety of threat models. We develop a methodology to evaluate real-time privacy mechanisms for interactive VR applications that incorporate subjective user experience and task performance metrics. We evaluate selected privacy mechanisms using this methodology and find that re-identification accuracy can be decreased to as low as 14% while maintaining a high usability score and reasonable task performance. Finally, we elucidate three threat scenarios (black-box, black-box with exemplars, and white-box) and assess how well the different privacy mechanisms hold up to these adversarial scenarios. This work advances the state of the art in VR privacy by providing a methodology for end-to-end assessment of the risk of re-identification attacks and potential mitigating solutions.
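The abstract above concerns real-time privacy mechanisms applied to streamed gaze data. As a point of reference only, the sketch below shows one simple mechanism of that kind: additive Gaussian angular noise applied to each gaze sample before it leaves the eye tracker. It is not one of the mechanisms evaluated in the paper; the function name and the `sigma_deg` parameter are illustrative assumptions.

```python
# Minimal sketch (not the paper's evaluated mechanisms) of a real-time gaze
# privacy mechanism: add Gaussian angular noise to each gaze sample before
# streaming it to the application. sigma_deg is an assumed parameter trading
# off re-identification risk against task performance and usability.
import numpy as np

def perturb_gaze_sample(gaze_dir, sigma_deg=1.0, rng=None):
    """Add angular Gaussian noise to a 3-D unit gaze direction vector."""
    rng = rng or np.random.default_rng()
    noise = rng.normal(0.0, np.deg2rad(sigma_deg), size=3)
    noisy = gaze_dir + noise               # small-angle approximation
    return noisy / np.linalg.norm(noisy)   # re-normalize to a unit vector

# Streaming loop sketch: perturb each sample before handing it to the app.
# for sample in eye_tracker.stream():
#     app.send(perturb_gaze_sample(sample.direction))
```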
Related papers
- Privacy-Preserving Video Anomaly Detection: A Survey [10.899433437231139]
Video Anomaly Detection (VAD) aims to automatically analyze patterns in surveillance videos collected from open spaces to detect potentially harmful anomalous events without physical contact.
The lack of transparency in video transmission and usage raises public concerns about privacy and ethics, limiting the real-world application of VAD.
Recently, researchers have focused on privacy concerns in VAD by conducting systematic studies from various perspectives including data, features, and systems.
This article systematically reviews progress in privacy-preserving VAD (P2VAD) for the first time, defining its scope and providing an intuitive taxonomy.
arXiv Detail & Related papers (2024-11-21T20:29:59Z) - Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z) - A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and sharing.
Overcoming these obstacles is key to technological progress in many real-world application scenarios that involve privacy-sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z) - TeD-SPAD: Temporal Distinctiveness for Self-supervised Privacy-Preservation for Video Anomaly Detection [59.04634695294402]
Video anomaly detection (VAD) without human monitoring is a complex computer vision task.
Privacy leakage in VAD allows models to pick up and amplify unnecessary biases related to people's personal information.
We propose TeD-SPAD, a privacy-aware video anomaly detection framework that destroys visual private information in a self-supervised manner.
arXiv Detail & Related papers (2023-08-21T22:42:55Z) - Your Room is not Private: Gradient Inversion Attack on Reinforcement Learning [47.96266341738642]
Privacy emerges as a pivotal concern within the realm of embodied AI, as robots access substantial personal information.
This paper proposes an attack on the value-based algorithm and the gradient-based algorithm, utilizing gradient inversion to reconstruct states, actions, and supervision signals.
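Gradient inversion is easiest to see on a single fully connected layer, where the private input can be recovered in closed form from the shared gradients. The sketch below is a textbook illustration of that principle, not the paper's attack on value-based or gradient-based RL algorithms.

```python
# Minimal, self-contained illustration of gradient inversion on a single
# fully connected layer y = W x + b. For any row i with a nonzero bias
# gradient, dL/dW[i, :] = dL/db[i] * x, so the private input x can be
# recovered exactly from the gradients a client would share.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 8, 4

x = rng.normal(size=n_in)               # private input (e.g., an observation)
W = rng.normal(size=(n_out, n_in))
b = rng.normal(size=n_out)
target = rng.normal(size=n_out)

y = W @ x + b                           # forward pass
g = 2.0 * (y - target)                  # dL/dy for L = ||y - target||^2

grad_W = np.outer(g, x)                 # dL/dW = g x^T  (shared gradient)
grad_b = g                              # dL/db = g      (shared gradient)

i = int(np.argmax(np.abs(grad_b)))      # pick a row with a nonzero bias gradient
x_reconstructed = grad_W[i] / grad_b[i]

print(np.allclose(x, x_reconstructed))  # True: the private input leaks from gradients
```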
arXiv Detail & Related papers (2023-06-15T16:53:26Z) - Eye-tracked Virtual Reality: A Comprehensive Survey on Methods and Privacy Challenges [33.50215933003216]
This survey focuses on eye tracking in virtual reality (VR) and the privacy implications of the data it produces.
We first cover major works in eye tracking, VR, and privacy areas between the years 2012 and 2022.
We focus on eye-based authentication as well as computational methods to preserve the privacy of individuals and their eye-tracking data in VR.
arXiv Detail & Related papers (2023-05-23T14:02:38Z) - Towards Zero-trust Security for the Metaverse [14.115124942695887]
We develop a holistic research agenda for zero-trust user authentication in social virtual reality (VR).
Our proposed research includes four concrete steps: investigating biometrics-based authentication that is suitable for continuously authenticating VR users, leveraging federated learning for protecting user privacy in biometric data, improving the accuracy of continuous VR authentication with multimodal data, and boosting the usability of zero-trust security with adaptive VR authentication.
arXiv Detail & Related papers (2023-02-17T14:13:02Z) - OPOM: Customized Invisible Cloak towards Face Privacy Protection [58.07786010689529]
We investigate face privacy protection from a technical standpoint, based on a new type of customized cloak.
We propose a new method, named one person one mask (OPOM), to generate person-specific (class-wise) universal masks.
The effectiveness of the proposed method is evaluated on both common and celebrity datasets.
arXiv Detail & Related papers (2022-05-24T11:29:37Z) - Robustness Threats of Differential Privacy [70.818129585404]
We experimentally demonstrate that networks trained with differential privacy can, in some settings, be even more vulnerable than their non-private counterparts.
We study how the main ingredients of differentially private neural network training, such as gradient clipping and noise addition, affect the robustness of the model.
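For reference, the sketch below shows the two ingredients named above, per-example gradient clipping and Gaussian noise addition, as they appear in a generic DP-SGD step. The function name and the `clip_norm`/`noise_multiplier` parameters are illustrative assumptions, and the privacy accounting needed for a formal (epsilon, delta) guarantee is omitted.

```python
# Generic sketch of a DP-SGD aggregation step: clip each per-example
# gradient to a fixed norm, sum, add Gaussian noise, and average.
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """per_example_grads: array of shape (batch_size, num_params).

    Returns a noisy average gradient; composing such steps with a privacy
    accountant (not shown) yields an (epsilon, delta)-DP training run.
    """
    rng = rng or np.random.default_rng()
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_example_grads * scale          # each row now has norm <= clip_norm
    summed = clipped.sum(axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(per_example_grads)

# Example: a batch of 32 per-example gradients over 10 parameters
grads = np.random.default_rng(1).normal(size=(32, 10))
noisy_grad = dp_sgd_step(grads)
```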
arXiv Detail & Related papers (2020-12-14T18:59:24Z) - Differential Privacy for Eye Tracking with Temporal Correlations [30.44437258959343]
New-generation head-mounted displays, such as VR and AR glasses, are coming to market with eye tracking already integrated.
Since eye movement properties contain biometric information, privacy concerns have to be handled properly.
We propose a novel transform-coding-based differential privacy mechanism adapted to the statistics of eye-movement feature data.
arXiv Detail & Related papers (2020-02-20T19:01:34Z)
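As a rough illustration of the transform-coding idea from the entry above (not the paper's exact mechanism), the sketch below perturbs an eye-movement feature sequence in the DCT domain: keep the first k coefficients, add Laplace noise calibrated to an assumed sensitivity, and invert the transform. The epsilon, k, and sensitivity values are placeholder assumptions.

```python
# Generic transform-coding DP sketch for a 1-D eye-movement feature signal:
# DCT, perturb the first k coefficients with Laplace noise, inverse DCT.
import numpy as np
from scipy.fft import dct, idct

def transform_coding_dp(signal, epsilon=1.0, k=16, sensitivity=1.0, rng=None):
    """Perturb a 1-D feature sequence in the DCT domain."""
    rng = rng or np.random.default_rng()
    coeffs = dct(signal, norm="ortho")
    # Keeping only k low-frequency coefficients lowers the noise needed for a
    # given epsilon, at the cost of smoothing the reconstructed signal.
    noise = rng.laplace(0.0, sensitivity * k / epsilon, size=k)
    noisy = np.zeros_like(coeffs)
    noisy[:k] = coeffs[:k] + noise
    return idct(noisy, norm="ortho")

# Example: a 256-sample eye-movement feature trace
trace = np.sin(np.linspace(0, 8 * np.pi, 256)) \
    + 0.1 * np.random.default_rng(2).normal(size=256)
private_trace = transform_coding_dp(trace, epsilon=0.5)
```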