GaitGuard: Towards Private Gait in Mixed Reality
- URL: http://arxiv.org/abs/2312.04470v3
- Date: Tue, 4 Jun 2024 05:13:26 GMT
- Title: GaitGuard: Towards Private Gait in Mixed Reality
- Authors: Diana Romero, Ruchi Jagdish Patel, Athina Markopoulou, Salma Elmalaki
- Abstract summary: GaitGuard is the first real-time framework designed to protect the privacy of gait features within the camera view of AR/MR devices.
GaitGuard reduces the risk of identification by up to 68%, while maintaining a minimal latency of merely 118.77 ms.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Augmented/Mixed Reality (AR/MR) technologies offer a new era of immersive, collaborative experiences, distinctively setting them apart from conventional mobile systems. However, as we further investigate the privacy and security implications of these environments, the issue of gait privacy emerges as a critical yet underexplored concern. Given its uniqueness as a biometric identifier that can be correlated with several sensitive attributes, protecting gait information is crucial to preventing identity tracking and unauthorized profiling within these systems. In this paper, we conduct a user study with 20 participants to assess the risk of individual identification through gait features extracted from video feeds captured by MR devices. Our results show that individuals can be uniquely identified with an accuracy of up to 92%, underscoring an urgent need for effective gait privacy protection. Through rigorous evaluation, we present a comparative analysis of various mitigation techniques, addressing both aware and unaware adversaries, in terms of their utility and impact on privacy preservation. From these evaluations, we introduce GaitGuard, the first real-time framework designed to protect the privacy of gait features within the camera view of AR/MR devices. Our evaluation of GaitGuard in an MR collaborative scenario demonstrates that its mitigation reduces the risk of identification by up to 68% while maintaining a minimal latency of merely 118.77 ms, marking a critical step forward in safeguarding privacy within AR/MR ecosystems.
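The abstract does not specify GaitGuard's exact mitigation, but the general idea of perturbing gait cues in a camera feed can be sketched generically. The following is a minimal illustrative sketch, not GaitGuard's actual method: it adds Gaussian noise to the lower-body region of a frame (the `top` row index and noise scale are assumptions chosen for illustration).

```python
import numpy as np

def perturb_gait_region(frame: np.ndarray, top: int, sigma: float = 25.0,
                        rng=None) -> np.ndarray:
    """Add Gaussian noise to the lower-body region of a video frame.

    `top` is the row where the lower-body crop begins; everything below it
    is noised so gait cues (leg and step geometry) are harder to extract.
    The upper part of the frame is left untouched to preserve utility.
    """
    rng = rng or np.random.default_rng(0)
    out = frame.astype(np.float64).copy()
    out[top:] += rng.normal(0.0, sigma, size=out[top:].shape)
    return np.clip(out, 0, 255).astype(np.uint8)
```

In a real pipeline the region would come from a person/pose detector rather than a fixed row index, and the perturbation would need to be fast enough to meet the paper's ~119 ms latency budget.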
Related papers
- Differentially Private Integrated Decision Gradients (IDG-DP) for Radar-based Human Activity Recognition [5.955900146668931]
Recent research has shown high accuracy in recognizing subjects or gender from radar gait patterns, raising privacy concerns.
This study addresses these issues by investigating privacy vulnerabilities in radar-based Human Activity Recognition (HAR) systems.
We propose a novel method for privacy preservation using Differential Privacy (DP) driven by attributions derived with Integrated Decision Gradient (IDG) algorithm.
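The attribution-driven DP idea can be sketched as Laplace noise whose per-feature scale is weighted by an attribution vector. This is an illustrative sketch only: the weighting below is an assumption, whereas the paper derives its attributions from the Integrated Decision Gradient algorithm.

```python
import numpy as np

def attribution_guided_laplace(x, attributions, epsilon=1.0,
                               sensitivity=1.0, rng=None):
    """Add attribution-weighted Laplace noise to a feature vector.

    Low-attribution features receive more noise (weight near 1.5),
    high-attribution features less (weight near 0.5), so utility-critical
    features are perturbed more gently. Weighting scheme is illustrative.
    """
    rng = rng or np.random.default_rng(0)
    a = np.asarray(attributions, dtype=float)
    # Normalize attributions to [0, 1]; w = 0 for the most important feature.
    w = 1.0 - (a - a.min()) / (a.max() - a.min() + 1e-12)
    base = sensitivity / epsilon  # standard Laplace mechanism scale
    noise = rng.laplace(0.0, 1.0, size=a.shape) * base * (0.5 + w)
    return np.asarray(x, dtype=float) + noise
```

Note that non-uniform noise scales change the formal privacy accounting; a rigorous guarantee requires analyzing the effective sensitivity per feature, which the paper addresses and this sketch does not.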
arXiv Detail & Related papers (2024-11-04T14:08:26Z)
- Collaborative Inference over Wireless Channels with Feature Differential Privacy [57.68286389879283]
Collaborative inference among multiple wireless edge devices has the potential to significantly enhance Artificial Intelligence (AI) applications.
However, transmitting extracted features poses a significant privacy risk, as sensitive personal data can be exposed in the process.
We propose a novel privacy-preserving collaborative inference mechanism, wherein each edge device in the network secures the privacy of extracted features before transmitting them to a central server for inference.
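The device-side step of such a scheme can be sketched with a generic Gaussian mechanism: clip the feature vector to a bounded norm, then add noise before it leaves the edge device. This is a hedged sketch of the general pattern, not the paper's exact mechanism; `clip_norm` and `sigma` are illustrative parameters.

```python
import numpy as np

def privatize_features(features, clip_norm=1.0, sigma=0.5, rng=None):
    """Clip a feature vector to a bounded L2 norm, then add Gaussian noise.

    Clipping bounds the sensitivity of the released vector, which is what
    lets the added Gaussian noise carry a formal privacy interpretation.
    """
    rng = rng or np.random.default_rng(7)
    f = np.asarray(features, dtype=float)
    norm = np.linalg.norm(f)
    if norm > clip_norm:
        f = f * (clip_norm / norm)  # project onto the L2 ball
    return f + rng.normal(0.0, sigma, size=f.shape)
```

The server then runs inference on the noised features; the utility-privacy trade-off is governed by how much task-relevant signal survives the clipping and noise.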
arXiv Detail & Related papers (2024-10-25T18:11:02Z)
- Privacy-Preserving Gaze Data Streaming in Immersive Interactive Virtual Reality: Robustness and User Experience [11.130411904676095]
Eye tracking data, if exposed, can be used for re-identification attacks.
We develop a methodology to evaluate real-time privacy mechanisms for interactive VR applications.
arXiv Detail & Related papers (2024-02-12T14:53:12Z)
- Protect Your Score: Contact Tracing With Differential Privacy Guarantees [68.53998103087508]
We argue that privacy concerns currently hold deployment back.
We propose a contact tracing algorithm with differential privacy guarantees against this attack.
Especially for realistic test scenarios, we achieve a two to ten-fold reduction in the infection rate of the virus.
arXiv Detail & Related papers (2023-12-18T11:16:33Z)
- Diff-Privacy: Diffusion-based Face Privacy Protection [58.1021066224765]
In this paper, we propose a novel face privacy protection method based on diffusion models, dubbed Diff-Privacy.
Specifically, we train our proposed multi-scale image inversion module (MSI) to obtain a set of SDM format conditional embeddings of the original image.
Based on the conditional embeddings, we design corresponding embedding scheduling strategies and construct different energy functions during the denoising process to achieve anonymization and visual identity information hiding.
arXiv Detail & Related papers (2023-09-11T09:26:07Z)
- A Randomized Approach for Tight Privacy Accounting [63.67296945525791]
We propose a new differential privacy paradigm called estimate-verify-release (EVR).
The EVR paradigm first estimates the privacy parameter of a mechanism, then verifies whether it meets this guarantee, and finally releases the query output.
Our empirical evaluation shows the newly proposed EVR paradigm improves the utility-privacy tradeoff for privacy-preserving machine learning.
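The three-step control flow of EVR can be sketched with hypothetical interfaces (the function names and signatures below are illustrative assumptions, not the paper's API):

```python
def estimate_verify_release(mechanism, query, estimate_eps, verify):
    """Sketch of the EVR control flow.

    1. Estimate the mechanism's privacy parameter.
    2. Verify that the estimate actually holds for the mechanism.
    3. Release the query output only if verification passes.
    """
    eps_hat = estimate_eps(mechanism)      # step 1: estimate
    if not verify(mechanism, eps_hat):     # step 2: verify
        raise RuntimeError("privacy estimate failed verification; output withheld")
    return mechanism(query)                # step 3: release
```

The point of the paradigm is that estimation can be cheap and slightly optimistic, because the verification step catches cases where the estimated guarantee does not hold before anything is released.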
arXiv Detail & Related papers (2023-04-17T00:38:01Z)
- On the Privacy Effect of Data Enhancement via the Lens of Memorization [20.63044895680223]
We propose to investigate privacy from a new perspective called memorization.
Through the lens of memorization, we find that previously deployed membership inference attacks (MIAs) produce misleading results, as they are less likely to identify the samples with higher privacy risks.
We demonstrate that the generalization gap and privacy leakage are less correlated than previously reported.
arXiv Detail & Related papers (2022-08-17T13:02:17Z)
- Privacy-Aware Adversarial Network in Human Mobility Prediction [11.387235721659378]
User re-identification and other sensitive inferences are major privacy threats when geolocated data are shared with cloud-assisted applications.
We propose an LSTM-based adversarial representation learning to attain a privacy-preserving feature representation of the original geolocated data.
We show that the privacy of mobility traces attains decent protection at the cost of marginal mobility utility.
arXiv Detail & Related papers (2022-08-09T19:23:13Z)
- PrivHAR: Recognizing Human Actions From Privacy-preserving Lens [58.23806385216332]
We propose an optimization framework to provide robust visual privacy protection along the human action recognition pipeline.
Our framework parameterizes the camera lens to successfully degrade the quality of the videos to inhibit privacy attributes and protect against adversarial attacks.
arXiv Detail & Related papers (2022-06-08T13:43:29Z)
- Partial sensitivity analysis in differential privacy [58.730520380312676]
We investigate the impact of each input feature on the individual's privacy loss.
We experimentally evaluate our approach on queries over private databases.
We also explore our findings in the context of neural network training on synthetic data.
arXiv Detail & Related papers (2021-09-22T08:29:16Z)
- Privacy for Rescue: A New Testimony Why Privacy is Vulnerable In Deep Models [6.902994369582068]
We present a formal definition of the privacy protection problem in the edge-cloud system running models.
We analyze the state-of-the-art methods and point out their drawbacks.
We propose two new metrics that are more accurate to measure the effectiveness of privacy protection methods.
arXiv Detail & Related papers (2019-12-31T15:55:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.