Eye-tracked Virtual Reality: A Comprehensive Survey on Methods and
Privacy Challenges
- URL: http://arxiv.org/abs/2305.14080v1
- Date: Tue, 23 May 2023 14:02:38 GMT
- Title: Eye-tracked Virtual Reality: A Comprehensive Survey on Methods and
Privacy Challenges
- Authors: Efe Bozkir and Süleyman Özdel and Mengdi Wang and Brendan
David-John and Hong Gao and Kevin Butler and Eakta Jain and Enkelejda Kasneci
- Abstract summary: This survey focuses on eye tracking in virtual reality (VR) and the privacy implications of its use.
We first cover major works in eye tracking, VR, and privacy areas between the years 2012 and 2022.
We focus on eye-based authentication as well as computational methods to preserve the privacy of individuals and their eye-tracking data in VR.
- Score: 33.50215933003216
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Latest developments in computer hardware, sensor technologies, and artificial
intelligence can make virtual reality (VR) and virtual spaces an important part
of human everyday life. Eye tracking offers not only a hands-free way of
interaction but also the possibility of a deeper understanding of human visual
attention and cognitive processes in VR. Despite these possibilities,
eye-tracking data also reveal privacy-sensitive attributes of users when
combined with information about the presented stimulus. To address both
these possibilities and the potential privacy issues, this survey first
covers major works in eye tracking, VR, and privacy published between 2012
and 2022. The eye-tracking part covers the complete pipeline of eye-tracking
methodology, from pupil detection and gaze estimation to offline use and
analyses, while the privacy and security part focuses on eye-based
authentication as well as computational methods to preserve the privacy of
individuals and their eye-tracking data in VR. Later, taking all into
consideration, we draw three main directions for the research community by
mainly focusing on privacy challenges. In summary, this survey provides an
extensive literature review of the utmost possibilities with eye tracking in VR
and the privacy implications of those possibilities.
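One family of computational privacy methods the survey covers is perturbing the gaze signal before it leaves the device, so downstream applications still get usable input while privacy-sensitive patterns are obscured. The sketch below is a minimal illustration of that idea only, not a mechanism from the survey: it adds zero-mean Gaussian noise to each (x, y) gaze sample, with `sigma` as a hypothetical noise scale (real mechanisms calibrate the noise to a formal privacy budget).

```python
import random

def privatize_gaze(samples, sigma=0.5):
    """Add zero-mean Gaussian noise to each (x, y) gaze sample.

    A larger sigma obscures individual gaze patterns more strongly,
    at the cost of signal utility for interaction and analysis.
    """
    return [(x + random.gauss(0.0, sigma), y + random.gauss(0.0, sigma))
            for x, y in samples]

# Toy gaze trace in degrees of visual angle (illustrative values).
raw = [(1.2, -0.5), (1.3, -0.4), (1.5, -0.1)]
noisy = privatize_gaze(raw, sigma=0.5)
```

The utility/privacy trade-off is the central design choice: with `sigma=0.0` the output is identical to the input, and increasing `sigma` degrades both re-identification attacks and legitimate gaze-based interaction.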
Related papers
- An Empirical Study on Oculus Virtual Reality Applications: Security and
Privacy Perspectives [46.995904896724994]
This paper develops a security and privacy assessment tool, namely the VR-SP detector for VR apps.
Using the VR-SP detector, we conduct a comprehensive empirical study on 500 popular VR apps.
We find that a number of security vulnerabilities and privacy leaks widely exist in VR apps.
arXiv Detail & Related papers (2024-02-21T13:53:25Z)
- Privacy-Preserving Gaze Data Streaming in Immersive Interactive Virtual Reality: Robustness and User Experience [11.130411904676095]
Eye tracking data, if exposed, can be used for re-identification attacks.
We develop a methodology to evaluate real-time privacy mechanisms for interactive VR applications.
arXiv Detail & Related papers (2024-02-12T14:53:12Z)
- Computer Vision for Primate Behavior Analysis in the Wild [61.08941894580172]
Video-based behavioral monitoring has great potential for transforming how we study animal cognition and behavior.
There is still a fairly large gap between the exciting prospects and what can actually be achieved in practice today.
arXiv Detail & Related papers (2024-01-29T18:59:56Z)
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- 3D Gaze Vis: Sharing Eye Tracking Data Visualization for Collaborative Work in VR Environment [3.3130410344903325]
We designed three different eye tracking data visualizations: gaze cursor, gaze spotlight and gaze trajectory in VR scene for a course of human heart.
We found that gaze cursor from doctors could help students learn complex 3D heart models more effectively.
It indicated that sharing eye tracking data visualization could improve the quality and efficiency of collaborative work in the VR environment.
arXiv Detail & Related papers (2023-03-19T12:00:53Z)
- Unique Identification of 50,000+ Virtual Reality Users from Head & Hand Motion Data [58.27542320038834]
We show that a large number of real VR users can be uniquely and reliably identified across multiple sessions using just their head and hand motion.
After training a classification model on 5 minutes of data per person, a user can be uniquely identified amongst the entire pool of 50,000+ with 94.33% accuracy from 100 seconds of motion.
This work is the first to truly demonstrate the extent to which biomechanics may serve as a unique identifier in VR, on par with widely used biometrics such as facial or fingerprint recognition.
arXiv Detail & Related papers (2023-02-17T15:05:18Z)
- Privacy concerns from variances in spatial navigability in VR [0.0]
Current Virtual Reality (VR) input devices make it possible to navigate a virtual environment and record immersive, personalized data regarding the user's movement and specific behavioral habits.
In this article, the authors propose to investigate machine-learning algorithms that learn cooperatively with human users and can be used to counter existing privacy concerns in VR.
arXiv Detail & Related papers (2023-02-06T01:48:59Z)
- Towards Everyday Virtual Reality through Eye Tracking [1.2691047660244335]
Eye tracking is an emerging technology that helps to assess human behavior in a real-time and non-intrusive way.
A significant scientific push towards everyday virtual reality has been completed with three main research contributions.
arXiv Detail & Related papers (2022-03-29T16:09:37Z)
- Do Pedestrians Pay Attention? Eye Contact Detection in the Wild [75.54077277681353]
In urban environments, humans rely on eye contact for fast and efficient communication with nearby people.
In this paper, we focus on eye contact detection in the wild, i.e., real-world scenarios for autonomous vehicles with no control over the environment or the distance of pedestrians.
We introduce a model that leverages semantic keypoints to detect eye contact and show that this high-level representation achieves state-of-the-art results on the publicly-available dataset JAAD.
To study domain adaptation, we create LOOK: a large-scale dataset for eye contact detection in the wild, which focuses on diverse and un
arXiv Detail & Related papers (2021-12-08T10:21:28Z)
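Several of the papers above show that VR motion telemetry is uniquely identifying, with one reporting 94.33% accuracy over a pool of 50,000+ users. The core attack can be illustrated with a toy sketch: match an observed feature vector against enrolled per-user profiles. All names, features, and values below are hypothetical, and the nearest-centroid match is a deliberately simple stand-in for the learned classifiers those papers actually use.

```python
import math

def nearest_user(profiles, sample):
    """Match a motion-feature vector to the closest enrolled user
    by Euclidean distance (toy stand-in for a trained classifier)."""
    return min(profiles, key=lambda uid: math.dist(profiles[uid], sample))

# Hypothetical per-user feature averages:
# (head height in m, hand speed in m/s, arm span in m)
profiles = {
    "alice": [1.62, 0.8, 1.55],
    "bob":   [1.80, 1.1, 1.78],
}

observed = [1.79, 1.0, 1.80]  # telemetry from an "anonymous" session
match = nearest_user(profiles, observed)  # → "bob"
```

Even this crude matcher separates the two toy users cleanly, which is why the defensive work above (noise injection, motion masking) targets the telemetry itself rather than relying on account-level anonymity.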
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.