The Detection of Saccadic Eye Movements and Per-Eye Comparisons using Virtual Reality Eye Tracking Devices
- URL: http://arxiv.org/abs/2503.08926v1
- Date: Tue, 11 Mar 2025 22:15:39 GMT
- Title: The Detection of Saccadic Eye Movements and Per-Eye Comparisons using Virtual Reality Eye Tracking Devices
- Authors: Teran Bukenberger, Brent Davis
- Abstract summary: The study involves VR eye tracking technology and neuroscience with respect to saccadic eye movements. It is anticipated that the software will be able to accurately detect when saccades occur and analyze the differences in saccadic eye movements per-eye.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Eye tracking has been found useful in various tasks, including diagnostic and screening tools. However, traditional eye trackers required a complicated setup and operated at high sampling frequencies to measure eye movements. Using more commonly available eye trackers, such as those built into head-mounted virtual reality (VR) headsets, greatly expands their utility for research and analytical purposes. This study focuses on detecting saccades, a common task when analyzing eye tracking data that is not yet well-established for VR headset-mounted eye trackers. The aim is to determine how accurately saccadic eye movements can be detected using an eye tracker that operates at 60 or 90 Hz. The study draws on VR eye tracking technology and the neuroscience of saccadic eye movements. The goal is to build prototype software, implemented with VR eye tracking technology, that detects saccadic eye movements and per-eye differences in an individual. It is anticipated that the software will accurately detect when saccades occur and analyze per-eye differences in saccadic eye movements. The field of VR eye tracking software is still developing rapidly, particularly in its applications to neuroscience. Since previous eye tracking methods required specialized equipment, using commercially available consumer VR eye tracking technology to detect saccades and per-eye differences would be novel. This project will impact the field of neuroscience by providing a tool that can detect saccadic eye movements and support screening for neurological and neurodegenerative disorders. However, the project is limited by its short time frame and by the fact that the eye tracker used in this study operates at a maximum frequency of 90 Hz.
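Saccade detection from gaze samples is most commonly done with a velocity threshold (I-VT classification). The following is a minimal sketch of that approach, not the authors' actual implementation; the function name, the 30 deg/s threshold, and the assumption of per-sample gaze angles in degrees are all illustrative choices.

```python
import numpy as np

def detect_saccades(gaze_deg, fs=90.0, vel_threshold=30.0):
    """Flag saccade samples in a gaze trace via velocity thresholding (I-VT).

    gaze_deg: (N, 2) array of horizontal/vertical gaze angles in degrees.
    fs: sampling rate in Hz (60 or 90 for the headsets discussed above).
    vel_threshold: angular speed in deg/s above which a sample counts as a
        saccade; 30 deg/s is a common default, not a value from the paper.
    """
    gaze = np.asarray(gaze_deg, dtype=float)
    # Angular speed between consecutive samples, converted to deg/s.
    velocity = np.linalg.norm(np.diff(gaze, axis=0), axis=1) * fs
    # Pad the first sample so the mask aligns with the input trace.
    velocity = np.concatenate([[0.0], velocity])
    return velocity > vel_threshold

# Example: a fixation with one abrupt 10-degree gaze shift at sample 5.
gaze = np.zeros((10, 2))
gaze[5:] = [10.0, 0.0]
mask = detect_saccades(gaze, fs=90.0)  # only sample 5 is flagged
```

At 90 Hz a sample spans about 11 ms, so a short saccade (roughly 20-40 ms) contributes only two to four samples, which is why the low sampling rate is cited as a limitation above. Running the same detector per eye and comparing the resulting masks would give the per-eye differences the study targets.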
Related papers
- Exploring Eye Tracking to Detect Cognitive Load in Complex Virtual Reality Training [11.83314968015781]
We present an ongoing study to detect users' cognitive load using an eye-tracking-based machine learning approach.
We developed a VR training system for cold spray and tested it with 22 participants.
Preliminary analysis demonstrates the feasibility of using eye-tracking to detect cognitive load in complex VR experiences.
arXiv Detail & Related papers (2024-11-18T16:44:19Z) - EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing [2.9795443606634917]
EyeTrAES is a novel approach using neuromorphic event cameras for high-fidelity tracking of natural pupillary movement.
We show that EyeTrAES boosts pupil tracking fidelity by 6+%, achieving IoU=92%, while incurring at least 3x lower latency than competing pure event-based eye tracking alternatives.
For robust user authentication, we train a lightweight per-user Random Forest classifier using a novel feature vector of short-term pupillary kinematics.
arXiv Detail & Related papers (2024-09-27T15:06:05Z) - A Framework for Pupil Tracking with Event Cameras [1.708806485130162]
Saccades are extremely rapid movements of both eyes that occur simultaneously.
The peak angular speed of the eye during a saccade can reach as high as 700 deg/s in humans.
We present events as frames that can be readily utilized by standard deep learning algorithms.
arXiv Detail & Related papers (2024-07-23T17:32:02Z) - Neural feels with neural fields: Visuo-tactile perception for in-hand
manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
arXiv Detail & Related papers (2023-12-20T22:36:37Z) - Open Gaze: Open Source eye tracker for smartphone devices using Deep Learning [0.0]
We present an open-source implementation of a smartphone-based gaze tracker that emulates the methodology proposed by a GooglePaper.
Through the integration of machine learning techniques, we unveil an accurate eye tracking solution that is native to smartphones.
Our findings exhibit the inherent potential to amplify eye movement research by significant proportions.
arXiv Detail & Related papers (2023-08-25T17:10:22Z) - QuestSim: Human Motion Tracking from Sparse Sensors with Simulated
Avatars [80.05743236282564]
Real-time tracking of human body motion is crucial for immersive experiences in AR/VR.
We present a reinforcement learning framework that takes in sparse signals from an HMD and two controllers.
We show that a single policy can be robust to diverse locomotion styles, different body sizes, and novel environments.
arXiv Detail & Related papers (2022-09-20T00:25:54Z) - A Deep Learning Approach for the Segmentation of Electroencephalography
Data in Eye Tracking Applications [56.458448869572294]
We introduce DETRtime, a novel framework for time-series segmentation of EEG data.
Our end-to-end deep learning-based framework brings advances in Computer Vision to the forefront.
Our model generalizes well in the task of EEG sleep stage segmentation.
arXiv Detail & Related papers (2022-06-17T10:17:24Z) - Learning Effect of Lay People in Gesture-Based Locomotion in Virtual
Reality [81.5101473684021]
Some of the most promising methods are gesture-based and do not require additional handheld hardware.
Recent work focused mostly on user preference and performance of the different locomotion techniques.
This work investigates whether and how quickly users can adapt to a hand gesture-based locomotion system in VR.
arXiv Detail & Related papers (2022-06-16T10:44:16Z) - Do Pedestrians Pay Attention? Eye Contact Detection in the Wild [75.54077277681353]
In urban environments, humans rely on eye contact for fast and efficient communication with nearby people.
In this paper, we focus on eye contact detection in the wild, i.e., real-world scenarios for autonomous vehicles with no control over the environment or the distance of pedestrians.
We introduce a model that leverages semantic keypoints to detect eye contact and show that this high-level representation achieves state-of-the-art results on the publicly-available dataset JAAD.
To study domain adaptation, we create LOOK: a large-scale dataset for eye contact detection in the wild, which focuses on diverse and un
arXiv Detail & Related papers (2021-12-08T10:21:28Z) - MTCD: Cataract Detection via Near Infrared Eye Images [69.62768493464053]
Cataract is a common eye disease and one of the leading causes of blindness and vision impairment.
We present a novel algorithm for cataract detection using near-infrared eye images.
Deep learning-based eye segmentation and multitask classification networks are presented.
arXiv Detail & Related papers (2021-10-06T08:10:28Z) - Towards Hardware-Agnostic Gaze-Trackers [0.5512295869673146]
We present a deep neural network architecture as an appearance-based method for constrained gaze-tracking.
Our system achieved an error of 1.8073cm on GazeCapture dataset without any calibration or device specific fine-tuning.
arXiv Detail & Related papers (2020-10-11T00:53:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.