Slippage-robust Gaze Tracking for Near-eye Display
- URL: http://arxiv.org/abs/2210.11637v1
- Date: Thu, 20 Oct 2022 23:47:56 GMT
- Title: Slippage-robust Gaze Tracking for Near-eye Display
- Authors: Wei Zhang, Jiaxi Cao, Xiang Wang, Enqi Tian and Bin Li
- Abstract summary: Slippage of head-mounted devices (HMDs) often results in higher gaze tracking errors.
We propose a slippage-robust gaze tracking method for near-eye displays based on an aspheric eyeball model.
- Score: 14.038708833057534
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, head-mounted near-eye display devices have become the key
hardware foundation for virtual reality and augmented reality. Thus
head-mounted gaze tracking technology has received attention as an essential
part of human-computer interaction. However, unavoidable slippage of
head-mounted devices (HMDs) often results in higher gaze tracking errors and
hinders the practical usage of HMDs. To tackle this problem, we propose a
slippage-robust gaze tracking method for near-eye displays based on an aspheric
eyeball model, from which we accurately compute the eyeball's optical axis and
rotation center. We tested several methods on datasets with slippage, and the
experimental results show that the proposed method significantly outperforms
previous methods, achieving nearly twice the accuracy of the second-best method.
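The abstract does not spell out the aspheric eyeball model's parameters; as a rough illustration only, a rotationally symmetric aspheric surface is conventionally written with the conic sag equation, sketched below in Python (the curvature and conic constant are population-average placeholders, not the paper's calibrated values):

```python
import numpy as np

def aspheric_sag(r, c=1.0 / 7.8, k=-0.25):
    """Sag z(r) of a rotationally symmetric aspheric (conic) surface.

    r : radial distance from the optical axis (mm)
    c : apex curvature 1/R (7.8 mm is a common population-average
        corneal radius; a placeholder, not the paper's calibrated value)
    k : conic constant (k < 0 flattens the surface toward the periphery)
    """
    return c * r**2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c**2 * r**2))

print(aspheric_sag(np.array([0.0, 1.5, 3.0])))  # surface height at 0/1.5/3 mm
```

A slippage-robust pipeline would presumably fit such a surface per user and re-derive the optical axis and rotation center from it after a slip, rather than trusting a one-time calibration.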
Related papers
- DELTA: Dense Efficient Long-range 3D Tracking for any video [82.26753323263009]
We introduce DELTA, a novel method that efficiently tracks every pixel in 3D space, enabling accurate motion estimation across entire videos.
Our approach leverages a joint global-local attention mechanism for reduced-resolution tracking, followed by a transformer-based upsampler to achieve high-resolution predictions.
Our method provides a robust solution for applications requiring fine-grained, long-term motion tracking in 3D space.
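The summary names DELTA's two stages but not their internals; purely as a hypothetical stand-in (all module names, dimensions, and shapes below are invented, not DELTA's), the coarse-track-then-learned-upsample idea can be wired as:

```python
import torch
import torch.nn as nn

class TransformerUpsampler(nn.Module):
    """Toy upsampler: refine coarse per-pixel tracks to full resolution.
    Shapes and dimensions are illustrative only."""
    def __init__(self, dim=64, scale=4):
        super().__init__()
        self.scale = scale
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, 3 * scale * scale)   # 3D offset per subpixel

    def forward(self, feats):                 # feats: (B, H*W, dim) coarse tokens
        B, N, _ = feats.shape
        x = self.encoder(feats)               # attention over the coarse grid
        return self.head(x).view(B, N * self.scale ** 2, 3)  # dense 3D motion

coarse = torch.randn(2, 32 * 32, 64)          # tracks computed at 1/4 resolution
print(TransformerUpsampler()(coarse).shape)   # torch.Size([2, 16384, 3])
```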
arXiv Detail & Related papers (2024-10-31T17:59:01Z) - Binocular-Guided 3D Gaussian Splatting with View Consistency for Sparse View Synthesis [53.702118455883095]
We propose a novel method for synthesizing novel views from sparse views with Gaussian Splatting.
Our key idea lies in exploring the self-supervision inherent in the binocular stereo consistency between each pair of binocular images.
Our method significantly outperforms the state-of-the-art methods.
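The exact form of the binocular self-supervision is not given in the summary; a generic left-right photometric consistency term, sketched here as a hypothetical example, warps one view into the other with a disparity map and penalizes the residual:

```python
import torch
import torch.nn.functional as F

def lr_consistency_loss(left, right, disparity):
    """Hypothetical binocular self-supervision term: warp the right view
    into the left view using a horizontal disparity map (in pixels) and
    penalize the photometric residual. left/right: (B,3,H,W)."""
    B, _, H, W = left.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, H),
                            torch.linspace(-1, 1, W), indexing="ij")
    grid = torch.stack((xs, ys), dim=-1).expand(B, H, W, 2).clone()
    grid[..., 0] += 2.0 * disparity.squeeze(1) / (W - 1)  # px -> normalized
    warped = F.grid_sample(right, grid, align_corners=True)
    return (left - warped).abs().mean()

left = torch.rand(1, 3, 64, 64)
right = torch.roll(left, shifts=2, dims=3)      # fake 2 px baseline shift
disp = torch.full((1, 1, 64, 64), 2.0)
print(lr_consistency_loss(left, right, disp))   # near zero away from borders
```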
arXiv Detail & Related papers (2024-10-24T15:10:27Z) - Using Deep Learning to Increase Eye-Tracking Robustness, Accuracy, and Precision in Virtual Reality [2.2639735235640015]
This work provides an objective assessment of the impact of several contemporary machine learning (ML)-based methods for eye feature tracking.
Metrics include the accuracy and precision of the gaze estimate, as well as drop-out rate.
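The metrics named above have fairly standard definitions in the eye-tracking literature, though the paper's exact variants may differ: accuracy as mean angular error against ground truth, precision as RMS sample-to-sample angular distance, and drop-out as the fraction of frames with no valid estimate. A minimal sketch:

```python
import numpy as np

def angles_deg(a, b):
    """Per-row angle (degrees) between two sets of 3D gaze vectors."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return np.degrees(np.arccos(np.clip(np.sum(a * b, axis=1), -1.0, 1.0)))

def gaze_metrics(est, gt, valid):
    """est, gt: (N,3) gaze vectors; valid: (N,) bool mask of usable frames."""
    accuracy = angles_deg(est[valid], gt[valid]).mean()    # mean angular offset
    # RMS-S2S precision (in practice computed within fixations)
    step = angles_deg(est[valid][1:], est[valid][:-1])
    precision = np.sqrt(np.mean(step**2))
    dropout = 1.0 - valid.mean()                           # lost-sample rate
    return accuracy, precision, dropout

rng = np.random.default_rng(0)
gt = rng.normal(size=(100, 3))
est = gt + 0.01 * rng.normal(size=(100, 3))
print(gaze_metrics(est, gt, valid=np.ones(100, dtype=bool)))
```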
arXiv Detail & Related papers (2024-03-28T18:43:25Z) - Deep Domain Adaptation: A Sim2Real Neural Approach for Improving Eye-Tracking Systems [80.62854148838359]
Eye image segmentation is a critical step in eye tracking that has great influence over the final gaze estimate.
We use dimensionality-reduction techniques to measure the overlap between the target eye images and synthetic training data.
Our methods result in robust, improved performance when tackling the discrepancy between simulation and real-world data samples.
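The summary does not name the dimensionality-reduction technique; one plausible reading, sketched here with PCA (all names and sizes illustrative), embeds real and synthetic eye images in a shared space and scores overlap by nearest-neighbor distance:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

def embedding_overlap(real_imgs, synth_imgs, n_components=16):
    """Project flattened eye images into a shared PCA space and report
    the mean real-to-nearest-synthetic distance (lower = more overlap)."""
    X = np.concatenate([real_imgs, synth_imgs]).reshape(
        len(real_imgs) + len(synth_imgs), -1)
    Z = PCA(n_components=n_components).fit_transform(X)
    z_real, z_synth = Z[:len(real_imgs)], Z[len(real_imgs):]
    nn = NearestNeighbors(n_neighbors=1).fit(z_synth)
    dists, _ = nn.kneighbors(z_real)
    return dists.mean()

real = np.random.rand(200, 32, 32)     # stand-ins for real eye-image crops
synth = np.random.rand(300, 32, 32)    # stand-ins for synthetic renders
print(embedding_overlap(real, synth))
```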
arXiv Detail & Related papers (2024-03-23T22:32:06Z) - Open Gaze: Open Source eye tracker for smartphone devices using Deep Learning [0.0]
We present an open-source implementation of a smartphone-based gaze tracker that emulates the methodology proposed by a Google paper.
Through the integration of machine learning techniques, we unveil an accurate eye tracking solution that is native to smartphones.
Our findings show the potential to substantially scale up eye movement research.
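As a hypothetical illustration of the emulated methodology (layer sizes and inputs below are guesses, not the paper's architecture), a smartphone gaze CNN regresses an on-screen point from two eye crops plus eye-corner landmarks:

```python
import torch
import torch.nn as nn

class SmartphoneGazeNet(nn.Module):
    """Illustrative stand-in: two eye crops plus eye-corner landmarks
    are regressed to an on-screen (x, y) gaze point."""
    def __init__(self):
        super().__init__()
        self.eye = nn.Sequential(              # shared conv tower per eye
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.head = nn.Sequential(
            nn.Linear(32 * 2 + 8, 64), nn.ReLU(),  # +8 = four (x, y) corners
            nn.Linear(64, 2))                      # on-screen gaze (x, y)

    def forward(self, left, right, landmarks):
        f = torch.cat([self.eye(left), self.eye(right), landmarks], dim=1)
        return self.head(f)

net = SmartphoneGazeNet()
out = net(torch.rand(1, 3, 64, 64), torch.rand(1, 3, 64, 64), torch.rand(1, 8))
print(out.shape)   # torch.Size([1, 2])
```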
arXiv Detail & Related papers (2023-08-25T17:10:22Z) - Accurate Eye Tracking from Dense 3D Surface Reconstructions using Single-Shot Deflectometry [13.297188931807586]
We propose a novel method for accurate and fast evaluation of the gaze direction that exploits teachings from single-shot phase-measuring deflectometry (PMD).
Our method acquires dense 3D surface information of both cornea and sclera within only one single camera frame (single-shot).
We show the feasibility of our approach with experimentally evaluated gaze errors below $0.12^\circ$ on a realistic model eye.
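Single-shot fringe analysis is classically done with the Fourier-transform method: isolate the carrier lobe in the spectrum and take the angle of the inverse transform. A minimal 1D sketch (carrier frequency and filter width are illustrative; the paper's actual PMD pipeline is not shown here):

```python
import numpy as np

def fourier_phase_1d(signal, carrier, half_width):
    """Single-shot fringe analysis (Takeda-style): isolate the carrier
    lobe in the spectrum and return the wrapped phase of the result."""
    spec = np.fft.fft(signal - signal.mean())
    mask = np.zeros(len(signal))
    mask[carrier - half_width: carrier + half_width + 1] = 1.0
    return np.angle(np.fft.ifft(spec * mask))   # wrapped carrier + phase

x = np.arange(512)
true_phase = 0.5 * np.sin(2 * np.pi * x / 512)              # toy phase term
fringe = 1 + 0.8 * np.cos(2 * np.pi * 16 * x / 512 + true_phase)
wrapped = fourier_phase_1d(fringe, carrier=16, half_width=8)
phase = np.unwrap(wrapped) - 2 * np.pi * 16 * x / 512       # remove carrier
print(np.std((phase - phase.mean()) - (true_phase - true_phase.mean())))  # ~0
```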
arXiv Detail & Related papers (2023-08-14T17:36:39Z) - Optimization-Based Eye Tracking using Deflectometric Information [14.010352335803873]
State-of-the-art eye tracking methods are either reflection-based and track reflections of sparse point light sources, or image-based and exploit 2D features of the acquired eye image.
We develop a differentiable pipeline based on PyTorch3D that simulates a virtual eye under screen illumination.
In general, our method does not require a specific pattern rendering and can work with ordinary video frames of the main VR/AR/MR screen itself.
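The PyTorch3D renderer itself is out of scope here, but the analysis-by-synthesis loop such a differentiable pipeline enables is easy to sketch; everything below is a toy stand-in, not the paper's simulation:

```python
import torch

# Toy stand-in for a differentiable renderer: maps a 3-vector "eye pose"
# to a synthetic 64-sample "image" (differentiable by construction).
basis = torch.randn(64, 3)
def render(pose):
    return torch.sin(basis @ pose)

target_pose = torch.tensor([0.3, -0.1, 0.05])
target_image = render(target_pose)

pose = torch.zeros(3, requires_grad=True)      # initial eye-pose guess
opt = torch.optim.Adam([pose], lr=0.05)
for _ in range(300):
    opt.zero_grad()
    loss = torch.mean((render(pose) - target_image) ** 2)
    loss.backward()
    opt.step()
print(pose.detach())   # should move toward target_pose as the loss falls
```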
arXiv Detail & Related papers (2023-03-09T02:41:13Z) - A Deep Learning Approach for the Segmentation of Electroencephalography Data in Eye Tracking Applications [56.458448869572294]
We introduce DETRtime, a novel framework for time-series segmentation of EEG data.
Our end-to-end deep learning-based framework brings advances in Computer Vision to the forefront.
Our model generalizes well in the task of EEG sleep stage segmentation.
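DETRtime's architecture is not detailed in the summary; as a generic stand-in for time-series segmentation (not DETRtime itself), a model that emits one class logit vector per EEG time step could look like:

```python
import torch
import torch.nn as nn

class ToySegmenter(nn.Module):
    """Per-time-step classifier over multichannel EEG: (B, C, T) -> (B, K, T)."""
    def __init__(self, channels=64, num_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 128, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(128, num_classes, kernel_size=1))

    def forward(self, x):
        return self.net(x)                     # logits per time step

eeg = torch.randn(4, 64, 500)                  # 4 trials, 64 ch, 500 samples
labels = ToySegmenter()(eeg).argmax(dim=1)     # (4, 500) segment labels
print(labels.shape)
```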
arXiv Detail & Related papers (2022-06-17T10:17:24Z) - End-to-end Learning for Inter-Vehicle Distance and Relative Velocity Estimation in ADAS with a Monocular Camera [81.66569124029313]
We propose a camera-based inter-vehicle distance and relative velocity estimation method based on end-to-end training of a deep neural network.
The key novelty of our method is the integration of multiple visual clues provided by any two time-consecutive monocular frames.
We also propose a vehicle-centric sampling mechanism to alleviate the effect of perspective distortion in the motion field.
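As a toy stand-in for the described idea (layer sizes invented; the vehicle-centric sampling is omitted), a network can stack the two consecutive frames on the channel axis and regress distance and relative velocity jointly:

```python
import torch
import torch.nn as nn

class PairwiseDistVelNet(nn.Module):
    """Consume two consecutive RGB frames (stacked to 6 channels) and
    regress a distance (m) and relative velocity (m/s) per crop."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(6, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.head = nn.Linear(64, 2)   # [distance, relative velocity]

    def forward(self, frame_t0, frame_t1):
        return self.head(self.backbone(torch.cat([frame_t0, frame_t1], dim=1)))

net = PairwiseDistVelNet()
print(net(torch.rand(1, 3, 128, 128), torch.rand(1, 3, 128, 128)))
```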
arXiv Detail & Related papers (2020-06-07T08:18:31Z) - Gaze-Sensing LEDs for Head Mounted Displays [73.88424800314634]
We exploit the sensing capability of LEDs to create a low-power gaze tracker for virtual reality (VR) applications.
We show that our gaze estimation method does not require complex dimension reduction techniques.
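Consistent with the claim that no complex dimension reduction is needed, even a plain regularized linear regression from LED sensor readings to gaze angles illustrates the sensing idea (synthetic data; the LED count and mapping are invented):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_leds, n_samples = 12, 1000
true_map = rng.normal(size=(n_leds, 2))          # hidden LED -> gaze mapping
led_readings = rng.normal(size=(n_samples, n_leds))
gaze_deg = led_readings @ true_map + 0.1 * rng.normal(size=(n_samples, 2))

model = Ridge(alpha=1.0).fit(led_readings[:800], gaze_deg[:800])
pred = model.predict(led_readings[800:])
rmse = np.sqrt(np.mean((pred - gaze_deg[800:]) ** 2))
print(f"held-out RMSE: {rmse:.3f} deg")
```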
arXiv Detail & Related papers (2020-03-18T23:03:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.