Imputation of Missing Data in Smooth Pursuit Eye Movements Using a Self-Attention-based Deep Learning Approach
- URL: http://arxiv.org/abs/2506.00545v1
- Date: Sat, 31 May 2025 13:10:30 GMT
- Title: Imputation of Missing Data in Smooth Pursuit Eye Movements Using a Self-Attention-based Deep Learning Approach
- Authors: Mehdi Bejani, Guillermo Perez-de-Arenaza-Pozo, Julián D. Arias-Londoño, Juan I. Godino-Llorente
- Abstract summary: We propose a novel imputation framework using Self-Attention-based Imputation networks for time series. We refine the imputed data using a custom-made autoencoder, tailored to represent smooth pursuit eye movement sequences. Results show a significant improvement in the accuracy of reconstructed eye movement sequences.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Missing data is a relevant issue in time series, especially in biomedical sequences such as those corresponding to smooth pursuit eye movements, which often contain gaps due to eye blinks and track losses, complicating the analysis and extraction of meaningful biomarkers. In this paper, a novel imputation framework is proposed using Self-Attention-based Imputation networks for time series, which leverages the power of deep learning and self-attention mechanisms to impute missing data. We further refine the imputed data using a custom-made autoencoder, tailored to represent smooth pursuit eye movement sequences. The proposed approach was implemented using 5,504 sequences from 172 Parkinsonian patients and healthy controls. Results show a significant improvement in the accuracy of reconstructed eye movement sequences with respect to other state-of-the-art techniques, substantially reducing the values of common time-domain error metrics such as the mean absolute error, mean relative error, and root mean square error, while also preserving the signal's frequency-domain characteristics. Moreover, it demonstrates robustness when large intervals of data are missing. This method offers an alternative solution for robustly handling missing data in time series, enhancing the reliability of smooth pursuit analysis for the screening and monitoring of neurodegenerative disorders.
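A minimal sketch of the evaluation protocol described in the abstract (hide known samples, impute them, and score MAE, MRE, and RMSE only on the hidden positions) is given below. The gap lengths, the sinusoidal test signal, and the linear-interpolation stand-in for the imputer are illustrative assumptions, not the authors' SAITS-plus-autoencoder pipeline or their data.

```python
import numpy as np

def mask_random_gaps(x, n_gaps=3, gap_len=25, rng=None):
    """Hide contiguous blocks of samples, mimicking blinks and track losses."""
    rng = rng if rng is not None else np.random.default_rng(0)
    observed = np.ones(len(x), dtype=bool)
    for _ in range(n_gaps):
        start = int(rng.integers(0, len(x) - gap_len))
        observed[start:start + gap_len] = False
    return np.where(observed, x, np.nan), observed

def impute_linear(x_missing):
    """Stand-in imputer (linear interpolation); the paper uses SAITS plus an autoencoder."""
    t = np.arange(len(x_missing))
    obs = ~np.isnan(x_missing)
    return np.interp(t, t[obs], x_missing[obs])

def gap_metrics(x_true, x_hat, observed):
    """MAE, MRE and RMSE computed only over the hidden (imputed) samples."""
    err = x_hat[~observed] - x_true[~observed]
    mae = np.mean(np.abs(err))
    mre = np.sum(np.abs(err)) / np.sum(np.abs(x_true[~observed]))
    rmse = np.sqrt(np.mean(err ** 2))
    return mae, mre, rmse

# Toy smooth-pursuit-like trace: a slow sinusoid plus measurement noise.
t = np.linspace(0, 10, 1000)
x = 10 * np.sin(2 * np.pi * 0.4 * t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
x_missing, observed = mask_random_gaps(x)
x_hat = impute_linear(x_missing)
print(gap_metrics(x, x_hat, observed))
```

Swapping `impute_linear` for any imputer with the same input/output shape reproduces the kind of comparison the abstract reports.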
Related papers
- Extreme Value Modelling of Feature Residuals for Anomaly Detection in Dynamic Graphs [14.8066991252587]
Detecting anomalies in a temporal sequence of graphs can be applied to areas such as the detection of accidents in transport networks and cyber attacks in computer networks. Existing methods for detecting abnormal graphs can suffer from multiple limitations, such as high false positive rates and difficulty handling variable-sized graphs and non-trivial temporal dynamics. We propose a technique where temporal dependencies are explicitly modelled via time series analysis of a large set of pertinent graph features, followed by the use of residuals to remove the dependencies.
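As a rough illustration of the residual idea summarized above (and not the authors' extreme value model), the sketch below detrends each graph-feature time series with a simple moving average and flags time steps whose standardized residuals fall in the extreme empirical tail; the synthetic features and the moving-average predictor are assumptions.

```python
import numpy as np

def residual_anomaly_scores(features, window=10):
    """features: (T, F) matrix of per-snapshot graph features (e.g. edge count, density).
    Remove temporal dependence with a crude moving-average predictor, then score each
    time step by its largest absolute standardized residual across features."""
    T, F = features.shape
    scores = np.zeros(T)
    for j in range(F):
        x = features[:, j]
        pred = np.convolve(x, np.ones(window) / window, mode="same")
        resid = x - pred
        z = np.abs(resid - resid.mean()) / (resid.std() + 1e-9)
        scores = np.maximum(scores, z)
    return scores

# Flag snapshots above a high empirical quantile (stand-in for an extreme value threshold).
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 4)).cumsum(axis=0)        # synthetic feature trajectories
feats[150] += 15.0                                      # injected anomaly
scores = residual_anomaly_scores(feats)
print(np.where(scores > np.quantile(scores, 0.99))[0])
```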
arXiv Detail & Related papers (2024-10-08T05:00:53Z) - Detecting Anomalies in Dynamic Graphs via Memory enhanced Normality [39.476378833827184]
Anomaly detection in dynamic graphs presents a significant challenge due to the temporal evolution of graph structures and attributes.
We introduce a novel spatial-temporal memories-enhanced graph autoencoder (STRIPE).
STRIPE significantly outperforms existing methods, with a 5.8% improvement in AUC scores and 4.62x faster training time.
arXiv Detail & Related papers (2024-03-14T02:26:10Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - ImDiffusion: Imputed Diffusion Models for Multivariate Time Series Anomaly Detection [44.21198064126152]
We propose a novel anomaly detection framework named ImDiffusion.
ImDiffusion combines time series imputation and diffusion models to achieve accurate and robust anomaly detection.
We evaluate the performance of ImDiffusion via extensive experiments on benchmark datasets.
arXiv Detail & Related papers (2023-07-03T04:57:40Z) - AnomalyBERT: Self-Supervised Transformer for Time Series Anomaly Detection using Data Degradation Scheme [0.7216399430290167]
The anomaly detection task for time series, especially for unlabeled data, has been a challenging problem.
We address it by applying a suitable data degradation scheme to self-supervised model training.
Inspired by the self-attention mechanism, we design a Transformer-based architecture to recognize the temporal context.
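The "data degradation scheme" is not spelled out in this summary; the snippet below shows one generic way to create degraded training windows and position labels for a self-supervised detector. The degradation type and parameters are assumptions, not necessarily AnomalyBERT's.

```python
import numpy as np

def degrade_window(x, rng, min_len=10, max_len=40):
    """Replace a random window with a noisy synthetic segment and return
    the degraded series plus labels marking the degraded positions."""
    x = x.copy()
    length = int(rng.integers(min_len, max_len))
    start = int(rng.integers(0, len(x) - length))
    x[start:start + length] = x[start] + rng.normal(0.0, 3.0, size=length)
    labels = np.zeros(len(x), dtype=int)
    labels[start:start + length] = 1
    return x, labels

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 1000))
degraded, labels = degrade_window(series, rng)
print(labels.sum(), "degraded samples")
# A Transformer would then be trained to recover `labels` from `degraded`.
```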
arXiv Detail & Related papers (2023-05-08T05:42:24Z) - DynImp: Dynamic Imputation for Wearable Sensing Data Through Sensory and Temporal Relatedness [78.98998551326812]
We argue that traditional methods have rarely made use of both the time-series dynamics of the data and the relatedness of features across different sensors.
We propose a model, termed DynImp, to handle missingness at different time points using nearest neighbors along the feature axis.
We show that the method can exploit multi-modal features from related sensors and also learn from historical time-series dynamics to reconstruct the data under extreme missingness.
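DynImp itself is not described here in enough detail to reproduce; the lines below only illustrate the general idea of filling missing entries from the most similar time points along the feature axis, using scikit-learn's KNNImputer on made-up multi-sensor data.

```python
import numpy as np
from sklearn.impute import KNNImputer

# Rows = time points, columns = sensor channels; NaNs mark dropped readings.
rng = np.random.default_rng(0)
base = rng.normal(size=(500, 1)).cumsum(axis=0)          # shared slow trend
X = base + 0.1 * rng.normal(size=(500, 6))               # six correlated channels
X_missing = np.where(rng.random(X.shape) < 0.2, np.nan, X)

# Each missing entry is filled from the k time points that look most similar
# on the channels that are observed (nan-aware Euclidean distance).
imputer = KNNImputer(n_neighbors=5, weights="distance")
X_filled = imputer.fit_transform(X_missing)
dropped = np.isnan(X_missing)
print(np.mean(np.abs(X_filled - X)[dropped]))            # error on the filled entries only
```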
arXiv Detail & Related papers (2022-09-26T21:59:14Z) - Self-Supervised Training with Autoencoders for Visual Anomaly Detection [61.62861063776813]
We focus on a specific use case in anomaly detection where the distribution of normal samples is supported by a lower-dimensional manifold.
We adapt a self-supervised learning regime that exploits discriminative information during training but focuses on the submanifold of normal examples.
We achieve a new state-of-the-art result on the MVTec AD dataset -- a challenging benchmark for visual anomaly detection in the manufacturing domain.
arXiv Detail & Related papers (2022-06-23T14:16:30Z) - An Attention-based ConvLSTM Autoencoder with Dynamic Thresholding for Unsupervised Anomaly Detection in Multivariate Time Series [2.9685635948299995]
We propose an unsupervised Attention-based Convolutional Long Short-Term Memory (ConvLSTM) Autoencoder with Dynamic Thresholding (ACLAE-DT) framework for anomaly detection and diagnosis.
The framework starts by pre-processing and enriching the data, before constructing feature images to characterize the system statuses.
The constructed feature images are fed into an attention-based ConvLSTM autoencoder, which encodes them and captures their temporal behavior.
The reconstruction errors are then computed and subjected to a statistics-based, dynamic thresholding mechanism to detect and diagnose the anomalies.
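The thresholding step is described only as statistical and dynamic; one common realization of that idea (used here as an assumption, not ACLAE-DT's actual rule) is a rolling mean-plus-k-standard-deviations bound on the reconstruction errors.

```python
import numpy as np

def dynamic_threshold(errors, window=50, k=3.0):
    """Flag reconstruction errors that exceed a rolling mean + k*std bound."""
    flags = np.zeros(len(errors), dtype=bool)
    for t in range(len(errors)):
        ref = errors[max(0, t - window):t] if t > 0 else errors[:1]
        flags[t] = errors[t] > ref.mean() + k * ref.std()
    return flags

# Reconstruction errors from an autoencoder would go here; these are synthetic.
rng = np.random.default_rng(0)
errors = np.abs(rng.normal(0.0, 1.0, size=300))
errors[200:205] += 8.0                                   # injected anomaly burst
print(np.where(dynamic_threshold(errors))[0])
```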
arXiv Detail & Related papers (2022-01-23T04:01:43Z) - Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAIN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Deep Metric Learning with Locality Sensitive Angular Loss for Self-Correcting Source Separation of Neural Spiking Signals [77.34726150561087]
We propose a methodology based on deep metric learning to address the need for automated post-hoc cleaning and robust separation filters.
We validate this method with an artificially corrupted label set based on source-separated high-density surface electromyography recordings.
This approach enables a neural network to learn to accurately decode neurophysiological time series using any imperfect method of labelling the signal.
arXiv Detail & Related papers (2021-10-13T21:51:56Z) - Self-supervised Transformer for Multivariate Clinical Time-Series with Missing Values [7.9405251142099464]
We present the STraTS (Self-supervised Transformer for Time-Series) model.
It treats time-series as a set of observation triplets instead of using the traditional dense matrix representation.
It shows better prediction performance than state-of-the-art methods for mortality prediction, especially when labeled data is limited.
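The representational choice highlighted above, a set of (time, variable, value) observation triplets rather than a dense, imputed matrix, can be illustrated in a few lines; the clinical variable names and values are invented for the example.

```python
import numpy as np

def to_triplets(times, variable_names, values):
    """Convert a (T, F) matrix with NaNs for unobserved entries into
    (time, variable, value) triplets; missing entries simply never appear."""
    t_idx, v_idx = np.where(~np.isnan(values))
    return [(times[i], variable_names[j], values[i, j]) for i, j in zip(t_idx, v_idx)]

times = np.array([0.0, 1.5, 4.0])                       # hours since admission
vars_ = ["heart_rate", "sbp", "lactate"]                # hypothetical variables
vals = np.array([[80.0, np.nan, 2.1],
                 [np.nan, 120.0, np.nan],
                 [92.0, 118.0, np.nan]])
print(to_triplets(times, vars_, vals))
# [(0.0, 'heart_rate', 80.0), (0.0, 'lactate', 2.1), (1.5, 'sbp', 120.0), ...]
```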
arXiv Detail & Related papers (2021-07-29T19:39:39Z) - Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
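The summary only alludes to "several anomaly scoring techniques"; a generic GAN-based recipe (an assumption here, not necessarily TadGAN's best-performing variant) is to standardize the reconstruction error and the critic output and combine them.

```python
import numpy as np

def combined_anomaly_score(recon_error, critic_score, alpha=0.5):
    """Blend a reconstruction error with a critic (discriminator) score.
    Both are z-scored first; higher output means more anomalous."""
    z = lambda s: (s - s.mean()) / (s.std() + 1e-9)
    # The critic assigns high values to realistic windows, so negate it.
    return alpha * z(recon_error) + (1 - alpha) * z(-critic_score)

rng = np.random.default_rng(0)
recon_error = np.abs(rng.normal(size=400))
critic_score = rng.normal(size=400)
recon_error[300:310] += 5.0                              # simulated anomalous segment
critic_score[300:310] -= 5.0
scores = combined_anomaly_score(recon_error, critic_score)
print(np.argsort(scores)[-10:])                          # most anomalous window indices
```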
arXiv Detail & Related papers (2020-09-16T15:52:04Z)