Stop&Hop: Early Classification of Irregular Time Series
- URL: http://arxiv.org/abs/2208.09795v1
- Date: Sun, 21 Aug 2022 03:41:19 GMT
- Title: Stop&Hop: Early Classification of Irregular Time Series
- Authors: Thomas Hartvigsen, Walter Gerych, Jidapa Thadajarassiri, Xiangnan
Kong, Elke Rundensteiner
- Abstract summary: We study early classification of irregular time series, a new setting for early classifiers that opens doors to more real-world problems.
Our solution, Stop&Hop, uses a continuous-time recurrent network to model ongoing irregular time series in real time.
We demonstrate that Stop&Hop consistently makes earlier and more-accurate predictions than state-of-the-art alternatives adapted to this new problem.
- Score: 16.84487620364245
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Early classification algorithms help users react faster to their machine
learning model's predictions. Early warning systems in hospitals, for example,
let clinicians improve their patients' outcomes by accurately predicting
infections. While early classification systems are advancing rapidly, a major
gap remains: existing systems do not consider irregular time series, which have
uneven and often-long gaps between their observations. Such series are
notoriously pervasive in impactful domains like healthcare. We bridge this gap
and study early classification of irregular time series, a new setting for
early classifiers that opens doors to more real-world problems. Our solution,
Stop&Hop, uses a continuous-time recurrent network to model ongoing irregular
time series in real time, while an irregularity-aware halting policy, trained
with reinforcement learning, predicts when to stop and classify the streaming
series. By taking real-valued step sizes, the halting policy flexibly decides
exactly when to stop ongoing series in real time. This way, Stop&Hop seamlessly
integrates information contained in the timing of observations, a new and vital
source for early classification in this setting, with the time series values to
provide early classifications for irregular time series. Using four synthetic
and three real-world datasets, we demonstrate that Stop&Hop consistently makes
earlier and more-accurate predictions than state-of-the-art alternatives
adapted to this new problem. Our code is publicly available at
https://github.com/thartvigsen/StopAndHop.
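The abstract describes two cooperating components: a continuous-time recurrent encoder that updates its state from irregularly spaced observations, and an irregularity-aware halting policy that outputs both a stopping decision and a real-valued wait until its next decision. The following is a minimal, hypothetical PyTorch sketch of that structure, not the authors' implementation (which is in the repository above); the class names DecayGRUCell and HaltingPolicy, the exponential-decay state update, and all dimensions are assumptions for illustration.
```python
# Minimal, hypothetical sketch (PyTorch) of the two components described above.
# Names (DecayGRUCell, HaltingPolicy) and the exponential-decay update are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class DecayGRUCell(nn.Module):
    """GRU cell whose hidden state decays toward zero over the elapsed time
    between observations (a common continuous-time recurrent variant)."""

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.cell = nn.GRUCell(input_dim, hidden_dim)
        self.decay = nn.Linear(1, hidden_dim)

    def forward(self, x, h, delta_t):
        # delta_t: (batch, 1) time elapsed since the previous observation.
        gamma = torch.exp(-torch.relu(self.decay(delta_t)))
        return self.cell(x, gamma * h)


class HaltingPolicy(nn.Module):
    """Maps the current hidden state and time to a halting probability plus a
    real-valued wait ("hop") before the next decision point."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_dim + 1, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 2)
        )

    def forward(self, h, t):
        stop_logit, hop = self.net(torch.cat([h, t], dim=-1)).split(1, dim=-1)
        return torch.sigmoid(stop_logit), nn.functional.softplus(hop)


# Toy usage: stream one irregular series and halt once the policy is confident.
torch.manual_seed(0)
encoder, policy, classifier = DecayGRUCell(3, 32), HaltingPolicy(32), nn.Linear(32, 2)
times = torch.tensor([0.0, 0.7, 2.4, 2.9, 6.1])  # uneven observation times
values = torch.randn(5, 1, 3)                    # one observation vector per time
h, prev_t = torch.zeros(1, 32), torch.zeros(1, 1)
for t, x in zip(times, values):
    t = t.view(1, 1)
    h = encoder(x, h, t - prev_t)
    stop_prob, wait = policy(h, t)               # wait is unused in this toy loop
    prev_t = t
    if stop_prob.item() > 0.5:                   # halt early and classify
        break
print("predicted class:", classifier(h).argmax(dim=-1).item())
```
In the full method the halting policy is trained with reinforcement learning to trade accuracy against earliness; the toy loop above only shows how its outputs could gate an early prediction on a streaming series.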
Related papers
- Relational Conformal Prediction for Correlated Time Series [56.59852921638328]
We propose a novel distribution-free approach based on the conformal prediction framework and quantile regression.
We fill this void by introducing a novel conformal prediction method based on graph deep learning operators.
Our approach provides accurate coverage and achieves state-of-the-art uncertainty quantification on relevant benchmarks.
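For background on the conformal-prediction-plus-quantile-regression recipe this entry builds on, below is a minimal sketch of split conformalized quantile regression on generic tabular data. It does not reproduce the paper's graph deep learning operators or relational structure, and the gradient-boosted quantile models and function name are assumptions for illustration.
```python
# Hypothetical sketch of split conformalized quantile regression; the
# relational/graph components of the paper above are not reproduced.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def conformal_quantile_intervals(X_tr, y_tr, X_cal, y_cal, X_test, alpha=0.1):
    # Fit lower/upper quantile regressors on the training split.
    lo = GradientBoostingRegressor(loss="quantile", alpha=alpha / 2).fit(X_tr, y_tr)
    hi = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha / 2).fit(X_tr, y_tr)
    # Conformity scores on the held-out calibration split.
    scores = np.maximum(lo.predict(X_cal) - y_cal, y_cal - hi.predict(X_cal))
    n = len(y_cal)
    q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    return lo.predict(X_test) - q, hi.predict(X_test) + q  # calibrated interval

# Toy usage on synthetic data: roughly 90% of test targets should be covered.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=600)
lower, upper = conformal_quantile_intervals(X[:400], y[:400], X[400:500], y[400:500], X[500:])
print("empirical coverage:", np.mean((y[500:] >= lower) & (y[500:] <= upper)))
```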
arXiv Detail & Related papers (2025-02-13T16:12:17Z)
- An Introduction to Deep Survival Analysis Models for Predicting Time-to-Event Outcomes [5.257719744958367]
Time-to-event outcomes have been studied extensively within the field of survival analysis.
This monograph aims to provide a reasonably self-contained, modern introduction to survival analysis.
arXiv Detail & Related papers (2024-10-01T21:29:17Z)
- Deep learning for predicting the occurrence of tipping points [9.699777597848618]
We develop a deep learning algorithm for predicting the occurrence of tipping points in untrained systems.
Our algorithm not only outperforms traditional methods but also achieves accurate predictions for irregularly-sampled model time series.
Our ability to predict tipping points paves the way for risk mitigation, prevention of catastrophic failures, and restoration of degraded systems.
arXiv Detail & Related papers (2024-07-26T12:17:57Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- LARA: A Light and Anti-overfitting Retraining Approach for Unsupervised Time Series Anomaly Detection [49.52429991848581]
We propose a Light and Anti-overfitting Retraining Approach (LARA) for deep variational auto-encoder (VAE) based time series anomaly detection methods.
This work makes three novel contributions: 1) the retraining process is formulated as a convex problem that converges quickly and prevents overfitting; 2) a ruminate block leverages historical data without needing to store it; and 3) it is proven mathematically that, when fine-tuning the latent vector and reconstructed data, linear formations achieve the smallest adjustment errors between the ground truths and the fine-tuned outputs.
arXiv Detail & Related papers (2023-10-09T12:36:16Z)
- CenTime: Event-Conditional Modelling of Censoring in Survival Analysis [49.44664144472712]
We introduce CenTime, a novel approach to survival analysis that directly estimates the time to event.
Our method features an innovative event-conditional censoring mechanism that performs robustly even when uncensored data is scarce.
Our results indicate that CenTime offers state-of-the-art performance in predicting time-to-death while maintaining comparable ranking performance.
arXiv Detail & Related papers (2023-09-07T17:07:33Z)
- CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
arXiv Detail & Related papers (2023-08-18T04:45:56Z)
- ECOTS: Early Classification in Open Time Series [0.0]
We show how to adapt any early classification technique to the Early Classification in Open Time Series (ECOTS) scenario.
We illustrate our methodology by transforming two state-of-the-art algorithms to the ECOTS scenario and report numerical experiments on a real dataset for predictive maintenance.
arXiv Detail & Related papers (2022-04-01T12:34:26Z)
- Continual Test-Time Domain Adaptation [94.51284735268597]
Test-time domain adaptation aims to adapt a source pre-trained model to a target domain without using any source data.
CoTTA is easy to implement and can be readily incorporated in off-the-shelf pre-trained models.
arXiv Detail & Related papers (2022-03-25T11:42:02Z)
- Meta-Forecasting by combining Global Deep Representations with Local Adaptation [12.747008878068314]
We introduce a novel forecasting method called Meta Global-Local Auto-Regression (Meta-GLAR).
It adapts to each time series by learning in closed-form the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Our method is competitive with the state-of-the-art in out-of-sample forecasting accuracy reported in earlier work.
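The closed-form adaptation described above can be pictured as a per-series ridge-regression read-out fitted on top of a shared recurrent encoder. The sketch below is an assumed simplification of that idea (the GRU encoder, function name, and regularisation constant are illustrative), not the Meta-GLAR implementation.
```python
# Hypothetical sketch: a shared RNN encodes a series; a per-series linear
# read-out from hidden states to one-step-ahead targets is fit in closed form.
import torch
import torch.nn as nn

def closed_form_readout(H, y, lam=1e-2):
    """Ridge regression: W = argmin ||H W - y||^2 + lam * ||W||^2."""
    d = H.shape[1]
    return torch.linalg.solve(H.T @ H + lam * torch.eye(d), H.T @ y)

torch.manual_seed(0)
rnn = nn.GRU(input_size=1, hidden_size=16, batch_first=True)  # globally shared encoder
series = torch.sin(torch.linspace(0, 12, 120)).reshape(1, -1, 1)
states, _ = rnn(series[:, :-1])              # hidden state at every step
targets = series[:, 1:, 0].T                 # next observed value at every step
W = closed_form_readout(states[0], targets)  # local (per-series) adaptation
print("one-step-ahead forecast:", (states[0, -1] @ W).item())
```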
arXiv Detail & Related papers (2021-11-05T11:45:02Z)
- When is Early Classification of Time Series Meaningful? [11.234740889286215]
We ask if we can classify a time series subsequence with sufficient accuracy and confidence after seeing only some prefix of a target pattern.
The idea is that the earlier classification would allow us to take immediate action, in a domain in which some practical interventions are possible.
Although there are dozens of papers on early classification of time series, it is not clear that any of them could ever work in a real-world setting.
arXiv Detail & Related papers (2021-02-23T04:42:05Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependencies of elements and sets.
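As a small illustration of the set-level co-occurrence graph mentioned above, the sketch below counts how often pairs of elements appear in the same set across users' histories; the paper's graph learning and attention modules are not reproduced, and the function name and toy data are assumptions.
```python
# Hypothetical sketch: build a weighted co-occurrence graph over set elements.
from collections import Counter
from itertools import combinations

def cooccurrence_graph(histories):
    """Return {(u, v): count} for unordered element pairs sharing a set."""
    edges = Counter()
    for user_sets in histories:            # one sequence of sets per user
        for s in user_sets:
            for u, v in combinations(sorted(s), 2):
                edges[(u, v)] += 1
    return dict(edges)

# Toy usage: two users' sequences of "baskets".
history = [
    [{"milk", "bread"}, {"milk", "eggs", "bread"}],
    [{"eggs", "bread"}, {"milk", "eggs"}],
]
print(cooccurrence_graph(history))
```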
arXiv Detail & Related papers (2020-06-20T03:29:02Z)