ECOTS: Early Classification in Open Time Series
- URL: http://arxiv.org/abs/2204.00392v1
- Date: Fri, 1 Apr 2022 12:34:26 GMT
- Title: ECOTS: Early Classification in Open Time Series
- Authors: Youssef Achenchabe, Alexis Bondu, Antoine Cornuéjols, Vincent
Lemaire
- Abstract summary: We show how to adapt any ECTS technique to the Early Classification in Open Time Series (ECOTS) setting.
We illustrate our methodology by transforming two state-of-the-art algorithms for the ECOTS scenario and report numerical experiments on a real dataset for predictive maintenance.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning to predict ahead of time events in open time series is challenging.
While Early Classification of Time Series (ECTS) tackles the problem of
balancing online the accuracy of the prediction against the cost of delaying
the decision, it does so only for individuals that are time series of finite
length with a unique label for the whole series. Surprisingly, this trade-off
has never been investigated for open time series of undetermined length, in
which each subsequence of the same series may carry a different class. In this
paper, we propose
a principled method to adapt any technique for ECTS to the Early Classification
in Open Time Series (ECOTS). We show how the classifiers must be constructed
and what the decision triggering system becomes in this new scenario. We
address the challenge of decision making in the predictive maintenance field.
We illustrate our methodology by transforming two state-of-the-art ECTS
algorithms for the ECOTS scenario and report numerical experiments on a real
dataset for predictive maintenance that demonstrate the practicality of the
novel approach.
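The accuracy-versus-delay trade-off at the heart of ECTS can be illustrated with a minimal cost-based trigger rule. This is a hedged sketch only, not the authors' actual algorithm: the confidence model, cost values, and the assumption that confidence improves by a fixed factor while waiting are all hypothetical.

```python
# Minimal sketch of a cost-based early-trigger rule (hypothetical costs
# and confidence model, not the paper's actual ECOTS method).

def should_trigger(confidence: float, t: int, horizon: int,
                   miscls_cost: float = 1.0, delay_cost: float = 0.01) -> bool:
    """Trigger a decision when the expected cost of deciding now
    (misclassification risk) is lower than the cost of waiting
    (one more step of delay plus the remaining risk)."""
    if t >= horizon:
        # At the end of the observation window we must decide regardless.
        return True
    expected_cost_now = (1.0 - confidence) * miscls_cost
    # Waiting adds a delay penalty; assume (hypothetically) that the
    # misclassification risk shrinks by 10% with one more observation.
    expected_cost_wait = delay_cost + (1.0 - confidence) * miscls_cost * 0.9
    return expected_cost_now <= expected_cost_wait

# Example: high confidence early in the series triggers a decision,
# while low confidence makes waiting the cheaper option.
print(should_trigger(confidence=0.95, t=3, horizon=50))  # → True
print(should_trigger(confidence=0.50, t=3, horizon=50))  # → False
```

In the ECOTS setting the same kind of rule would be applied over a sliding window of an unbounded stream, with one decision per subsequence rather than one per series.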
Related papers
- Relational Conformal Prediction for Correlated Time Series [56.59852921638328]
We propose a novel distribution-free approach based on conformal prediction framework and quantile regression.
We fill this void by introducing a novel conformal prediction method based on graph deep learning operators.
Our approach provides accurate coverage and achieves state-of-the-art uncertainty quantification on relevant benchmarks.
arXiv Detail & Related papers (2025-02-13T16:12:17Z) - General Time-series Model for Universal Knowledge Representation of Multivariate Time-Series data [61.163542597764796]
We show that time series with different time granularities (or corresponding frequency resolutions) exhibit distinct joint distributions in the frequency domain.
A novel Fourier knowledge attention mechanism is proposed to enable learning time-aware representations from both the temporal and frequency domains.
An autoregressive blank-infilling pre-training framework is incorporated into time series analysis for the first time, leading to a generative, task-agnostic pre-training strategy.
arXiv Detail & Related papers (2025-02-05T15:20:04Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - Hierarchical Time Series Forecasting with Bayesian Modeling [0.0]
Time series are often hierarchically structured, e.g., a company's sales might be broken down into different regions, and each region into different stores.
In some cases the number of series in the hierarchy is too big to fit in a single model to produce forecasts in relevant time, and a decentralized approach is beneficial.
One way to do this is to train independent forecasting models for each series, as well as for some summary-statistic series implied by the hierarchy (e.g., the sum of all series), and to pass those models to a reconciliation algorithm that improves the forecasts by sharing information between the series.
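The reconciliation step described above can be sketched for a two-level hierarchy. This is an illustrative, simplified example (proportional rescaling), not the paper's Bayesian method; production reconciliation approaches such as MinT additionally use the forecast error covariance.

```python
# Minimal sketch of forecast reconciliation for a two-level hierarchy:
# independent bottom-level forecasts are rescaled to agree with an
# independently produced top-level (total) forecast. Illustrative only.

def reconcile_proportional(bottom_forecasts, total_forecast):
    """Rescale the bottom-level forecasts so they sum exactly to the
    separately forecast total, preserving their relative proportions."""
    s = sum(bottom_forecasts)
    if s == 0:
        # Fall back to an even split if all bottom forecasts are zero.
        n = len(bottom_forecasts)
        return [total_forecast / n] * n
    return [b * total_forecast / s for b in bottom_forecasts]

# Example: store-level forecasts rescaled to match the company total.
stores = [10.0, 30.0, 20.0]   # independent bottom-level forecasts (sum 60)
company = 66.0                # independent top-level forecast
print(reconcile_proportional(stores, company))  # → [11.0, 33.0, 22.0]
```

The rescaled forecasts now satisfy the hierarchy constraint (they sum to the total) while sharing information between the two levels.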
arXiv Detail & Related papers (2023-08-28T17:20:47Z) - Self-Interpretable Time Series Prediction with Counterfactual Explanations [4.658166900129066]
Interpretable time series prediction is crucial for safety-critical areas such as healthcare and autonomous driving.
Most existing methods focus on interpreting predictions by assigning important scores to segments of time series.
We develop a self-interpretable model, dubbed Counterfactual Time Series (CounTS), which generates counterfactual and actionable explanations for time series predictions.
arXiv Detail & Related papers (2023-06-09T16:42:52Z) - Conformal Prediction for Time Series with Modern Hopfield Networks [6.749483762719583]
We propose HopCPT, a novel conformal prediction approach for time series.
We show that our approach is theoretically well justified for time series where temporal dependencies are present.
arXiv Detail & Related papers (2023-03-22T17:52:54Z) - Stop&Hop: Early Classification of Irregular Time Series [16.84487620364245]
We study early classification of irregular time series, a new setting for early classifiers that opens doors to more real-world problems.
Our solution, Stop&Hop, uses a continuous-time recurrent network to model ongoing irregular time series in real time.
We demonstrate that Stop&Hop consistently makes earlier and more-accurate predictions than state-of-the-art alternatives adapted to this new problem.
arXiv Detail & Related papers (2022-08-21T03:41:19Z) - Adaptive Conformal Predictions for Time Series [0.0]
We argue that Adaptive Conformal Inference (ACI) is a good procedure for time series with general dependency.
We propose a parameter-free method, AgACI, that adaptively builds upon ACI based on online expert aggregation.
We conduct a real case study: electricity price forecasting.
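The core ACI update that AgACI builds on is compact enough to show directly. The sketch below follows the standard published update rule (miscoverage level adjusted by a step size after each hit or miss); the target level and step size here are illustrative values, not the paper's tuned settings.

```python
# Minimal sketch of one Adaptive Conformal Inference (ACI) step.
# Target coverage is 1 - alpha; gamma is a step size (illustrative values).

def aci_update(alpha_t: float, covered: bool, alpha: float = 0.1,
               gamma: float = 0.05) -> float:
    """One ACI step: alpha_{t+1} = alpha_t + gamma * (alpha - err_t),
    where err_t = 1 on a miss and 0 on a hit. A miss shrinks the working
    miscoverage level (wider intervals); a hit grows it (tighter intervals)."""
    err = 0.0 if covered else 1.0
    return alpha_t + gamma * (alpha - err)

# After a miss the working alpha drops, producing wider future intervals.
a = aci_update(0.1, covered=False)  # 0.1 + 0.05 * (0.1 - 1.0) = 0.055
print(round(a, 3))  # → 0.055
```

Running this update online keeps the long-run empirical coverage near the target even when the series' dependency structure shifts, which is what makes ACI attractive for time series.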
arXiv Detail & Related papers (2022-02-15T09:57:01Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Time Series Analysis via Network Science: Concepts and Algorithms [62.997667081978825]
This review provides a comprehensive overview of existing mapping methods for transforming time series into networks.
We describe the main conceptual approaches, provide authoritative references and give insight into their advantages and limitations in a unified notation and language.
Although still very recent, this research area has much potential and with this survey we intend to pave the way for future research on the topic.
arXiv Detail & Related papers (2021-10-11T13:33:18Z) - Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES)
MAES is a mixture of time-series experts which leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly out-performs popular sequence models on datasets subject to temporal shift.
arXiv Detail & Related papers (2021-02-23T05:23:35Z) - Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationship by constructing set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)