AutoFITS: Automatic Feature Engineering for Irregular Time Series
- URL: http://arxiv.org/abs/2112.14806v1
- Date: Wed, 29 Dec 2021 19:42:48 GMT
- Title: AutoFITS: Automatic Feature Engineering for Irregular Time Series
- Authors: Pedro Costa, Vitor Cerqueira, João Vinagre
- Abstract summary: In irregular time series, the time at which each observation is collected may be helpful to summarise the dynamics of the data and improve forecasting performance.
We develop a novel automatic feature engineering framework, which focuses on extracting information from the point in time at which each instance is collected.
We study how valuable this information is by integrating it in a time series forecasting workflow and investigate how it compares to or complements state-of-the-art methods for regular time series forecasting.
- Score: 0.44198435146063353
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: A time series represents a set of observations collected over time.
Typically, these observations are captured with a uniform sampling frequency
(e.g. daily). When data points are observed at uneven time intervals, the time
series is referred to as irregular or intermittent. In such scenarios, the most
common solution is to reconstruct the time series to make it regular, thus
removing its intermittency. We hypothesise that, in irregular time series, the
time at which each observation is collected may be helpful to summarise the
dynamics of the data and improve forecasting performance. We study this idea by
developing a novel automatic feature engineering framework, which focuses on
extracting information from this point of view, i.e., when each instance is
collected. We study how valuable this information is by integrating it in a
time series forecasting workflow and investigate how it compares to or
complements state-of-the-art methods for regular time series forecasting. In
the end, we contribute by providing a novel framework that tackles feature
engineering for time series from an angle previously largely ignored. We show
that our approach can extract additional information about time series that
significantly improves forecasting performance.
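To make the idea concrete, here is a minimal sketch of deriving features from the times at which an irregular series is observed (the gap to the previous observation, a rolling summary of recent gaps, hour of day) alongside ordinary lag features, then fitting an off-the-shelf regressor. This is an illustration of the general idea, not the paper's AutoFITS implementation; the synthetic data, feature names, and the pandas/scikit-learn workflow are assumptions made for the example.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical irregular series: 300 observations with uneven gaps (in hours).
n = 300
gaps = rng.exponential(scale=3.0, size=n)
timestamps = pd.to_datetime("2021-01-01") + pd.to_timedelta(np.cumsum(gaps), unit="h")
values = np.sin(np.arange(n) / 10.0) + rng.normal(scale=0.1, size=n)
series = pd.Series(values, index=pd.DatetimeIndex(timestamps))

def make_features(s, n_lags=5):
    # Ordinary lag features plus timing-derived features: the gap to the
    # previous observation, a rolling mean of recent gaps, and hour of day.
    df = pd.DataFrame({"y": s.to_numpy()}, index=s.index)
    for k in range(1, n_lags + 1):
        df[f"lag_{k}"] = df["y"].shift(k)
    gap_hours = s.index.to_series().diff().dt.total_seconds() / 3600.0
    df["gap_prev_h"] = gap_hours.to_numpy()
    df["gap_mean_3"] = gap_hours.rolling(3).mean().to_numpy()
    df["hour_of_day"] = s.index.hour
    return df.dropna()

feats = make_features(series)
X, y = feats.drop(columns="y"), feats["y"]

# Hold out the last 20% of observations and fit a plain regressor on the rest.
split = int(len(X) * 0.8)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X.iloc[:split], y.iloc[:split])
mae = np.mean(np.abs(model.predict(X.iloc[split:]) - y.iloc[split:].to_numpy()))
print(f"MAE with timing features: {mae:.4f}")

Dropping the gap-based columns from make_features gives a quick baseline for judging how much the timing information alone contributes to forecasting accuracy.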
Related papers
- Time Series Data Augmentation as an Imbalanced Learning Problem [2.5536554335016417]
We use oversampling strategies to create synthetic time series observations and improve the accuracy of forecasting models.
We carried out experiments using 7 different databases that contain a total of 5502 univariate time series.
We found that the proposed solution outperforms both a global and a local model, thus providing a better trade-off between these two approaches.
arXiv Detail & Related papers (2024-04-29T09:27:15Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present the Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai)
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - One-Step Time Series Forecasting Using Variational Quantum Circuits [1.1934558041641545]
Machine learning scientists define a time series as a set of observations recorded over consistent time steps.
Time is of great essence in this forecasting task, as it determines how the data are ordered across the dataset and influences the final result.
Quantum computers may prove to be a better option for perceiving the trends in the time series by exploiting quantum mechanical phenomena.
arXiv Detail & Related papers (2022-07-16T16:50:28Z) - Split Time Series into Patches: Rethinking Long-term Series Forecasting
with Dateformer [17.454822366228335]
Time is one of the most significant characteristics of time series, yet it has received insufficient attention.
We propose Dateformer, which turns attention to modeling time instead of following the above practice.
Dateformer yields state-of-the-art accuracy with a remarkable 40% relative improvement, and broadens the maximum credible forecasting range to a half-yearly level.
arXiv Detail & Related papers (2022-07-12T08:58:44Z) - Towards Spatio-Temporal Aware Traffic Time Series Forecasting--Full
Version [37.09531298150374]
Traffic time series forecasting is challenging because temporal patterns are complex: for the same time series, patterns may vary across time, where, for example, there exist periods during a day showing stronger temporal correlations.
Spatio-temporal agnostic models employ a shared parameter space irrespective of the spatial locations and the time periods, assuming that temporal patterns are similar across locations and do not evolve across time, which may not always hold.
We propose a framework that aims at turning spatio-temporal agnostic models into spatio-temporal aware models.
arXiv Detail & Related papers (2022-03-29T16:44:56Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains, such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z) - Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove the time series' noise.
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - Deep Transformer Models for Time Series Forecasting: The Influenza
Prevalence Case [2.997238772148965]
Time series data are prevalent in many scientific and engineering disciplines.
We present a new approach to time series forecasting using Transformer-based machine learning models.
We show that the forecasting results produced by our approach are favorably comparable to the state-of-the-art.
arXiv Detail & Related papers (2020-01-23T00:22:22Z)