Interpretable Feature Construction for Time Series Extrinsic Regression
- URL: http://arxiv.org/abs/2103.10247v1
- Date: Mon, 15 Mar 2021 08:12:19 GMT
- Title: Interpretable Feature Construction for Time Series Extrinsic Regression
- Authors: Dominique Gay, Alexis Bondu, Vincent Lemaire, Marc Boullé
- Abstract summary: In some application domains, the target variable is numerical and the problem is known as time series extrinsic regression (TSER).
We suggest an extension of a Bayesian method for robust and interpretable feature construction and selection in the context of TSER.
Our approach tackles TSER in a relational way: (i) we build several simple representations of the time series, which are stored in a relational data scheme; then (ii) a propositionalisation technique is applied to the secondary tables to build interpretable features and "flatten" the data.
- Score: 0.028675177318965035
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Supervised learning of time series data has been extensively studied for the
case of a categorical target variable. In some application domains, e.g.,
energy, environment and health monitoring, the target variable is numerical and
the problem is known as time series extrinsic regression (TSER). In the
literature, some well-known time series classifiers have been extended to TSER
problems. As the first benchmarking studies have focused on predictive
performance, very little attention has been given to interpretability. To fill
this gap, in this paper we suggest an extension of a Bayesian method for robust
and interpretable feature construction and selection in the context of TSER.
Our approach tackles TSER in a relational way: (i) we build several simple
representations of the time series, which are stored in a relational data
scheme; (ii) a propositionalisation technique (based on classical aggregation
and selection functions from the relational data field) is applied to the
secondary tables to build interpretable features and "flatten" the data; and
(iii) the constructed features are filtered through a Bayesian Maximum A
Posteriori approach. The resulting transformed data can be processed with
various existing regressors. Experimental validation on various benchmark data
sets demonstrates the benefits of the suggested approach.
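As a rough illustration of the three-step pipeline described above, the following Python sketch builds a few simple representations of each series, flattens them with classical aggregation functions, and filters the resulting features before handing them to an off-the-shelf regressor. The representations, the aggregates, and the mutual-information filter are illustrative stand-ins; in particular, the paper's Bayesian Maximum A Posteriori selection is replaced here by a generic univariate relevance filter.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import mutual_info_regression
from sklearn.ensemble import RandomForestRegressor

# (i) Build simple representations of each series and store them as
#     secondary tables of a relational scheme (one row per time point).
def build_representations(series):                      # series: 1-D array
    t = np.arange(len(series))
    return {
        "raw":        pd.DataFrame({"t": t, "value": series}),
        "derivative": pd.DataFrame({"t": t[1:], "value": np.diff(series)}),
        "cumsum":     pd.DataFrame({"t": t, "value": np.cumsum(series)}),
    }

# (ii) Propositionalisation: flatten each secondary table with classical
#      aggregation functions, yielding one interpretable feature per
#      (representation, aggregate) pair.
AGGREGATES = {"mean": np.mean, "std": np.std, "min": np.min,
              "max": np.max, "median": np.median}

def propositionalise(dataset):                          # dataset: list of 1-D arrays
    rows = []
    for series in dataset:
        feats = {}
        for rep_name, table in build_representations(series).items():
            for agg_name, agg in AGGREGATES.items():
                feats[f"{rep_name}_{agg_name}"] = agg(table["value"].values)
        rows.append(feats)
    return pd.DataFrame(rows)

# (iii) Feature filtering.  The paper relies on a Bayesian MAP criterion;
#       as a stand-in, this sketch keeps features whose mutual information
#       with the numerical target is above a threshold.
def filter_features(X, y, threshold=0.01):
    mi = mutual_info_regression(X.values, y)
    return X.loc[:, mi > threshold]

# Usage with synthetic data: the transformed table feeds any existing regressor.
rng = np.random.default_rng(0)
series_list = [rng.normal(size=100).cumsum() for _ in range(50)]
y = np.array([s.mean() for s in series_list])           # toy numerical target
X = filter_features(propositionalise(series_list), y)
RandomForestRegressor(n_estimators=100).fit(X, y)
```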
Related papers
- Metadata Matters for Time Series: Informative Forecasting with Transformers [70.38241681764738]
We propose a Metadata-informed Time Series Transformer (MetaTST) for time series forecasting.
To handle the unstructured nature of metadata, MetaTST formalizes it into natural language using pre-designed templates.
A Transformer encoder is employed to communicate series and metadata tokens, extending the series representations with metadata information.
arXiv Detail & Related papers (2024-10-04T11:37:55Z)
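A minimal sketch of the template idea mentioned in the entry above: structured metadata is turned into a natural-language sentence that a Transformer encoder could then tokenize alongside the series tokens. The template and field names below are hypothetical, not MetaTST's actual design.

```python
# Illustrative template: metadata fields are assumptions, not MetaTST's schema.
def metadata_to_text(meta: dict) -> str:
    template = ("This series records {variable} for {entity}, "
                "sampled every {frequency}, in the {domain} domain.")
    return template.format(**meta)

print(metadata_to_text({"variable": "electricity load", "entity": "substation 12",
                        "frequency": "15 minutes", "domain": "energy"}))
```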
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Temporal Treasure Hunt: Content-based Time Series Retrieval System for Discovering Insights [34.1973242428317]
Time series data is ubiquitous across various domains such as finance, healthcare, and manufacturing.
The ability to perform Content-based Time Series Retrieval (CTSR) is crucial for identifying unknown time series examples.
We introduce a CTSR benchmark dataset that comprises time series data from a variety of domains.
arXiv Detail & Related papers (2023-11-05T04:12:13Z)
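The retrieval task in the entry above can be illustrated with a tiny baseline: rank the database by z-normalized Euclidean distance to the query. This is only a sketch of the task setup, not one of the benchmarked methods, and it assumes all series share the same length.

```python
import numpy as np

def znorm(x):
    return (x - x.mean()) / (x.std() + 1e-8)

def retrieve(query, database, k=5):
    # Content-based retrieval: return indices of the k closest series.
    q = znorm(np.asarray(query, dtype=float))
    dists = [np.linalg.norm(q - znorm(np.asarray(s, dtype=float)))
             for s in database]
    return np.argsort(dists)[:k]

# Toy usage: a noisy copy of series 7 should retrieve series 7 first.
rng = np.random.default_rng(1)
db = [rng.normal(size=128).cumsum() for _ in range(100)]
print(retrieve(db[7] + rng.normal(scale=0.1, size=128), db, k=3))
```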
- Few-Shot Forecasting of Time-Series with Heterogeneous Channels [4.635820333232681]
We develop a model composed of permutation-invariant deep set blocks which incorporate a temporal embedding.
We show through experiments that our model provides a good generalization, outperforming baselines carried over from simpler scenarios.
arXiv Detail & Related papers (2022-04-07T14:02:15Z)
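The permutation-invariant deep-set idea from the entry above can be sketched as follows: each channel is encoded independently, the channel codes are mean-pooled (so the model is indifferent to channel order and count), and a small head produces the forecast. The layer sizes and the sinusoidal temporal embedding are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class DeepSetForecaster(nn.Module):
    # phi encodes each channel independently, mean-pooling over channels makes
    # the model invariant to channel order, rho maps the pooled code to a forecast.
    def __init__(self, series_len, horizon, hidden=64):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(series_len, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        self.rho = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, horizon))
        # toy sinusoidal temporal embedding shared by all channels
        t = torch.arange(series_len, dtype=torch.float32)
        self.register_buffer("temb", torch.sin(t / 10.0))

    def forward(self, x):                  # x: (batch, channels, series_len)
        h = self.phi(x + self.temb)        # encode each channel
        pooled = h.mean(dim=1)             # permutation-invariant pooling
        return self.rho(pooled)            # (batch, horizon)

model = DeepSetForecaster(series_len=96, horizon=24)
print(model(torch.randn(8, 5, 96)).shape)  # works for any number of channels
```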
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Optimal Latent Space Forecasting for Large Collections of Short Time Series Using Temporal Matrix Factorization [0.0]
It is a common practice to evaluate multiple methods and choose one of these methods or an ensemble for producing the best forecasts.
We propose a framework for forecasting short high-dimensional time series data by combining low-rank temporal matrix factorization and optimal model selection on latent time series.
arXiv Detail & Related papers (2021-12-15T11:39:21Z)
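A minimal sketch of the low-rank idea from the entry above: factor the collection of short series into a few latent temporal factors, forecast those factors with a simple model, and map the forecasts back to every series. The SVD factorization and the AR(1) latent model are illustrative stand-ins for the paper's temporal matrix factorization and optimal model selection.

```python
import numpy as np

def factorize(Y, rank):                    # Y: (n_series, T)
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    W = U[:, :rank] * s[:rank]             # series loadings, (n_series, rank)
    X = Vt[:rank]                          # latent time series, (rank, T)
    return W, X

def ar1_forecast(x, horizon):              # least-squares AR(1) on one latent series
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    out, last = [], x[-1]
    for _ in range(horizon):
        last = phi * last
        out.append(last)
    return np.array(out)

def forecast(Y, rank=3, horizon=12):
    W, X = factorize(Y, rank)
    X_future = np.stack([ar1_forecast(x, horizon) for x in X])
    return W @ X_future                    # forecasts for all series, (n_series, horizon)

rng = np.random.default_rng(2)
Y = rng.normal(size=(200, 36)).cumsum(axis=1)   # 200 short series of length 36
print(forecast(Y).shape)                        # (200, 12)
```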
- Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z)
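A minimal sketch of the cluster-then-forecast framework from the entry above, instantiated with k-means and per-cluster linear autoregression; both choices are illustrative stand-ins for the framework's pluggable clustering and forecasting steps.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def lag_windows(y, p):                       # supervised pairs: p lags -> next value
    X = np.stack([y[i:i + p] for i in range(len(y) - p)])
    return X, y[p:]

def cluster_and_conquer(Y, n_clusters=4, p=8):
    # Stage 1: group similar series; Stage 2: fit one AR model per cluster on
    # the pooled lag windows; Stage 3: forecast each series with its cluster's model.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(Y)
    models = {}
    for c in range(n_clusters):
        Xs, ys = zip(*(lag_windows(y, p) for y in Y[labels == c]))
        models[c] = LinearRegression().fit(np.vstack(Xs), np.concatenate(ys))
    return np.array([models[c].predict(y[-p:].reshape(1, -1))[0]
                     for y, c in zip(Y, labels)])

rng = np.random.default_rng(3)
Y = rng.normal(size=(60, 120)).cumsum(axis=1)    # toy high-dimensional panel
print(cluster_and_conquer(Y).shape)              # one one-step forecast per series
```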
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains such as climate, economics and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
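The time-series-to-network mapping from the entry above can be illustrated with a natural visibility graph, whose degree statistics then serve as series-level features. The visibility graph and the particular degree features are one common choice, not necessarily the construction used in the paper.

```python
import numpy as np

def visibility_graph(y):
    # Natural visibility graph: two points are connected if every intermediate
    # point lies strictly below the straight line (chord) joining them.
    n = len(y)
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            between = y[i + 1:j]
            chord = y[j] + (y[i] - y[j]) * (j - k) / (j - i)
            adj[i, j] = adj[j, i] = np.all(between < chord)
    return adj

def network_features(y):
    # Simple degree-based summaries of the network, usable as series features.
    deg = visibility_graph(np.asarray(y, dtype=float)).sum(axis=0)
    return {"mean_degree": deg.mean(), "max_degree": deg.max(),
            "degree_std": deg.std()}

rng = np.random.default_rng(4)
print(network_features(rng.normal(size=200)))   # noise vs. trends give different profiles
```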
- Instance-wise Graph-based Framework for Multivariate Time Series Forecasting [69.38716332931986]
We propose a simple yet efficient instance-wise graph-based framework to exploit the inter-dependencies of different variables at different time stamps.
The key idea of our framework is to aggregate information from the historical time series of the different variables into the current time series that we need to forecast.
arXiv Detail & Related papers (2021-09-14T07:38:35Z)
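A minimal sketch of the aggregation idea from the entry above: build an instance-wise dependency graph over the variables (here from correlations on the observed window, an illustrative assumption) and let each variable's forecast use both its own lags and a graph-weighted aggregate of the other variables' lags. This is a simplified stand-in, not the paper's model.

```python
import numpy as np
from sklearn.linear_model import Ridge

def forecast_next(series, p=8):                 # series: (n_vars, T) multivariate history
    A = np.corrcoef(series)                     # instance-wise dependency graph
    np.fill_diagonal(A, 0.0)
    A = np.abs(A) / (np.abs(A).sum(axis=1, keepdims=True) + 1e-8)
    agg = A @ series                            # graph-aggregated signal per variable
    preds = []
    for v in range(series.shape[0]):
        # features: own lags + aggregated lags of the other variables
        X = np.stack([np.concatenate([series[v, i:i + p], agg[v, i:i + p]])
                      for i in range(series.shape[1] - p)])
        y = series[v, p:]
        model = Ridge(alpha=1.0).fit(X, y)
        x_next = np.concatenate([series[v, -p:], agg[v, -p:]]).reshape(1, -1)
        preds.append(model.predict(x_next)[0])
    return np.array(preds)

rng = np.random.default_rng(5)
data = rng.normal(size=(6, 300)).cumsum(axis=1)    # 6 toy variables
print(forecast_next(data))                         # one-step forecast per variable
```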
- Time Series is a Special Sequence: Forecasting with Sample Convolution and Interaction [9.449017120452675]
Time series is a special type of sequence data, a set of observations collected at even intervals of time and ordered chronologically.
Existing deep learning techniques use generic sequence models for time series analysis, which ignore some of its unique properties.
We propose a novel neural network architecture and apply it for the time series forecasting problem, wherein we conduct sample convolution and interaction at multiple resolutions for temporal modeling.
arXiv Detail & Related papers (2021-06-17T08:15:04Z)
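A loosely related sketch of the sample-convolution-and-interaction idea from the entry above: the sequence is downsampled into even and odd sub-sequences (two temporal resolutions), and each half is updated with features convolved from the other half before the halves are merged for forecasting. The single level and the linear forecasting head are simplifications, not the paper's architecture.

```python
import torch
import torch.nn as nn

class SplitAndInteract(nn.Module):
    def __init__(self, channels, kernel=5):
        super().__init__()
        pad = kernel // 2
        self.conv_from_odd = nn.Conv1d(channels, channels, kernel, padding=pad)
        self.conv_from_even = nn.Conv1d(channels, channels, kernel, padding=pad)

    def forward(self, x):                        # x: (batch, channels, time), even length assumed
        even, odd = x[..., ::2], x[..., 1::2]    # two lower-resolution sub-sequences
        even = even + torch.tanh(self.conv_from_odd(odd))   # interaction: odd -> even
        odd = odd + torch.tanh(self.conv_from_even(even))   # interaction: even -> odd
        return torch.cat([even, odd], dim=-1)    # merged multi-resolution features

class TinyForecaster(nn.Module):
    def __init__(self, channels, seq_len, horizon):
        super().__init__()
        self.block = SplitAndInteract(channels)
        self.head = nn.Linear(seq_len, horizon)  # project the time axis to the horizon

    def forward(self, x):
        return self.head(self.block(x))          # (batch, channels, horizon)

model = TinyForecaster(channels=7, seq_len=96, horizon=24)
print(model(torch.randn(4, 7, 96)).shape)        # torch.Size([4, 7, 24])
```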
- Explainable Multivariate Time Series Classification: A Deep Neural Network Which Learns To Attend To Important Variables As Well As Informative Time Intervals [32.30627405832656]
Time series data is prevalent in a wide variety of real-world applications.
A key criterion for understanding such predictive models is elucidating and quantifying the contribution of time-varying input variables to the classification.
We introduce a novel, modular, convolution-based feature extraction and attention mechanism that simultaneously identifies the variables as well as time intervals which determine the classification output.
arXiv Detail & Related papers (2020-11-23T19:16:46Z)
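The attend-to-variables-and-intervals idea from the last entry above can be sketched with two softmax attention maps, one over variables and one over time steps, whose weights are returned alongside the class logits so they can be inspected. The architecture below is a simplified stand-in, not the paper's network.

```python
import torch
import torch.nn as nn

class AttentiveClassifier(nn.Module):
    def __init__(self, n_vars, seq_len, n_classes, hidden=32):
        super().__init__()
        self.var_scores = nn.Linear(seq_len, 1)      # one score per variable
        self.time_scores = nn.Linear(n_vars, 1)      # one score per time step
        self.classifier = nn.Sequential(nn.Linear(n_vars * seq_len, hidden),
                                        nn.ReLU(), nn.Linear(hidden, n_classes))

    def forward(self, x):                            # x: (batch, n_vars, seq_len)
        a_var = torch.softmax(self.var_scores(x).squeeze(-1), dim=-1)      # (batch, n_vars)
        a_time = torch.softmax(self.time_scores(x.transpose(1, 2)).squeeze(-1), dim=-1)  # (batch, seq_len)
        weighted = x * a_var.unsqueeze(-1) * a_time.unsqueeze(1)           # attended input
        return self.classifier(weighted.flatten(1)), a_var, a_time

model = AttentiveClassifier(n_vars=6, seq_len=100, n_classes=3)
logits, var_attention, time_attention = model(torch.randn(2, 6, 100))
print(var_attention.shape, time_attention.shape)     # which variables / time steps matter
```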
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated information and is not responsible for any consequences of its use.