Series Saliency: Temporal Interpretation for Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2012.09324v1
- Date: Wed, 16 Dec 2020 23:48:00 GMT
- Title: Series Saliency: Temporal Interpretation for Multivariate Time Series Forecasting
- Authors: Qingyi Pan, Wenbo Hu, Jun Zhu
- Abstract summary: We present the series saliency framework for temporal interpretation for time series forecasting.
By extracting the "series images" from the sliding windows of the time series, we apply the saliency map segmentation.
Our framework generates temporal interpretations for the time series forecasting task while produces accurate time series forecast.
- Score: 30.054015098590874
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting is an important yet challenging task. Though deep
learning methods have recently been developed to give superior forecasting
results, it is crucial to improve the interpretability of time series models.
Previous interpretation methods, including methods for general neural
networks and attention-based methods, mainly consider interpretation in the
feature dimension while ignoring the crucial temporal dimension. In this paper,
we present the series saliency framework for temporal interpretation of
multivariate time series forecasting, which considers the forecasting
interpretation in both feature and temporal dimensions. By extracting "series
images" from sliding windows of the time series, we apply saliency map
segmentation following the smallest destroying region principle. The series
saliency framework can be applied to any well-defined deep learning model and
works as a data augmentation technique to obtain more accurate forecasts.
Experimental results on several real datasets demonstrate that our framework
generates temporal interpretations for the time series forecasting task while
producing accurate forecasts.
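The sliding-window "series image" extraction and the smallest-destroying-region mask described above can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration rather than the authors' implementation: the toy forecaster, the per-feature-mean reference signal used as the perturbation, the L1 sparsity penalty, and the helper names (`series_images`, `saliency_mask`) are all assumptions made for the example.

```python
# Minimal sketch of a series-saliency-style mask (assumed details, see note above).
import torch
import torch.nn as nn


def series_images(series: torch.Tensor, window: int, horizon: int):
    """Slide a window over a (length, features) series, yielding
    (window, features) "series images" and their forecast targets."""
    for t in range(series.shape[0] - window - horizon + 1):
        yield series[t:t + window], series[t + window:t + window + horizon]


def saliency_mask(model: nn.Module, x: torch.Tensor, y: torch.Tensor,
                  steps: int = 300, lam: float = 0.05) -> torch.Tensor:
    """Learn a soft mask over the (window, features) "series image" x.

    Smallest-destroying-region style objective: perturbing the masked cells
    (here: replacing them with the per-feature mean, an assumed reference)
    should degrade the forecast as much as possible while the mask stays
    small (sparsity penalty)."""
    for p in model.parameters():          # freeze the forecaster; only the
        p.requires_grad_(False)           # mask logits are optimized
    reference = x.mean(dim=0, keepdim=True).expand_as(x)
    logits = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([logits], lr=0.05)
    mse = nn.MSELoss()
    for _ in range(steps):
        mask = torch.sigmoid(logits)
        x_pert = (1 - mask) * x + mask * reference    # destroy masked cells
        forecast = model(x_pert.unsqueeze(0)).squeeze(0)
        # maximize forecast error (negative MSE) while keeping the mask sparse
        loss = -mse(forecast, y) + lam * mask.mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.sigmoid(logits).detach()


if __name__ == "__main__":
    window, horizon, features = 24, 6, 3
    # toy stand-in forecaster: flatten the window, predict the next `horizon` steps
    model = nn.Sequential(
        nn.Flatten(),
        nn.Linear(window * features, horizon * features),
        nn.Unflatten(1, (horizon, features)),
    )
    series = torch.randn(200, features)               # synthetic data
    x, y = next(series_images(series, window, horizon))
    mask = saliency_mask(model, x, y)
    print(mask.shape)  # torch.Size([24, 3]): saliency per (time step, feature)
```

The learned mask assigns a score to every (time step, feature) cell of the window, matching the paper's goal of interpretation along both the temporal and feature dimensions.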
Related papers
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a masked encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain [54.67888148566323]
We introduce three large-scale time series forecasting datasets from the cloud operations domain.
We show it is a strong zero-shot baseline and benefits from further scaling, both in model and dataset size.
Accompanying these datasets and results is a suite of comprehensive benchmark results comparing classical and deep learning baselines to our pre-trained method.
arXiv Detail & Related papers (2023-10-08T08:09:51Z) - MPPN: Multi-Resolution Periodic Pattern Network For Long-Term Time Series Forecasting [19.573651104129443]
Long-term time series forecasting plays an important role in various real-world scenarios.
Recent deep learning methods for long-term series forecasting tend to capture the intricate patterns of time series via decomposition-based or sampling-based approaches.
We propose a novel deep learning network architecture, named Multi-resolution Periodic Pattern Network (MPPN), for long-term series forecasting.
arXiv Detail & Related papers (2023-06-12T07:00:37Z) - Ripple: Concept-Based Interpretation for Raw Time Series Models in Education [5.374524134699487]
Time series is the most prevalent form of input data for educational prediction tasks.
We propose an approach that utilizes irregular multivariate time series modeling with graph neural networks to achieve comparable or better accuracy.
We analyze these advances in the education domain, addressing the task of early student performance prediction.
arXiv Detail & Related papers (2022-12-02T12:26:00Z) - Respecting Time Series Properties Makes Deep Time Series Forecasting Perfect [3.830797055092574]
How to handle time features is the core question for any time series forecasting model.
In this paper, we rigorously analyze three prevalent but deficient or unfounded deep time series forecasting mechanisms.
We propose a novel time series forecasting network, RTNet, on the basis of the aforementioned analysis.
arXiv Detail & Related papers (2022-07-22T08:34:31Z) - Split Time Series into Patches: Rethinking Long-term Series Forecasting with Dateformer [17.454822366228335]
Time is one of the most significant characteristics of time-series, yet has received insufficient attention.
We propose Dateformer, which turns attention to modeling time instead of following the above practice.
Dateformer yields state-of-the-art accuracy with a remarkable 40% relative improvement, and broadens the maximum credible forecasting range to a half-yearly level.
arXiv Detail & Related papers (2022-07-12T08:58:44Z) - Monitoring Time Series With Missing Values: a Deep Probabilistic Approach [1.90365714903665]
We introduce a new architecture for time series monitoring based on a combination of state-of-the-art methods for forecasting high-dimensional time series with full probabilistic handling of uncertainty.
We demonstrate the advantage of the architecture for time series forecasting and novelty detection, in particular with partially missing data, and empirically evaluate and compare the architecture to state-of-the-art approaches on a real-world data set.
arXiv Detail & Related papers (2022-03-09T17:53:47Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.