TimeTuner: Diagnosing Time Representations for Time-Series Forecasting
with Counterfactual Explanations
- URL: http://arxiv.org/abs/2307.09916v3
- Date: Thu, 27 Jul 2023 04:19:43 GMT
- Title: TimeTuner: Diagnosing Time Representations for Time-Series Forecasting
with Counterfactual Explanations
- Authors: Jianing Hao, Qing Shi, Yilin Ye, and Wei Zeng
- Abstract summary: This paper contributes a novel visual analytics framework, TimeTuner, to help analysts understand how model behaviors are associated with localized correlations, stationarity, and granularity of time-series representations.
We show that TimeTuner can help characterize time-series representations and guide the feature engineering processes.
- Score: 3.8357850372472915
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning (DL) approaches are being increasingly used for time-series
forecasting, with many efforts devoted to designing complex DL models. Recent
studies have shown that the DL success is often attributed to effective data
representations, fostering the fields of feature engineering and representation
learning. However, automated approaches for feature learning are typically
limited with respect to incorporating prior knowledge, identifying interactions
among variables, and choosing evaluation metrics to ensure that the models are
reliable. To improve on these limitations, this paper contributes a novel
visual analytics framework, namely TimeTuner, designed to help analysts
understand how model behaviors are associated with localized correlations,
stationarity, and granularity of time-series representations. The system is
built around a two-stage technique: we first leverage counterfactual
explanations to relate time-series representations, multivariate features, and
model predictions. Next, we design multiple coordinated views, including a
partition-based correlation matrix and juxtaposed bivariate stripes, and
provide a set of interactions that allow users to step into the transformation
selection process, navigate the feature space, and reason about model
performance. We instantiate TimeTuner with two
transformation methods of smoothing and sampling, and demonstrate its
applicability on real-world time-series forecasting of univariate sunspots and
multivariate air pollutants. Feedback from domain experts indicates that our
system can help characterize time-series representations and guide the feature
engineering processes.
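As a minimal illustrative sketch (not the authors' implementation), the two transformation methods named in the abstract, smoothing and sampling, can be expressed as simple NumPy operations; the function names, window size, and sampling step below are assumptions for illustration:

```python
import numpy as np

def smooth(series, window=5):
    """Moving-average smoothing: each point becomes the mean of a
    sliding window, reducing high-frequency noise in the series."""
    kernel = np.ones(window) / window
    # "valid" keeps only positions where the full window fits,
    # so the output is len(series) - window + 1 points long
    return np.convolve(series, kernel, mode="valid")

def sample(series, step=2):
    """Down-sampling: keep every `step`-th observation, coarsening
    the granularity of the time-series representation."""
    return series[::step]

# Example: 100 noisy sinusoidal observations (sunspot-like shape)
rng = np.random.default_rng(0)
raw = np.sin(np.linspace(0, 6 * np.pi, 100)) + 0.3 * rng.standard_normal(100)

smoothed = smooth(raw, window=5)  # 96 points, noise attenuated
coarse = sample(raw, step=4)      # 25 points, coarser granularity
```

A visual analytics tool in this vein would typically let the user sweep parameters such as `window` and `step` and compare how the resulting representations affect downstream forecasts.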
Related papers
- TSFeatLIME: An Online User Study in Enhancing Explainability in Univariate Time Series Forecasting [1.9314780151274307]
This paper presents TSFeatLIME, a framework extending TSLIME.
TSFeatLIME integrates an auxiliary feature into the surrogate model and considers the pairwise Euclidean distances between the queried time series and the generated samples.
Results show that the surrogate model under the TSFeatLIME framework better simulates the behaviour of the black-box model when distance is taken into account, without sacrificing accuracy.
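The pairwise Euclidean distances mentioned above can be sketched as follows; this is a hypothetical illustration (the exponential kernel and `kernel_width` parameter are assumptions, mirroring the LIME-style weighting such surrogate models commonly use), not the TSFeatLIME code itself:

```python
import numpy as np

def euclidean_distances(query, samples):
    """Pairwise Euclidean distance between one queried time series
    (shape: T) and a batch of generated samples (shape: N x T)."""
    return np.linalg.norm(samples - query, axis=1)

def kernel_weights(distances, kernel_width=1.0):
    """Exponential kernel: samples closer to the query receive
    larger weights when fitting the surrogate model."""
    return np.exp(-(distances ** 2) / (kernel_width ** 2))

query = np.array([1.0, 2.0, 3.0])
samples = np.array([[1.0, 2.0, 3.0],   # identical to the query
                    [4.0, 2.0, 3.0]])  # differs by 3 in one step

d = euclidean_distances(query, samples)
w = kernel_weights(d, kernel_width=3.0)
```

Here the identical sample gets distance 0 and full weight 1, while the perturbed sample is down-weighted according to its distance from the query.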
arXiv Detail & Related papers (2024-09-24T10:24:53Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Multi-Patch Prediction: Adapting LLMs for Time Series Representation
Learning [22.28251586213348]
aLLM4TS is an innovative framework that adapts Large Language Models (LLMs) for time-series representation learning.
A distinctive element of our framework is the patch-wise decoding layer, which departs from previous methods reliant on sequence-level decoding.
arXiv Detail & Related papers (2024-02-07T13:51:26Z) - TACTiS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series [57.4208255711412]
Building on copula theory, we propose a simplified objective for the recently-introduced transformer-based attentional copulas (TACTiS).
We show that the resulting model has significantly better training dynamics and achieves state-of-the-art performance across diverse real-world forecasting tasks.
arXiv Detail & Related papers (2023-10-02T16:45:19Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z) - Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic
Representations [1.8262547855491458]
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to represent time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
arXiv Detail & Related papers (2022-04-10T22:24:11Z) - Self-Attention Neural Bag-of-Features [103.70855797025689]
We build on the recently introduced 2D-Attention and reformulate the attention learning methodology.
We propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information.
arXiv Detail & Related papers (2022-01-26T17:54:14Z) - PSEUDo: Interactive Pattern Search in Multivariate Time Series with
Locality-Sensitive Hashing and Relevance Feedback [3.347485580830609]
PSEUDo is an adaptive feature learning technique for exploring visual patterns in multi-track sequential data.
Our algorithm features sub-linear training and inference time.
We demonstrate the superiority of PSEUDo in terms of efficiency, accuracy, and steerability.
arXiv Detail & Related papers (2021-04-30T13:00:44Z) - Multivariate Time-series Anomaly Detection via Graph Attention Network [27.12694738711663]
Anomaly detection on multivariate time-series is of great importance in both data mining research and industrial applications.
One major limitation of existing methods is that they do not explicitly capture the relationships between different time series.
We propose a novel self-supervised framework for multivariate time-series anomaly detection to address this issue.
arXiv Detail & Related papers (2020-09-04T07:46:19Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.