Benchmarking Deep Learning Interpretability in Time Series Predictions
- URL: http://arxiv.org/abs/2010.13924v1
- Date: Mon, 26 Oct 2020 22:07:53 GMT
- Title: Benchmarking Deep Learning Interpretability in Time Series Predictions
- Authors: Aya Abdelsalam Ismail, Mohamed Gunady, Héctor Corrada Bravo, and
Soheil Feizi
- Abstract summary: Saliency methods are used extensively to highlight the importance of input features in model predictions.
We set out to extensively compare the performance of various saliency-based interpretability methods across diverse neural architectures.
- Score: 41.13847656750174
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Saliency methods are used extensively to highlight the importance of input
features in model predictions. These methods are mostly used in vision and
language tasks, and their application to time series data is relatively
unexplored. In this paper, we set out to extensively compare the performance of
various saliency-based interpretability methods across diverse neural
architectures, including Recurrent Neural Networks, Temporal Convolutional
Networks, and Transformers, on a new benchmark of synthetic time series data. We
propose and report multiple metrics to empirically evaluate the performance of
saliency methods for detecting feature importance over time using both
precision (i.e., whether identified features contain meaningful signals) and
recall (i.e., the number of features with signal identified as important).
Through several experiments, we show that (i) in general, network architectures
and saliency methods fail to reliably and accurately identify feature
importance over time in time series data, (ii) this failure is mainly due to
the conflation of time and feature domains, and (iii) the quality of saliency
maps can be improved substantially by using our proposed two-step temporal
saliency rescaling (TSR) approach that first calculates the importance of each
time step before calculating the importance of each feature at a time step.
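To illustrate the two-step TSR idea, the sketch below rescales a base saliency map by the relevance of each time step. This is a minimal sketch under assumptions, not the paper's reference implementation: `saliency_fn` is a hypothetical wrapper around any base saliency method, masking a time step with zeros is an assumption, and the paper's full algorithm additionally skips time steps whose relevance falls below a threshold.

```python
import numpy as np

def temporal_saliency_rescaling(saliency_fn, x, mask_value=0.0):
    """Two-step rescaling of a saliency map for one time series input.

    saliency_fn : hypothetical callable returning a (time, features)
                  importance map for an input of the same shape.
    x           : np.ndarray of shape (time, features).
    mask_value  : value used to mask out an entire time step (assumption).
    """
    base = saliency_fn(x)                       # per-(time, feature) saliency
    time_relevance = np.zeros(x.shape[0])

    # Step 1: relevance of each time step = total change in the saliency
    # map when that whole time step is masked out.
    for t in range(x.shape[0]):
        x_masked = x.copy()
        x_masked[t, :] = mask_value
        time_relevance[t] = np.abs(base - saliency_fn(x_masked)).sum()

    # Step 2: importance of each feature at a time step = base feature
    # saliency rescaled by the relevance of that time step.
    return base * time_relevance[:, None]
```

In this form, a feature is marked important only when both its own saliency and the relevance of its time step are high, which is how the sketch separates the time and feature domains.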
Related papers
- TSI-Bench: Benchmarking Time Series Imputation [52.27004336123575]
TSI-Bench is a comprehensive benchmark suite for time series imputation utilizing deep learning techniques.
The TSI-Bench pipeline standardizes experimental settings to enable fair evaluation of imputation algorithms.
TSI-Bench innovatively provides a systematic paradigm to tailor time series forecasting algorithms for imputation purposes.
arXiv Detail & Related papers (2024-06-18T16:07:33Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Multivariate Time Series Classification: A Deep Learning Approach [1.0742675209112622]
This paper investigates different methods and various neural network architectures applicable in the time series classification domain.
Data is obtained from a fleet of gas sensors that measure and track quantities such as oxygen and sound.
With the help of this data, we can detect events such as occupancy in a specific environment.
arXiv Detail & Related papers (2023-07-05T12:50:48Z)
- Self-Attention Neural Bag-of-Features [103.70855797025689]
We build on the recently introduced 2D-Attention and reformulate the attention learning methodology.
We propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information.
arXiv Detail & Related papers (2022-01-26T17:54:14Z)
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains, such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
- Multivariate Time Series Imputation by Graph Neural Networks [13.308026049048717]
We introduce a graph neural network architecture, named GRIL, which aims at reconstructing missing data in different channels of a multivariate time series.
Preliminary results show that our model outperforms state-of-the-art methods in the imputation task on relevant benchmarks.
arXiv Detail & Related papers (2021-07-31T17:47:10Z)
- Temporal Dependencies in Feature Importance for Time Series Predictions [4.082348823209183]
We propose WinIT, a framework for evaluating feature importance in time series prediction settings.
We demonstrate how the solution improves the appropriate attribution of features within time steps.
WinIT achieves 2.47x better performance than FIT and other feature importance methods on the real-world clinical MIMIC mortality task.
arXiv Detail & Related papers (2021-07-29T20:31:03Z)
- PSEUDo: Interactive Pattern Search in Multivariate Time Series with Locality-Sensitive Hashing and Relevance Feedback [3.347485580830609]
PSEUDo is an adaptive feature learning technique for exploring visual patterns in multi-track sequential data.
Our algorithm features sub-linear training and inference time.
We demonstrate the superiority of PSEUDo in terms of efficiency, accuracy, and steerability.
arXiv Detail & Related papers (2021-04-30T13:00:44Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Deep ConvLSTM with self-attention for human activity decoding using wearables [0.0]
We propose a deep neural network architecture that not only captures features from multiple sensor time series but also selects important time points.
We show the validity of the proposed approach across different data sampling strategies and demonstrate that the self-attention mechanism gives a significant improvement.
The proposed methods open avenues for better decoding of human activity from multiple body sensors over extended periods of time.
arXiv Detail & Related papers (2020-05-02T04:30:31Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.