Mimic: An adaptive algorithm for multivariate time series classification
- URL: http://arxiv.org/abs/2111.04273v1
- Date: Mon, 8 Nov 2021 04:47:31 GMT
- Title: Mimic: An adaptive algorithm for multivariate time series classification
- Authors: Yuhui Wang, Diane J. Cook
- Abstract summary: Time series data are valuable but are often inscrutable.
Gaining trust in time series classifiers for finance, healthcare, and other critical applications may rely on creating interpretable models.
We propose a novel Mimic algorithm that retains the predictive accuracy of the strongest classifiers while introducing interpretability.
- Score: 11.49627617337276
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series data are valuable but are often inscrutable. Gaining trust in
time series classifiers for finance, healthcare, and other critical
applications may rely on creating interpretable models. Researchers have
previously been forced to decide between interpretable methods that lack
predictive power and deep learning methods that lack transparency. In this
paper, we propose a novel Mimic algorithm that retains the predictive accuracy
of the strongest classifiers while introducing interpretability. Mimic mirrors
the learning method of an existing multivariate time series classifier while
simultaneously producing a visual representation that enhances user
understanding of the learned model. Experiments on 26 time series datasets
support Mimic's ability to imitate a variety of time series classifiers
visually and accurately.
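
The sketch below illustrates the general mimic-learning (model distillation) recipe that underlies this idea: an interpretable student is fit to the predictions of a strong black-box teacher, and fidelity measures how faithfully the student reproduces the teacher. The specific models and the flattened-feature representation are illustrative assumptions, not the authors' Mimic algorithm, which additionally produces a visual representation of the learned model.

```python
# Generic mimic-learning sketch: distill a black-box time series classifier
# into a readable decision tree. Illustrative stand-in, not the paper's
# Mimic algorithm.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Toy multivariate time series: 200 samples, 3 variables, 50 time steps.
X = rng.normal(size=(200, 3, 50))
y = (X[:, 0, :25].mean(axis=1) > 0).astype(int)  # label depends on an early window
X_flat = X.reshape(len(X), -1)                   # naive flattening for the demo

# 1. Train the strong, opaque teacher.
teacher = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_flat, y)

# 2. Fit an interpretable student to imitate the teacher's outputs
#    (soft probabilities could be used instead of hard predictions).
student = DecisionTreeClassifier(max_depth=3, random_state=0)
student.fit(X_flat, teacher.predict(X_flat))

# Fidelity: how often the student reproduces the teacher's decision.
fidelity = (student.predict(X_flat) == teacher.predict(X_flat)).mean()
print(f"student-teacher fidelity: {fidelity:.2f}")
print(export_text(student, max_depth=2))         # the student itself is readable
```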
Related papers
- Advancing Time Series Classification with Multimodal Language Modeling [6.624754582682479]
We propose InstructTime, a novel attempt to reshape time series classification as a learning-to-generate paradigm.
The core idea is to formulate the classification of time series as a multimodal understanding task, in which both task-specific instructions and raw time series are treated as multimodal inputs.
Extensive experiments on benchmark datasets demonstrate the superior performance of InstructTime.
arXiv Detail & Related papers (2024-03-19T02:32:24Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
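
A minimal sketch of the Siamese subseries-pairing idea: draw a "past" and a "current" window from the same series and embed both with one shared encoder. The window sampler, the mean/std stand-in encoder, and the cosine-similarity check are assumptions for illustration; TimeSiam's actual pre-training objective differs in detail.

```python
# Siamese pairing sketch: two temporally ordered windows from one series,
# encoded by a shared encoder. Illustrative stand-in for TimeSiam.
import numpy as np

rng = np.random.default_rng(0)

def sample_siamese_pair(series, win):
    """Draw two windows from one series, with the 'past' strictly earlier."""
    t_cur = int(rng.integers(win, series.shape[-1] - win + 1))
    t_past = int(rng.integers(0, t_cur - win + 1))
    return series[..., t_past:t_past + win], series[..., t_cur:t_cur + win]

def encode(window):
    """Stand-in shared encoder: per-variable mean and std as the embedding."""
    return np.concatenate([window.mean(axis=-1), window.std(axis=-1)])

series = rng.normal(size=(3, 500))               # 3 variables, 500 time steps
past, current = sample_siamese_pair(series, win=64)
z_past, z_cur = encode(past), encode(current)
cos = z_past @ z_cur / (np.linalg.norm(z_past) * np.linalg.norm(z_cur))
print(f"similarity of past/current embeddings: {cos:.3f}")
```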
- Time Series Contrastive Learning with Information-Aware Augmentations [57.45139904366001]
A key component of contrastive learning is selecting appropriate augmentations that impose priors to construct feasible positive samples.
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
We propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning.
arXiv Detail & Related papers (2023-03-21T15:02:50Z)
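
To make the selection step concrete, the toy sketch below scores a few candidate augmentations with a crude proxy that balances fidelity (stay close to the original) against variety (reject near-identity views) and keeps the best scorer as the positive view. The proxy is an illustrative stand-in for InfoTS's information-aware criteria.

```python
# Toy adaptive augmentation selection for contrastive learning.
# The scoring proxy is an assumption, not InfoTS's actual criterion.
import numpy as np

rng = np.random.default_rng(0)

AUGMENTATIONS = {
    "jitter": lambda x: x + rng.normal(scale=0.1, size=x.shape),
    "scale": lambda x: x * rng.normal(loc=1.0, scale=0.2),
    "crop_pad": lambda x: np.pad(x[..., 5:], ((0, 0), (0, 5)), mode="edge"),
}

def proxy_score(x, view):
    """Fidelity (closeness to the original) gated by a variety check."""
    fidelity = -np.mean((x - view) ** 2)
    nontrivial = np.mean(np.abs(x - view)) > 1e-3   # reject near-identity views
    return fidelity if nontrivial else -np.inf

x = rng.normal(size=(3, 100))                        # one multivariate series
scored = {name: proxy_score(x, aug(x)) for name, aug in AUGMENTATIONS.items()}
best = max(scored, key=scored.get)
print(f"selected augmentation: {best}")
```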
- TimeMAE: Self-Supervised Representations of Time Series with Decoupled Masked Autoencoders [55.00904795497786]
We propose TimeMAE, a novel self-supervised paradigm for learning transferable time series representations based on transformer networks.
TimeMAE learns enriched contextual representations of time series with a bidirectional encoding scheme.
To solve the discrepancy issue incurred by newly injected masked embeddings, we design a decoupled autoencoder architecture.
arXiv Detail & Related papers (2023-03-01T08:33:16Z)
- Ti-MAE: Self-Supervised Masked Time Series Autoencoders [16.98069693152999]
We propose a novel framework named Ti-MAE, in which the input time series are assumed to follow an integrated distribution.
Ti-MAE randomly masks out embedded time series data and learns an autoencoder to reconstruct them at the point level.
Experiments on several public real-world datasets demonstrate that our masked autoencoding framework can learn strong representations directly from the raw data.
arXiv Detail & Related papers (2023-01-21T03:20:23Z)
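
A minimal masked-autoencoding sketch of this idea: random time points are zeroed out and a small network reconstructs the series from its masked view, with reconstruction quality then measured point-wise on the masked entries. The MLP and zero-masking are stand-ins for Ti-MAE's transformer encoder-decoder, and a faithful masked autoencoder would also restrict the training loss to masked positions.

```python
# Masked autoencoding on toy series; illustrative stand-in for Ti-MAE.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy dataset: 300 univariate series of length 60 (noisy sinusoids).
t = np.linspace(0, 2 * np.pi, 60)
X = np.sin(rng.uniform(0.5, 2.0, size=(300, 1)) * t) + 0.05 * rng.normal(size=(300, 60))

mask = rng.random(X.shape) < 0.4          # mask ~40% of the points
X_masked = np.where(mask, 0.0, X)         # masked points are zeroed out

# Encoder-decoder stand-in: reconstruct the full series from its masked view.
mae = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
mae.fit(X_masked, X)

recon = mae.predict(X_masked)
masked_mse = np.mean((recon[mask] - X[mask]) ** 2)   # point-level masked error
print(f"reconstruction MSE on masked points: {masked_mse:.4f}")
```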
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When the framework is instantiated with simple linear autoregressive models, it achieves state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z)
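
The three stages can be made concrete with the simple linear instantiation the summary mentions: (1) cluster the series, (2) forecast each cluster's aggregate with a lagged linear model, (3) refine per-series forecasts using the cluster-level forecast as an extra feature. KMeans, the lag count, and the stage interfaces below are illustrative assumptions.

```python
# Cluster-and-conquer sketch with KMeans + linear autoregression.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
N, T, LAGS = 40, 120, 8
X = np.cumsum(rng.normal(size=(N, T)), axis=1)       # toy random-walk panel

# Stage 1: cluster series on their history (last step held out for scoring).
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X[:, :-1])

def lagged(v, lags):
    """Design matrix of `lags` past values predicting the next value."""
    rows = [v[i:i + lags] for i in range(len(v) - lags)]
    return np.asarray(rows), v[lags:]

preds = np.empty(N)
for c in np.unique(labels):
    members = np.where(labels == c)[0]
    agg = X[members, :-1].mean(axis=0)               # cluster aggregate series

    # Stage 2: linear AR forecast of the aggregate's next value.
    A, y = lagged(agg, LAGS)
    agg_next = LinearRegression().fit(A, y).predict(agg[-LAGS:][None])[0]

    # Stage 3: per-series AR models, augmented with the cluster aggregate at
    # the target step (the forecast agg_next stands in at prediction time).
    agg_feat = agg[LAGS:]                            # aligned with each target
    for i in members:
        A_i, y_i = lagged(X[i, :-1], LAGS)
        model = LinearRegression().fit(np.hstack([A_i, agg_feat[:, None]]), y_i)
        x_new = np.r_[X[i, -1 - LAGS:-1], agg_next][None]
        preds[i] = model.predict(x_new)[0]

print("mean absolute error on the held-out last step:", np.mean(np.abs(preds - X[:, -1])))
```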
- Temporal Dependencies in Feature Importance for Time Series Predictions [4.082348823209183]
We propose WinIT, a framework for evaluating feature importance in time series prediction settings.
We demonstrate how this solution improves the attribution of features across time steps.
WinIT achieves 2.47x better performance than FIT and other feature importance methods on the real-world clinical MIMIC mortality task.
arXiv Detail & Related papers (2021-07-29T20:31:03Z)
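
A generic masking-based sketch of windowed feature importance: each feature's values are flattened over a trailing window of time steps and the resulting drop in prediction quality is recorded as that feature's importance. The masking scheme and the accuracy-drop score are illustrative stand-ins for WinIT's estimator.

```python
# Windowed feature-importance sketch; stand-in for WinIT's estimator.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: 500 samples, 4 features, 20 steps; only feature 2's last 5 steps matter.
X = rng.normal(size=(500, 4, 20))
y = (X[:, 2, -5:].sum(axis=1) > 0).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X.reshape(500, -1), y)
base_acc = clf.score(X.reshape(500, -1), y)

WINDOW = 5
for f in range(X.shape[1]):
    X_masked = X.copy()
    X_masked[:, f, -WINDOW:] = X[:, f, -WINDOW:].mean()   # flatten the window
    drop = base_acc - clf.score(X_masked.reshape(500, -1), y)
    print(f"feature {f}: accuracy drop over last {WINDOW} steps = {drop:.3f}")
```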
- Contrastive learning of strong-mixing continuous-time stochastic processes [53.82893653745542]
Contrastive learning is a family of self-supervised methods where a model is trained to solve a classification task constructed from unlabeled data.
We show that a properly constructed contrastive learning task can be used to estimate the transition kernel for small-to-mid-range intervals in the diffusion case.
arXiv Detail & Related papers (2021-03-03T23:06:47Z)
- Explainable Multivariate Time Series Classification: A Deep Neural Network Which Learns To Attend To Important Variables As Well As Informative Time Intervals [32.30627405832656]
Time series data are prevalent in a wide variety of real-world applications.
A key criterion for understanding such predictive models is elucidating and quantifying the contribution of time-varying input variables to the classification.
We introduce a novel, modular, convolution-based feature extraction and attention mechanism that simultaneously identifies the variables as well as time intervals which determine the classification output.
arXiv Detail & Related papers (2020-11-23T19:16:46Z)
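
An untrained forward-pass sketch of the attend-to-variables-and-intervals idea: per-variable 1-D convolutional features, a softmax attention map over (variable, time) positions, and an attention-weighted pooled representation whose attention peak names the most influential variable and time. Kernel sizes, dimensions, and the random weights are assumptions; the paper's modular architecture differs in detail.

```python
# Forward pass of a conv + attention explanation sketch (untrained weights).
import numpy as np

rng = np.random.default_rng(0)

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

V, T, K = 3, 100, 7                       # variables, time steps, kernel width
x = rng.normal(size=(V, T))
kernel = rng.normal(size=(V, K))          # one convolution filter per variable

# Per-variable convolutional features, kept at the original length.
feats = np.stack([np.convolve(x[v], kernel[v], mode="same") for v in range(V)])

# Attention over all (variable, time) positions; its peak serves as the
# explanation of which variable and interval drove the decision.
score_w = rng.normal(size=feats.shape)
attn = softmax((score_w * feats).ravel()).reshape(V, T)
pooled = (attn * feats).sum()             # attention-weighted representation

v_star, t_star = np.unravel_index(attn.argmax(), attn.shape)
print(f"pooled feature: {pooled:.3f}; most-attended: variable {v_star}, time {t_star}")
```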
- Interpretable Time Series Classification using Linear Models and Multi-resolution Multi-domain Symbolic Representations [6.6147550436077776]
We propose new time series classification algorithms to address gaps in current approaches.
Our approach is based on symbolic representations of time series, efficient sequence mining algorithms and linear classification models.
Our models are as accurate as deep learning models but are more efficient in running time and memory; they can handle variable-length time series and can be interpreted by highlighting the discriminative symbolic features on the original time series.
arXiv Detail & Related papers (2020-05-31T15:32:08Z)
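
A sketch of this symbolic pipeline: discretize each series into a SAX-like letter string, count sliding-window "words", and fit a linear classifier whose largest coefficients name the discriminative symbolic features. The bin count, word length, and quantile binning are assumed details; the paper combines multiple resolutions and domains.

```python
# Symbolic bag-of-words + linear classifier sketch for interpretable TSC.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def sax_string(series, n_bins=4, word_len=3):
    """Quantile-bin a z-normalized series into letters; return spaced words."""
    z = (series - series.mean()) / (series.std() + 1e-8)
    edges = np.quantile(z, np.linspace(0, 1, n_bins + 1)[1:-1])
    letters = "".join("abcd"[i] for i in np.digitize(z, edges))
    words = [letters[i:i + word_len] for i in range(len(letters) - word_len + 1)]
    return " ".join(words)

# Toy two-class data: class 1 carries a bump in the middle of the series.
X = rng.normal(size=(200, 60))
y = rng.integers(0, 2, size=200)
X[y == 1, 25:35] += 2.0

docs = [sax_string(s) for s in X]
vec = CountVectorizer(token_pattern=r"\S+")
bow = vec.fit_transform(docs)                        # bag of symbolic words
clf = LogisticRegression(max_iter=1000).fit(bow, y)

top = np.argsort(clf.coef_[0])[-3:]                  # most class-1-indicative words
print("discriminative symbolic words:", list(vec.get_feature_names_out()[top]))
print("training accuracy:", clf.score(bow, y))
```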
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.