Finding Short Signals in Long Irregular Time Series with Continuous-Time
Attention Policy Networks
- URL: http://arxiv.org/abs/2302.04052v1
- Date: Wed, 8 Feb 2023 13:44:36 GMT
- Title: Finding Short Signals in Long Irregular Time Series with Continuous-Time
Attention Policy Networks
- Authors: Thomas Hartvigsen, Jidapa Thadajarassiri, Xiangnan Kong, Elke
Rundensteiner
- Abstract summary: Irregularly-sampled time series (ITS) are native to high-impact domains like healthcare, where measurements are collected over time at uneven intervals.
We propose CAT, a model that classifies multivariate ITS by explicitly seeking highly-relevant portions of an input series' timeline.
Using synthetic and real data, we find that CAT outperforms ten state-of-the-art methods by finding short signals in long irregular time series.
- Score: 18.401817124823832
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Irregularly-sampled time series (ITS) are native to high-impact domains like
healthcare, where measurements are collected over time at uneven intervals.
However, for many classification problems, only small portions of a long time
series are relevant to the class label. In this case, existing ITS models
often fail to classify long series because they rely on careful imputation, which
easily over- or under-samples the relevant regions. Motivated by this insight, we
propose CAT, a model that classifies multivariate ITS by explicitly seeking
highly-relevant portions of an input series' timeline. CAT achieves this by
integrating three components: (1) A Moment Network learns to seek relevant
moments in an ITS's continuous timeline using reinforcement learning. (2) A
Receptor Network models the temporal dynamics of both observations and their
timing localized around predicted moments. (3) A recurrent Transition Model
models the sequence of transitions between these moments, cultivating a
representation with which the series is classified. Using synthetic and real
data, we find that CAT outperforms ten state-of-the-art methods by finding
short signals in long irregular time series.
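To make the three-component design concrete, here is a minimal sketch of how the pieces could interact at inference time. Every function below is a hypothetical stand-in, not the paper's implementation; in particular, the real Moment Network is a policy trained with reinforcement learning, not the toy heuristic used here.

```python
import numpy as np

rng = np.random.default_rng(0)

def moment_network(state):
    # Hypothetical stand-in for the Moment Network: map the running state
    # to the next moment t in [0, 1] of the continuous timeline. In the
    # paper, this policy is trained with reinforcement learning.
    return float(1.0 / (1.0 + np.exp(-state.sum())))

def receptor_network(times, values, t, width=0.05):
    # Hypothetical Receptor Network: summarize the observations and their
    # timing inside a small window localized around moment t.
    mask = np.abs(times - t) < width
    if not mask.any():
        return np.zeros(2)
    return np.array([values[mask].mean(), mask.sum() / len(times)])

def transition_model(state, local_repr):
    # Hypothetical recurrent Transition Model: fold each local
    # representation into the running state used for classification.
    return np.tanh(state + local_repr.mean())

# A toy irregularly-sampled series: timestamps drawn unevenly on [0, 1].
times = np.sort(rng.uniform(0.0, 1.0, size=200))
values = np.sin(20 * times) + 0.1 * rng.standard_normal(200)

state = np.zeros(4)
for _ in range(5):                           # a fixed budget of glimpses
    t = moment_network(state)                # (1) seek a relevant moment
    z = receptor_network(times, values, t)   # (2) encode its neighborhood
    state = transition_model(state, z)       # (3) update the running state

print("toy class decision:", int(state.sum() > 0))
```

The point of the loop is that the model only ever looks at a handful of localized windows on the continuous timeline, which is what lets it find short signals without imputing the whole series.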
Related papers
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks (a toy sketch of the shared-encoder idea follows this entry).
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
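As a toy illustration of the Siamese idea in the entry above, the sketch below draws two subseries from one series and passes both through a single shared encoder. The encoder, subseries length, and cosine score are all assumptions made for illustration; TimeSiam's actual pre-training objective, which pairs past and current subseries, is described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, W):
    # Hypothetical shared (Siamese) encoder: one linear map plus tanh.
    return np.tanh(W @ x)

# One long series; Siamese pre-training samples two subseries from it.
series = np.sin(0.1 * np.arange(1000)) + 0.1 * rng.standard_normal(1000)
L = 64                                    # assumed subseries length
W = 0.1 * rng.standard_normal((16, L))    # weights shared by both branches

i, j = rng.integers(0, len(series) - L, size=2)
view_a, view_b = series[i:i + L], series[j:j + L]

# Toy agreement score between two views of the SAME series.
z1, z2 = encoder(view_a, W), encoder(view_b, W)
cos = float(z1 @ z2 / (np.linalg.norm(z1) * np.linalg.norm(z2) + 1e-8))
print("cosine similarity of the two views:", round(cos, 3))
```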
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [80.56913334060404]
Time series analysis is of immense importance in applications such as weather forecasting, anomaly detection, and action recognition.
Previous methods attempt to model temporal variations directly from the 1D time series.
We ravel out the complex temporal variations into multiple intraperiod and interperiod variations (a sketch of this 2D folding follows this entry).
arXiv Detail & Related papers (2022-10-05T12:19:51Z)
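A minimal sketch of the folding referenced in the TimesNet entry above: estimate a dominant period from the FFT amplitude spectrum, then reshape the 1D series into a 2D grid whose rows are whole periods, so one axis exposes intraperiod variation and the other interperiod variation. The FFT-based period selection follows the paper; the rest (a single period, no 2D backbone) is simplified.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy series with a dominant period of 24 steps plus noise.
T = 24 * 20
x = np.sin(2 * np.pi * np.arange(T) / 24) + 0.1 * rng.standard_normal(T)

amp = np.abs(np.fft.rfft(x))
amp[0] = 0.0                               # ignore the DC component
freq = int(np.argmax(amp))                 # dominant frequency index
period = T // freq                         # corresponding period length

n = T // period
x2d = x[: n * period].reshape(n, period)   # rows: periods, cols: phase

print("estimated period:", period)         # ~24 for this series
print("2D layout (interperiod, intraperiod):", x2d.shape)
```

TimesNet applies 2D convolutional backbones to tensors like `x2d`, typically for the top-k dominant periods rather than just one.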
- STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
STING (Self-attention based Time-series Imputation Networks using GAN) is proposed for imputing missing values in time series.
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset (a toy INR fit follows this entry).
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
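A minimal sketch of the INR idea underlying the HyperTime entry above: encode a series as a function of continuous time, so it can be queried at any resolution. To stay tiny and dependency-free, the "network" here is a linear model over fixed sinusoidal features fit by least squares; the paper uses deep INRs and a hypernetwork over whole datasets, which this does not attempt to reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Irregular samples of a signal to be encoded as a continuous function.
t = np.sort(rng.uniform(0.0, 1.0, 120))
x = np.sin(12 * t) + 0.3 * np.cos(31 * t)

def features(tt, n_freq=20):
    # Fixed sinusoidal features of continuous time (an assumed stand-in
    # for a real INR's learned layers).
    k = np.arange(1, n_freq + 1)
    return np.concatenate([np.sin(np.outer(tt, k)),
                           np.cos(np.outer(tt, k))], axis=1)

w, *_ = np.linalg.lstsq(features(t), x, rcond=None)

# Resolution independence: query the representation on a 10x denser grid.
t_dense = np.linspace(0.0, 1.0, 1200)
x_dense = features(t_dense) @ w

print("train MSE:", float(np.mean((features(t) @ w - x) ** 2)))
print("queried", len(t_dense), "points from", len(t), "samples")
```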
- Multi-scale Anomaly Detection for Big Time Series of Industrial Sensors [50.6434162489902]
We propose a reconstruction-based anomaly detection method, MissGAN, which iteratively learns to decode and encode naturally smooth time series.
MissGAN needs no labels, or only labels of normal instances, which makes it widely applicable.
arXiv Detail & Related papers (2022-04-18T04:34:15Z)
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in domains such as climate, economics, and health care.
A recent conceptual approach maps time series to complex networks.
Network analysis can then be used to characterize different types of time series (a sketch of one such mapping follows this entry).
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
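One classic instance of the series-to-network mapping named above is the natural visibility graph, sketched below: every sample becomes a node, and two nodes are connected when the straight line between them passes above all intermediate samples. The mapping choice and the degree statistic are illustrative; the paper surveys network-based features more broadly.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_normal(60).cumsum()   # a toy random-walk series
t = np.arange(len(y), dtype=float)

# Natural visibility criterion: connect a and b when every sample c
# between them lies strictly below the line from (t[a], y[a]) to
# (t[b], y[b]).
n = len(y)
adj = np.zeros((n, n), dtype=bool)
for a in range(n):
    for b in range(a + 1, n):
        line = y[b] + (y[a] - y[b]) * (t[b] - t[a + 1:b]) / (t[b] - t[a])
        if np.all(y[a + 1:b] < line):
            adj[a, b] = adj[b, a] = True

# Network statistics then serve as time series features.
degrees = adj.sum(axis=1)
print("mean degree:", float(degrees.mean()))
```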
- Modeling Regime Shifts in Multiple Time Series [1.4588552933974936]
Regime shifts refer to the changing behaviors exhibited by a series at different time intervals.
Existing methods fail to take relationships between time series into consideration when discovering regimes in multiple time series.
Most existing methods are unable to handle these issues in a unified framework.
arXiv Detail & Related papers (2021-09-20T17:02:29Z)
- Multi-Time Attention Networks for Irregularly Sampled Time Series [18.224344440110862]
Irregular sampling occurs in many time series modeling applications.
We propose a new deep learning framework for this setting that we call Multi-Time Attention Networks (a sketch of attention with time embeddings follows this entry).
Our results show that our approach performs as well as or better than a range of baseline and recently proposed models.
arXiv Detail & Related papers (2021-01-25T18:57:42Z)
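A minimal sketch of attention driven by embeddings of continuous time, in the spirit of the Multi-Time Attention Networks entry above: similarity between time embeddings decides how strongly each irregular observation contributes at a grid of reference times. The sinusoidal embedding and the fixed reference grid are assumptions; the paper learns its time embeddings end to end.

```python
import numpy as np

rng = np.random.default_rng(0)

def time_embed(tt, d=8):
    # Assumed sinusoidal embedding of continuous time.
    k = np.arange(1, d + 1)
    return np.concatenate([np.sin(np.outer(tt, k)),
                           np.cos(np.outer(tt, k))], axis=1)

# Irregular observations of one variable.
t_obs = np.sort(rng.uniform(0.0, 1.0, 50))
x_obs = np.sin(8 * t_obs) + 0.1 * rng.standard_normal(50)

# Attention from a regular grid of reference times to the observed times.
t_ref = np.linspace(0.0, 1.0, 20)
scores = time_embed(t_ref) @ time_embed(t_obs).T       # (20, 50)
scores -= scores.max(axis=1, keepdims=True)            # softmax stability
weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

x_ref = weights @ x_obs   # the series re-expressed at the reference times
print("re-gridded series length:", x_ref.shape[0])
```

Because the attention weights depend only on continuous timestamps, no imputation onto a fixed grid is needed before attending.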
- Learning from Irregularly-Sampled Time Series: A Missing Data Perspective [18.493394650508044]
Irregularly-sampled time series occur in many domains including healthcare.
We model irregularly-sampled time series data as a sequence of index-value pairs sampled from a continuous but unobserved function.
We propose learning methods for this framework based on variational autoencoders and generative adversarial networks.
arXiv Detail & Related papers (2020-08-17T20:01:55Z)
- A Deep Structural Model for Analyzing Correlated Multivariate Time Series [11.009809732645888]
We present a deep learning structural time series model that can handle correlated multivariate time series input.
The model explicitly learns/extracts the trend, seasonality, and event components (a toy trend-plus-seasonality fit follows this entry).
We compare our model with several state-of-the-art methods through a comprehensive set of experiments on a variety of time series data sets.
arXiv Detail & Related papers (2020-01-02T18:48:29Z)
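A toy version of the trend/seasonality decomposition named in the last entry above, fit by least squares with a known seasonal period. The actual model is a deep structural network that also extracts event components and handles correlated multivariate inputs; this sketch only makes the component idea concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy series: linear trend + weekly seasonality + noise.
T = 7 * 30
t = np.arange(T, dtype=float)
x = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 7) + 0.3 * rng.standard_normal(T)

# Design matrix: intercept, trend, and one Fourier pair for period 7.
X = np.column_stack([np.ones(T), t,
                     np.sin(2 * np.pi * t / 7), np.cos(2 * np.pi * t / 7)])
beta, *_ = np.linalg.lstsq(X, x, rcond=None)

trend = X[:, :2] @ beta[:2]          # learned level + slope component
seasonal = X[:, 2:] @ beta[2:]       # learned seasonal component
print("trend slope ~", round(float(beta[1]), 3))
print("seasonal amplitude ~", round(float(np.hypot(beta[2], beta[3])), 2))
print("residual std ~", round(float(np.std(x - trend - seasonal)), 2))
```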