Warpformer: A Multi-scale Modeling Approach for Irregular Clinical Time Series
- URL: http://arxiv.org/abs/2306.09368v1
- Date: Wed, 14 Jun 2023 13:23:14 GMT
- Title: Warpformer: A Multi-scale Modeling Approach for Irregular Clinical Time Series
- Authors: Jiawen Zhang, Shun Zheng, Wei Cao, Jiang Bian, Jia Li
- Abstract summary: Intra-series irregularity and inter-series discrepancy are key characteristics of irregular time series.
We present Warpformer, a novel approach that fully considers these two characteristics.
We conduct extensive experiments on widely used datasets and a new large-scale benchmark built from clinical databases.
- Score: 29.838484652943976
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Irregularly sampled multivariate time series are ubiquitous in various
fields, particularly in healthcare, and exhibit two key characteristics:
intra-series irregularity and inter-series discrepancy. Intra-series
irregularity refers to the fact that time-series signals are often recorded at
irregular intervals, while inter-series discrepancy refers to the significant
variability in sampling rates among diverse series. However, recent advances in
irregular time series modeling have primarily focused on addressing intra-series
irregularity while overlooking inter-series discrepancy. To bridge this
gap, we present Warpformer, a novel approach that fully considers these two
characteristics. In a nutshell, Warpformer has several crucial designs,
including a specific input representation that explicitly characterizes both
intra-series irregularity and inter-series discrepancy, a warping module that
adaptively unifies irregular time series in a given scale, and a customized
attention module for representation learning. Additionally, we stack multiple
warping and attention modules to learn at different scales, producing
multi-scale representations that balance coarse-grained and fine-grained
signals for downstream tasks. We conduct extensive experiments on widely used
datasets and a new large-scale benchmark built from clinical databases. The
results demonstrate the superiority of Warpformer over existing
state-of-the-art approaches.
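The abstract describes the architecture only at a high level, so below is a minimal PyTorch sketch, not the authors' implementation, of the two mechanisms it names: a warping layer that softly aligns a variable number of irregular observations onto a fixed set of anchor slots at a given scale, followed by ordinary self-attention, with the pair stacked across scales. All class names, shapes, and the anchor-count schedule are illustrative assumptions; the paper's dedicated input representation and customized attention module are omitted here.

```python
# Minimal sketch (assumptions throughout): embed irregular observations,
# softly warp them onto anchor slots per scale, then apply self-attention.
import torch
import torch.nn as nn


class WarpingLayer(nn.Module):
    """Softly align K irregular observations onto `num_anchors` unified slots."""

    def __init__(self, dim: int, num_anchors: int):
        super().__init__()
        # Learned anchor queries stand in for a unified grid at this scale.
        self.anchors = nn.Parameter(torch.randn(num_anchors, dim))
        self.key = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # h: (B, K, dim) observation embeddings; mask: (B, K), 0 marks padding.
        scores = self.anchors @ self.key(h).transpose(1, 2)        # (B, L, K)
        scores = scores.masked_fill(mask[:, None, :] == 0, float("-inf"))
        weights = scores.softmax(dim=-1)     # soft alignment over observations
        return weights @ h                   # (B, L, dim) unified sequence


class MultiScaleEncoder(nn.Module):
    """Alternate warping and self-attention over progressively coarser scales."""

    def __init__(self, dim: int, heads: int = 4, scales=(32, 8)):
        super().__init__()
        self.warps = nn.ModuleList(WarpingLayer(dim, L) for L in scales)
        self.attns = nn.ModuleList(
            nn.TransformerEncoderLayer(dim, heads, batch_first=True)
            for _ in scales
        )

    def forward(self, h: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        summaries = []
        for warp, attn in zip(self.warps, self.attns):
            h = attn(warp(h, mask))          # unify, then model interactions
            mask = torch.ones(h.shape[:2], device=h.device)  # all slots valid now
            summaries.append(h.mean(dim=1))  # pooled summary at this scale
        # Concatenate coarse- and fine-grained views for a downstream head.
        return torch.cat(summaries, dim=-1)


# Toy usage: 2 samples, 50 irregular observations, already embedded to 32 dims.
enc = MultiScaleEncoder(dim=32)
z = enc(torch.randn(2, 50, 32), torch.ones(2, 50))   # z.shape == (2, 64)
```

Concatenating the pooled per-scale summaries is one simple way to realize the "multi-scale representations that balance coarse-grained and fine-grained signals" mentioned in the abstract; the paper's actual fusion scheme may differ.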
Related papers
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- MSGNet: Learning Multi-Scale Inter-Series Correlations for Multivariate Time Series Forecasting [18.192600104502628]
Time series data often exhibit diverse intra-series and inter-series correlations.
Extensive experiments are conducted on several real-world datasets to showcase the effectiveness of MSGNet.
arXiv Detail & Related papers (2023-12-31T08:23:24Z)
- Entropy Causal Graphs for Multivariate Time Series Anomaly Detection [7.402342914903391]
This work proposes a novel framework called CGAD, an entropy Causal Graph for multivariate time series Anomaly Detection.
CGAD utilizes transfer entropy (defined below) to construct graph structures that unveil the underlying causal relationships among time series data.
CGAD outperforms state-of-the-art methods on real-world datasets with a 15% average improvement.
arXiv Detail & Related papers (2023-12-15T01:35:00Z)
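The CGAD summary invokes transfer entropy without defining it. For reference, the standard pairwise transfer entropy (Schreiber, 2000) from a source series X to a target series Y, shown here with history length one for brevity, is:

```latex
T_{X \to Y} = \sum_{y_{t+1},\, y_t,\, x_t} p(y_{t+1}, y_t, x_t)
\log \frac{p\left(y_{t+1} \mid y_t, x_t\right)}{p\left(y_{t+1} \mid y_t\right)}
```

An asymmetry such as T_{X→Y} > T_{Y→X} is what makes directed edges inferable; whether CGAD applies exactly this rule is not stated in the summary.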
- Compatible Transformer for Irregularly Sampled Multivariate Time Series [75.79309862085303]
We propose a transformer-based encoder to achieve comprehensive temporal-interaction feature learning for each individual sample.
We conduct extensive experiments on 3 real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods.
arXiv Detail & Related papers (2023-10-17T06:29:09Z)
- CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
arXiv Detail & Related papers (2023-08-18T04:45:56Z)
- Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z)
- DRAformer: Differentially Reconstructed Attention Transformer for Time-Series Forecasting [7.805077630467324]
Time-series forecasting plays an important role in many real-world scenarios, such as equipment life cycle forecasting, weather forecasting, and traffic flow forecasting.
It can be observed from recent research that a variety of transformer-based models have shown remarkable results in time-series forecasting.
However, some issues still limit the performance of transformer-based models on time-series forecasting tasks.
arXiv Detail & Related papers (2022-06-11T10:34:29Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Multi-Time Attention Networks for Irregularly Sampled Time Series [18.224344440110862]
Irregular sampling occurs in many time series modeling applications.
We propose a new deep learning framework for this setting that we call Multi-Time Attention Networks.
Our results show that our approach performs as well or better than a range of baseline and recently proposed models.
arXiv Detail & Related papers (2021-01-25T18:57:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.