Compatible Transformer for Irregularly Sampled Multivariate Time Series
- URL: http://arxiv.org/abs/2310.11022v1
- Date: Tue, 17 Oct 2023 06:29:09 GMT
- Title: Compatible Transformer for Irregularly Sampled Multivariate Time Series
- Authors: Yuxi Wei, Juntong Peng, Tong He, Chenxin Xu, Jian Zhang, Shirui Pan,
Siheng Chen
- Abstract summary: We propose a transformer-based encoder to achieve comprehensive temporal-interaction feature learning for each individual sample.
We conduct extensive experiments on 3 real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods.
- Score: 75.79309862085303
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To analyze multivariate time series, most previous methods assume regular
subsampling of time series, where the interval between adjacent measurements
and the number of samples remain unchanged. Practically, data collection
systems could produce irregularly sampled time series due to sensor failures
and interventions. However, existing methods designed for regularly sampled
multivariate time series cannot directly handle irregularity owing to
misalignment along both temporal and variate dimensions. To fill this gap, we
propose Compatible Transformer (CoFormer), a transformer-based encoder to
achieve comprehensive temporal-interaction feature learning for each individual
sample in irregular multivariate time series. In CoFormer, we view each sample
as a unique variate-time point and leverage intra-variate/inter-variate
attentions to learn sample-wise temporal/interaction features based on
intra-variate/inter-variate neighbors. With CoFormer as the core, we can
analyze irregularly sampled multivariate time series for many downstream tasks,
including classification and prediction. We conduct extensive experiments on 3
real-world datasets and validate that the proposed CoFormer significantly and
consistently outperforms existing methods.
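The core idea — treating each observation as a variate-time point and attending separately over intra-variate (same sensor) and inter-variate (other sensors) neighbors — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the mask construction, the single-head dot-product attention, and the fusion by summation are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def masked_attention(q, k, v, mask):
    # scaled dot-product attention restricted to the neighbors allowed by `mask`
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)  # disallowed pairs get ~zero weight
    return softmax(scores, axis=-1) @ v

# Toy irregular series: each sample is one (variate, time) observation
# with a feature vector; counts per variate are deliberately unequal.
rng = np.random.default_rng(0)
n, d = 6, 4
variate = np.array([0, 0, 0, 1, 1, 2])   # variate index of each sample
feats = rng.normal(size=(n, d))          # per-sample feature embeddings

same = variate[:, None] == variate[None, :]          # same-variate pairs
intra = masked_attention(feats, feats, feats, same)  # temporal features
inter = masked_attention(feats, feats, feats, ~same) # interaction features
out = intra + inter  # fused sample-wise representation (a hypothetical fusion)
```

Because the masks are built from the observed samples themselves, no alignment onto a regular grid is needed; each sample aggregates information only from the neighbors that actually exist.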
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Causal Discovery-Driven Change Point Detection in Time Series [32.424281626708336]
Change point detection in time series seeks to identify times when the probability distribution of time series changes.
In practical applications, we may be interested only in certain components of the time series, exploring abrupt changes in their distributions.
arXiv Detail & Related papers (2024-07-10T00:54:42Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- EdgeConvFormer: Dynamic Graph CNN and Transformer based Anomaly Detection in Multivariate Time Series [7.514010315664322]
We propose a novel anomaly detection method, named EdgeConvFormer, which integrates stacked Time2vec embedding, dynamic graph CNN, and Transformer to extract global and local spatial-time information.
Experiments demonstrate that EdgeConvFormer can learn the spatial-temporal modeling from multivariate time series data and achieve better anomaly detection performance than the state-of-the-art approaches on many real-world datasets of different scales.
arXiv Detail & Related papers (2023-12-04T08:38:54Z)
- Warpformer: A Multi-scale Modeling Approach for Irregular Clinical Time Series [29.838484652943976]
Intra-series irregularity and inter-series discrepancy are key characteristics of irregular time series.
We present Warpformer, a novel approach that fully considers these two characteristics.
We conduct extensive experiments on widely used datasets and a new large-scale benchmark built from clinical databases.
arXiv Detail & Related papers (2023-06-14T13:23:14Z)
- MPPN: Multi-Resolution Periodic Pattern Network For Long-Term Time Series Forecasting [19.573651104129443]
Long-term time series forecasting plays an important role in various real-world scenarios.
Recent deep learning methods for long-term series forecasting tend to capture the intricate patterns of time series by decomposition-based or sampling-based methods.
We propose a novel deep learning network architecture, named Multi-resolution Periodic Pattern Network (MPPN), for long-term series forecasting.
arXiv Detail & Related papers (2023-06-12T07:00:37Z)
- Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.