Spatiotemporal Covariance Neural Networks
- URL: http://arxiv.org/abs/2409.10068v1
- Date: Mon, 16 Sep 2024 08:05:58 GMT
- Title: Spatiotemporal Covariance Neural Networks
- Authors: Andrea Cavallo, Mohammad Sabbaqi, Elvin Isufi
- Abstract summary: We introduce a relational learning model that operates on the sample covariance matrix of the time series and leverages joint spatiotemporal convolutions to model the data.
We prove that STVNN is stable to the uncertainties introduced by online estimation and thus improves over temporal PCA-based methods.
- Score: 10.855602842179621
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling spatiotemporal interactions in multivariate time series is key to their effective processing, but challenging because of their irregular and often unknown structure. Statistical properties of the data provide useful biases to model interdependencies and are leveraged by correlation and covariance-based networks as well as by processing pipelines relying on principal component analysis (PCA). However, PCA and its temporal extensions suffer instabilities in the covariance eigenvectors when the corresponding eigenvalues are close to each other, making their application to dynamic and streaming data settings challenging. To address these issues, we exploit the analogy between PCA and graph convolutional filters to introduce the SpatioTemporal coVariance Neural Network (STVNN), a relational learning model that operates on the sample covariance matrix of the time series and leverages joint spatiotemporal convolutions to model the data. To account for the streaming and non-stationary setting, we consider an online update of the parameters and sample covariance matrix. We prove the STVNN is stable to the uncertainties introduced by these online estimations, thus improving over temporal PCA-based methods. Experimental results corroborate our theoretical findings and show that STVNN is competitive for multivariate time series processing, it adapts to changes in the data distribution, and it is orders of magnitude more stable than online temporal PCA.
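The abstract's two key ingredients lend themselves to a short illustration: a convolution that uses the sample covariance matrix as a graph shift operator, and online updates of the mean and covariance for streaming data. The NumPy sketch below is a minimal rendering of that idea under simple assumptions (an exponentially weighted update with a fixed step size gamma, scalar filter taps, arbitrarily chosen orders); it is a sketch of the general mechanism, not the authors' implementation.

```python
import numpy as np

class OnlineCovariance:
    """Streaming estimates of the mean and covariance of an N-variate series."""
    def __init__(self, n, gamma=0.01):
        self.mu = np.zeros(n)   # running mean estimate
        self.C = np.eye(n)      # running covariance estimate
        self.gamma = gamma      # fixed step size (an assumption of this sketch)

    def update(self, x):
        # Exponentially weighted updates, a common choice for non-stationary streams.
        self.mu = (1 - self.gamma) * self.mu + self.gamma * x
        d = (x - self.mu)[:, None]
        self.C = (1 - self.gamma) * self.C + self.gamma * (d @ d.T)
        return self.C

def stv_filter(X, C, H):
    """Joint spatiotemporal covariance filter (one linear layer, no nonlinearity).

    X : (T, N) window of past samples, X[0] being the most recent.
    C : (N, N) sample covariance, used as the graph shift operator.
    H : (T, K) scalar taps; H[t, k] weighs C^k applied to the sample at lag t.
    """
    y = np.zeros(X.shape[1])
    for t in range(H.shape[0]):
        z = X[t].copy()
        for k in range(H.shape[1]):
            y += H[t, k] * z    # accumulate H[t, k] * C^k x at lag t
            z = C @ z           # move to the next covariance power
    return y

# Toy usage: stream a 5-variate series through the estimator and the filter.
rng = np.random.default_rng(0)
est, window = OnlineCovariance(n=5), np.zeros((3, 5))   # T = 3 lags
H = 0.1 * rng.normal(size=(3, 4))                       # K = 4 covariance powers
for _ in range(500):
    x = rng.normal(size=5)
    C_hat = est.update(x)
    window = np.roll(window, 1, axis=0)
    window[0] = x
    y = stv_filter(window, C_hat, H)
```

Because the filter acts on the data only through powers of C, small errors in the online covariance estimate perturb the output smoothly, which matches the intuition behind the paper's stability result.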
Related papers
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on the fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Time-Parameterized Convolutional Neural Networks for Irregularly Sampled Time Series [26.77596449192451]
Irregularly sampled time series are ubiquitous in several application domains, leading to sparse, partially observed, and non-aligned observations.
Standard recurrent neural networks (RNNs) and convolutional neural networks (CNNs) assume regular spacing between observation times, posing significant challenges to irregular time series modeling.
We parameterize convolutional layers with kernels that depend explicitly on the irregular observation times; one plausible construction is sketched below.
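The summary above is terse; one plausible reading (a sketch under assumptions, not the paper's exact construction) is a convolution whose kernel weights are generated by a function of the actual, irregular time gaps between observations. The kernel-generating function and its exponential form below are hypothetical placeholders for whatever learnable function of time the paper uses.

```python
import numpy as np

def time_parameterized_conv(x, t, theta, width=3):
    """1-D convolution whose weights depend on irregular time gaps.

    x     : (L,) observed values at irregular, increasing times t (L,).
    theta : parameters of a hypothetical kernel-generating function
            g(dt) = sum_m theta[m] * exp(-m * dt).
    """
    L, y = len(x), np.zeros(len(x))
    for i in range(width - 1, L):
        for j in range(width):
            dt = t[i] - t[i - j]   # actual, irregular elapsed time
            w = sum(theta[m] * np.exp(-m * dt) for m in range(len(theta)))
            y[i] += w * x[i - j]   # time-dependent weight applied to past sample
    return y
```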
arXiv Detail & Related papers (2023-08-06T21:10:30Z)
- Correlation-aware Spatial-Temporal Graph Learning for Multivariate Time-series Anomaly Detection [67.60791405198063]
We propose a correlation-aware spatial-temporal graph learning (termed CST-GL) for time series anomaly detection.
CST-GL explicitly captures the pairwise correlations via a multivariate time series correlation learning module.
A novel anomaly scoring component is further integrated into CST-GL to estimate the degree of an anomaly in a purely unsupervised manner.
arXiv Detail & Related papers (2023-07-17T11:04:27Z)
- Network inference via process motifs for lagged correlation in linear stochastic processes [0.5801044612920815]
A major challenge for causal inference from time-series data is the trade-off between computational feasibility and accuracy.
We propose to infer networks of causal relations via pairwise edge measures (PEMs) that one can easily compute from lagged correlation matrices; computing such matrices is sketched below.
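The paper's specific PEM definitions are not reproduced here; the sketch below shows only the lagged correlation matrices such measures are computed from, together with one deliberately simple, hypothetical edge score (the magnitude of the lag-1 cross-correlation).

```python
import numpy as np

def lagged_correlation(X, lag=1):
    """Lag-tau correlation matrix of an (L, N) multivariate series X.

    Entry (i, j) correlates series i at time t with series j at time t + lag.
    """
    Xa, Xb = X[:-lag], X[lag:]                 # aligned past / future slices
    Xa = (Xa - Xa.mean(0)) / Xa.std(0)         # standardize each series
    Xb = (Xb - Xb.mean(0)) / Xb.std(0)
    return (Xa.T @ Xb) / len(Xa)

# Hypothetical edge score: magnitude of the lag-1 correlation, self-loops removed.
X = np.random.default_rng(1).normal(size=(1000, 4)).cumsum(axis=0)  # toy series
R1 = lagged_correlation(X, lag=1)
edge_scores = np.abs(R1 - np.diag(np.diag(R1)))
```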
arXiv Detail & Related papers (2022-08-18T14:46:05Z)
- Enhancing the Robustness via Adversarial Learning and Joint Spatial-Temporal Embeddings in Traffic Forecasting [11.680589359294972]
We propose TrendGCN to address the challenge of balancing dynamics and robustness.
Our model simultaneously incorporates spatial (node-wise) embeddings and temporal (time-wise) embeddings to account for heterogeneous space-and-time convolutions.
Compared with traditional approaches that handle step-wise predictive errors independently, our approach can produce more realistic and robust forecasts.
arXiv Detail & Related papers (2022-08-05T09:36:55Z)
- coVariance Neural Networks [119.45320143101381]
Graph neural networks (GNNs) are an effective framework that exploits inter-relationships within graph-structured data for learning.
We propose a GNN architecture, called coVariance neural network (VNN), that operates on sample covariance matrices as graphs.
We show that VNN performance is indeed more stable than that of PCA-based statistical approaches; a single-layer sketch follows.
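As a companion to the sketch following the main abstract, the block below isolates a single coVariance filter layer: a polynomial in the sample covariance followed by a pointwise nonlinearity. Scalar taps and the tanh activation are simplifying assumptions of this sketch; actual layers use per-feature tap matrices.

```python
import numpy as np

def vnn_layer(X, C, H, activation=np.tanh):
    """One coVariance filter layer: sigma( sum_k H[k] * C^k @ X ).

    X : (N, F) input features per variable.
    C : (N, N) sample covariance, playing the role of the graph shift.
    H : (K,) scalar filter taps (a simplification).
    """
    Z, Y = X.copy(), np.zeros_like(X)
    for h in H:
        Y += h * Z   # tap k applied to C^k X
        Z = C @ Z    # next covariance power
    return activation(Y)
```

Since the layer only multiplies by C and never diagonalizes it, closely spaced eigenvalues do not destabilize the output the way they destabilize PCA eigenvectors.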
arXiv Detail & Related papers (2022-05-31T15:04:43Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Correlation-aware Unsupervised Change-point Detection via Graph Neural Networks [37.68666658785003]
Change-point detection (CPD) aims to detect abrupt changes over time series data.
Existing CPD methods either ignore the dependency structures entirely or rely on the (unrealistic) assumption that the correlation structures are static over time.
We propose a Correlation-aware Dynamics Model for CPD, which explicitly models the correlation structure and dynamics of variables.
arXiv Detail & Related papers (2020-04-24T18:28:57Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example in which THP achieves improved prediction performance when learning multiple point processes by incorporating their relational information; the classical Hawkes intensity that THP generalizes is recalled below.
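For context, a classical Hawkes process models the event intensity as a baseline rate plus self-excitation from past events; THP's contribution is to replace the fixed excitation kernel with self-attention over the event history. The standard exponential-kernel form (textbook material, not THP's exact parameterization) is:

```latex
% Each past event t_i < t raises the current event rate; its influence
% decays exponentially with the elapsed time t - t_i.
\lambda(t) = \mu + \sum_{t_i < t} \alpha \, e^{-\beta (t - t_i)}, \qquad \mu, \alpha, \beta > 0
```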
arXiv Detail & Related papers (2020-02-21T13:48:13Z)