Learning to Reconstruct Missing Data from Spatiotemporal Graphs with
Sparse Observations
- URL: http://arxiv.org/abs/2205.13479v1
- Date: Thu, 26 May 2022 16:40:48 GMT
- Title: Learning to Reconstruct Missing Data from Spatiotemporal Graphs with
Sparse Observations
- Authors: Ivan Marisca, Andrea Cini, Cesare Alippi
- Abstract summary: This paper tackles the problem of learning effective models to reconstruct missing data points.
We propose a class of attention-based architectures that, given a set of highly sparse observations, learn a representation for points in time and space.
Compared to the state of the art, our model handles sparse data without propagating prediction errors or requiring a bidirectional model to encode forward and backward time dependencies.
- Score: 11.486068333583216
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling multivariate time series as temporal signals over a (possibly
dynamic) graph is an effective representational framework that allows for
developing models for time series analysis. In fact, discrete sequences of
graphs can be processed by autoregressive graph neural networks to recursively
learn representations at each discrete point in time and space. Spatiotemporal
graphs are often highly sparse, with time series characterized by multiple,
concurrent, and even long sequences of missing data, e.g., due to the
unreliable underlying sensor network. In this context, autoregressive models
can be brittle and exhibit unstable learning dynamics. The objective of this
paper is, then, to tackle the problem of learning effective models to
reconstruct, i.e., impute, missing data points by conditioning the
reconstruction only on the available observations. In particular, we propose a
novel class of attention-based architectures that, given a set of highly sparse
discrete observations, learn a representation for points in time and space by
exploiting a spatiotemporal diffusion architecture aligned with the imputation
task. Representations are trained end-to-end to reconstruct observations w.r.t.
the corresponding sensor and its neighboring nodes. Compared to the state of
the art, our model handles sparse data without propagating prediction errors or
requiring a bidirectional model to encode forward and backward time
dependencies. Empirical results on representative benchmarks show the
effectiveness of the proposed method.
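The abstract above outlines the core mechanism: attention over the sparse set of available observations, first along time for each sensor and then across neighboring nodes of the graph, with the learned representations read out to reconstruct the signal. Below is a minimal, illustrative PyTorch sketch of such a masked spatiotemporal attention layer. It is not the authors' released implementation; the class name, tensor layout, and dimensions are assumptions made purely for exposition.

```python
# Illustrative sketch only (not the paper's code): one layer of masked
# spatiotemporal attention for imputation, conditioned on observed entries.
import torch.nn as nn


class SparseSpatioTemporalAttention(nn.Module):
    """Attends over observed time steps per node, then over graph neighbors."""

    def __init__(self, d_in: int, d_model: int, n_heads: int = 4):
        super().__init__()
        self.encoder = nn.Linear(d_in, d_model)
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.spatial_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.readout = nn.Linear(d_model, d_in)

    def forward(self, x, mask, adj):
        # x:    (batch, nodes, steps, d_in) observations, zeros where missing
        # mask: (batch, nodes, steps) bool, True where a value was observed
        # adj:  (nodes, nodes) bool, True where an edge exists (incl. self-loops)
        # Assumes every node has at least one observed step in the window.
        b, n, t, _ = x.shape
        h = self.encoder(x)                                   # (b, n, t, d)

        # Temporal attention per node: queries are all steps, but keys/values
        # are restricted to observed steps via key_padding_mask.
        h_t = h.reshape(b * n, t, -1)
        obs = mask.reshape(b * n, t)
        h_t, _ = self.temporal_attn(h_t, h_t, h_t,
                                    key_padding_mask=~obs)    # ignore missing steps
        h = h_t.reshape(b, n, t, -1)

        # Spatial attention per time step: each node attends only to neighbors.
        h_s = h.permute(0, 2, 1, 3).reshape(b * t, n, -1)
        h_s, _ = self.spatial_attn(h_s, h_s, h_s,
                                   attn_mask=~adj)            # block non-neighbors
        h = h_s.reshape(b, t, n, -1).permute(0, 2, 1, 3)

        return self.readout(h)                                # reconstructed signal


# Example usage (shapes are arbitrary):
# layer = SparseSpatioTemporalAttention(d_in=1, d_model=32)
# x_hat = layer(x, mask, adj)   # reconstructs every (node, step) entry
```

In a setup like this, training would minimize the reconstruction error only on entries that are actually observed (or on observations held out from the input mask), so the model is never conditioned on, or supervised by, values it cannot see.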
Related papers
- Multivariate Time-Series Anomaly Detection based on Enhancing Graph Attention Networks with Topological Analysis [31.43159668073136]
Unsupervised anomaly detection in time series is essential in industrial applications, as it significantly reduces the need for manual intervention.
Traditional methods use Graph Neural Networks (GNNs) or Transformers to analyze spatial dependencies, while RNNs model temporal dependencies.
This paper introduces TopoGDN, a temporal model built on an enhanced Graph Attention Network (GAT) for multivariate time series anomaly detection.
arXiv Detail & Related papers (2024-08-23T14:06:30Z) - Graph-based Forecasting with Missing Data through Spatiotemporal Downsampling [24.368893944128086]
Spatiotemporal graph neural networks achieve striking results by representing relationships across time series as a graph.
Most existing methods rely on the often unrealistic assumption that inputs are always available and fail to capture hidden dynamics when part of the data is missing.
arXiv Detail & Related papers (2024-02-16T12:33:31Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Networked Time Series Imputation via Position-aware Graph Enhanced
Variational Autoencoders [31.953958053709805]
We design a new model named PoGeVon, which leverages a variational autoencoder (VAE) to predict missing values over both node time series features and graph structures.
Experiment results demonstrate the effectiveness of our model over baselines.
arXiv Detail & Related papers (2023-05-29T21:11:34Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, Denoising Diffusion Probabilistic Models (DDPMs), for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive; it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Temporal Graph Neural Networks for Irregular Data [14.653008985229615]
The TGNN4I model is designed to handle both irregular time steps and partial observations of the graph.
Its time-continuous dynamics enable the model to make predictions at arbitrary time steps.
Experiments on simulated data and real-world data from traffic and climate modeling validate the usefulness of both the graph structure and time-continuous dynamics.
arXiv Detail & Related papers (2023-02-16T16:47:55Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Multivariate Time Series Imputation by Graph Neural Networks [13.308026049048717]
We introduce a graph neural network architecture, named GRIL, which aims at reconstructing missing data in different channels of a multivariate time series.
Preliminary results show that our model outperforms state-of-the-art methods in the imputation task on relevant benchmarks.
arXiv Detail & Related papers (2021-07-31T17:47:10Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential
Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)