Inductive Graph Neural Networks for Spatiotemporal Kriging
- URL: http://arxiv.org/abs/2006.07527v2
- Date: Sat, 19 Dec 2020 16:09:16 GMT
- Title: Inductive Graph Neural Networks for Spatiotemporal Kriging
- Authors: Yuankai Wu, Dingyi Zhuang, Aurelie Labbe and Lijun Sun
- Abstract summary: We develop an inductive graph neural network model to recover data for unsampled sensors on a network/graph structure.
Empirical results on several real-world spatiotemporal datasets demonstrate the effectiveness of our model.
- Score: 13.666589510218738
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting and spatiotemporal kriging are the two most important
tasks in spatiotemporal data analysis. Recent research on graph neural networks
has made substantial progress in time series forecasting, while little
attention has been paid to the kriging problem -- recovering signals for
unsampled locations/sensors. Most existing scalable kriging methods (e.g.,
matrix/tensor completion) are transductive, and thus full retraining is
required when we have a new sensor to interpolate. In this paper, we develop an
Inductive Graph Neural Network Kriging (IGNNK) model to recover data for
unsampled sensors on a network/graph structure. To generalize the effect of
distance and reachability, we generate random subgraphs as samples and
reconstruct the corresponding adjacency matrix for each sample. By
reconstructing all signals on each sample subgraph, IGNNK can effectively learn
the spatial message passing mechanism. Empirical results on several real-world
spatiotemporal datasets demonstrate the effectiveness of our model. In
addition, we also find that the learned model can be successfully transferred
to the same type of kriging tasks on an unseen dataset. Our results show that:
1) GNN is an efficient and effective tool for spatial kriging; 2) inductive
GNNs can be trained using dynamic adjacency matrices; 3) a trained model can be
transferred to new graph structures; and 4) IGNNK can be used to generate
virtual sensors.
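The training procedure sketched in the abstract (sample a random subgraph, rebuild its adjacency matrix, mask some sensors as "unsampled", and reconstruct all signals) can be illustrated with a minimal numpy sketch. This is not the authors' code; function names, the Gaussian distance kernel, and all parameters are illustrative assumptions.

```python
import numpy as np

def sample_training_batch(X, coords, n_sub, n_masked, rng):
    """Sketch of an IGNNK-style training sample (illustrative, not the
    paper's implementation).

    X        : (num_sensors, T) signal matrix
    coords   : (num_sensors, 2) sensor locations
    n_sub    : number of sensors in the random subgraph
    n_masked : how many of those play the role of unsampled sensors
    """
    # Draw a random subset of sensors as one training sample.
    idx = rng.choice(X.shape[0], size=n_sub, replace=False)
    sub_X, sub_coords = X[idx], coords[idx]
    # Reconstruct the adjacency matrix of the subsample from pairwise
    # distances (a Gaussian kernel is one common distance-based choice).
    d = np.linalg.norm(sub_coords[:, None] - sub_coords[None, :], axis=-1)
    A = np.exp(-((d / d.std()) ** 2))
    # Mask the "unsampled" sensors: the model sees zeros for them but is
    # trained to reconstruct ALL signals, masked ones included.
    mask = np.ones(n_sub)
    mask[rng.choice(n_sub, size=n_masked, replace=False)] = 0.0
    X_in = sub_X * mask[:, None]
    return X_in, A, sub_X, mask
```

Because each batch uses a fresh subgraph with its own adjacency matrix, a model trained this way never ties its weights to one fixed graph, which is what makes the approach inductive.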
Related papers
- Joint Edge-Model Sparse Learning is Provably Efficient for Graph Neural
Networks [89.28881869440433]
This paper provides the first theoretical characterization of joint edge-model sparse learning for graph neural networks (GNNs).
It proves analytically that both sampling important nodes and pruning the lowest-magnitude neurons can reduce the sample complexity and improve convergence without compromising the test accuracy.
arXiv Detail & Related papers (2023-02-06T16:54:20Z) - Energy-based Out-of-Distribution Detection for Graph Neural Networks [76.0242218180483]
We propose a simple, powerful and efficient OOD detection model for GNN-based learning on graphs, which we call GNNSafe.
GNNSafe achieves up to 17.0% AUROC improvement over state-of-the-art methods and can serve as a simple yet strong baseline in this under-developed area.
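The core quantity in this family of energy-based detectors is the negative log-sum-exp of the classifier's logits, with lower energy indicating a more in-distribution input. The sketch below shows only that generic score; GNNSafe additionally propagates energies over the graph, which is not reproduced here.

```python
import numpy as np

def energy_score(logits, T=1.0):
    """Energy of a classifier's logits: E(x) = -T * logsumexp(logits / T).
    Lower energy -> more in-distribution; a detector thresholds this value.
    Generic energy-based OOD score only; the graph propagation step of
    GNNSafe is omitted.
    """
    logits = np.asarray(logits, dtype=float)
    return -T * np.log(np.sum(np.exp(logits / T), axis=-1))
```

A confidently classified sample (one dominant logit) gets lower energy than a sample with flat logits, which is the signal the detector exploits.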
arXiv Detail & Related papers (2023-02-06T16:38:43Z) - Graph Neural Networks with Trainable Adjacency Matrices for Fault
Diagnosis on Multivariate Sensor Data [69.25738064847175]
It is necessary to consider the behavior of the signals in each sensor separately, to take into account their correlation and hidden relationships with each other.
The graph nodes can be represented as data from the different sensors, and the edges can display the influence of these data on each other.
The paper proposes constructing the graph during training of the graph neural network, which allows models to be trained on data where the dependencies between the sensors are not known in advance.
arXiv Detail & Related papers (2022-10-20T11:03:21Z) - Invertible Neural Networks for Graph Prediction [22.140275054568985]
In this work, we address conditional generation using deep invertible neural networks.
We adopt an end-to-end training approach since our objective is to address prediction and generation in the forward and backward processes at once.
arXiv Detail & Related papers (2022-06-02T17:28:33Z) - Learning to Reconstruct Missing Data from Spatiotemporal Graphs with
Sparse Observations [11.486068333583216]
This paper tackles the problem of learning effective models to reconstruct missing data points.
We propose a class of attention-based architectures, that given a set of highly sparse observations, learn a representation for points in time and space.
Compared to the state of the art, our model handles sparse data without propagating prediction errors or requiring a bidirectional model to encode forward and backward time dependencies.
arXiv Detail & Related papers (2022-05-26T16:40:48Z) - Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling
and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z) - Spatial Aggregation and Temporal Convolution Networks for Real-time
Kriging [3.4386226615580107]
SATCN is a universal and flexible framework to perform spatiotemporal kriging for various datasets without the need for model specification.
Temporal dependencies are captured by temporal convolutional networks, which allows the model to cope with data of diverse sizes.
We conduct extensive experiments on three real-world datasets, including traffic and climate recordings.
arXiv Detail & Related papers (2021-09-24T18:43:07Z) - Online Graph Topology Learning from Matrix-valued Time Series [0.0]
The focus is on the statistical analysis of matrix-valued time series, where data is collected over a network of sensors.
The goal is to identify the dependency structure among these sensors and represent it with a graph.
Online algorithms are adapted to these augmented data models, allowing for simultaneous learning of the graph and trend from streaming samples.
arXiv Detail & Related papers (2021-07-16T17:21:14Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive
Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.