Sparse Graph Learning from Spatiotemporal Time Series
- URL: http://arxiv.org/abs/2205.13492v3
- Date: Wed, 2 Aug 2023 11:02:52 GMT
- Title: Sparse Graph Learning from Spatiotemporal Time Series
- Authors: Andrea Cini, Daniele Zambon, Cesare Alippi
- Abstract summary: We propose a graph learning framework that learns the relational dependencies as distributions over graphs.
We show that the proposed solution can be used as a stand-alone graph identification procedure as well as a graph learning component of an end-to-end forecasting architecture.
- Score: 16.427698929775023
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Outstanding achievements of graph neural networks for spatiotemporal time
series analysis show that relational constraints introduce an effective
inductive bias into neural forecasting architectures. Often, however, the
relational information characterizing the underlying data-generating process is
unavailable and the practitioner is left with the problem of inferring from
data which relational graph to use in the subsequent processing stages. We
propose novel, principled - yet practical - probabilistic score-based methods
that learn the relational dependencies as distributions over graphs while
maximizing end-to-end the performance at task. The proposed graph learning
framework is based on consolidated variance reduction techniques for Monte
Carlo score-based gradient estimation, is theoretically grounded, and, as we
show, effective in practice. In this paper, we focus on the time series
forecasting problem and show that, by tailoring the gradient estimators to the
graph learning problem, we are able to achieve state-of-the-art performance
while controlling the sparsity of the learned graph and the computational
scalability. We empirically assess the effectiveness of the proposed method on
synthetic and real-world benchmarks, showing that the proposed solution can be
used as a stand-alone graph identification procedure as well as a graph
learning component of an end-to-end forecasting architecture.
Related papers
- Learning Latent Graph Structures and their Uncertainty [63.95971478893842]
Graph Neural Networks (GNNs) use relational information as an inductive bias to enhance the model's accuracy.
As task-relevant relations might be unknown, graph structure learning approaches have been proposed to learn them while solving the downstream prediction task.
arXiv Detail & Related papers (2024-05-30T10:49:22Z)
- Graph Deep Learning for Time Series Forecasting [28.30604130617646]

Graph-based deep learning methods have become popular tools to process collections of correlated time series.
This paper aims to introduce a comprehensive methodological framework that formalizes the forecasting problem and provides design principles for graph-based predictive models and methods to assess their performance.
arXiv Detail & Related papers (2023-10-24T16:26:38Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
- A Complex Network based Graph Embedding Method for Link Prediction [0.0]
We present a novel graph embedding approach based on the popularity-similarity and local attraction paradigms.
We show, using extensive experimental analysis, that the proposed method outperforms state-of-the-art graph embedding algorithms.
arXiv Detail & Related papers (2022-09-11T14:46:38Z)
- Robust Causal Graph Representation Learning against Confounding Effects [21.380907101361643]
We propose Robust Causal Graph Representation Learning (RCGRL) to learn robust graph representations against confounding effects.
RCGRL introduces an active approach to generate instrumental variables under unconditional moment restrictions, which empowers the graph representation learning model to eliminate confounders.
arXiv Detail & Related papers (2022-08-18T01:31:25Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
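A low-rank approximation of the kind mentioned above can be sketched with a truncated SVD. This is a generic illustration, not the paper's model: the symmetric matrix `S` stands in for a learned propagation structure, and the rank `r` and sizes are assumed values.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 100, 5                           # nodes and target rank (assumed values)

# Dense symmetric matrix standing in for a learned propagation structure
S = rng.normal(size=(n, n))
S = (S + S.T) / 2

# Truncated SVD keeps only the r largest singular triplets
U, s, Vt = np.linalg.svd(S)
S_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Propagating d-dimensional features through the factors costs O(n*r*d)
# instead of the O(n^2 * d) of the dense product S @ X
X = rng.normal(size=(n, 16))
Y = U[:, :r] @ (np.diag(s[:r]) @ (Vt[:r, :] @ X))
```

By the Eckart-Young theorem, `S_r` is the best rank-`r` approximation of `S` in spectral norm, with error equal to the (r+1)-th singular value; the factored product `Y` matches `S_r @ X` while never materializing the dense approximation.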
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
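One common way to realize such distributional node representations is to parameterize each node by a Gaussian and sample embeddings via the reparameterization trick. This sketch illustrates that idea only; the dimensions and the diagonal-Gaussian choice are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 10, 8                            # nodes and latent dimension (assumed)

# Each node i is represented by a Gaussian N(mu[i], diag(exp(log_sigma[i])^2))
# rather than a single deterministic vector
mu = rng.normal(size=(n, d))
log_sigma = 0.1 * rng.normal(size=(n, d))

# Reparameterization: z = mu + sigma * eps, so gradients can flow
# to mu and log_sigma through sampled embeddings
eps = rng.normal(size=(n, d))
z = mu + np.exp(log_sigma) * eps
```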
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method recovers a representative graph whose structure reflects the underlying clusters, and that it is effective for solving downstream clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Learning an Interpretable Graph Structure in Multi-Task Learning [18.293397644865454]
We present a novel methodology to jointly perform multi-task learning and infer intrinsic relationship among tasks by an interpretable and sparse graph.
Our graph is learned simultaneously with model parameters of each task, thus it reflects the critical relationship among tasks in the specific prediction problem.
arXiv Detail & Related papers (2020-09-11T18:58:14Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.