Regularized Graph Structure Learning with Semantic Knowledge for
Multi-variates Time-Series Forecasting
- URL: http://arxiv.org/abs/2210.06126v1
- Date: Wed, 12 Oct 2022 12:38:21 GMT
- Title: Regularized Graph Structure Learning with Semantic Knowledge for
Multi-variates Time-Series Forecasting
- Authors: Hongyuan Yu, Ting Li, Weichen Yu, Jianguo Li, Yan Huang, Liang Wang,
Alex Liu
- Abstract summary: We propose the Regularized Graph Structure Learning (RGSL) model to incorporate both explicit prior structure and implicit structure together.
First, we derive an implicit dense similarity matrix through node embedding, and learn the sparse graph structure using the Regularized Graph Generation (RGG) based on the Gumbel Softmax trick.
Second, we propose a Laplacian Matrix Mixed-up Module (LM3) to fuse the explicit graph and implicit graph together.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multivariate time-series forecasting is a critical task for many
applications, and graph time-series networks are widely studied for their
capability to capture spatial-temporal correlations simultaneously. However,
most existing works focus more on learning with the explicit prior graph
structure, while ignoring potential information from the implicit graph
structure, yielding incomplete structure modeling. Some recent works attempt to
learn the intrinsic or implicit graph structure directly while lacking a way to
combine explicit prior structure with implicit structure together. In this
paper, we propose the Regularized Graph Structure Learning (RGSL) model to
incorporate both explicit prior structure and implicit structure together, and
learn the forecasting deep networks along with the graph structure. RGSL
consists of two innovative modules. First, we derive an implicit dense
similarity matrix through node embedding, and learn the sparse graph structure
using the Regularized Graph Generation (RGG) based on the Gumbel Softmax trick.
Second, we propose a Laplacian Matrix Mixed-up Module (LM3) to fuse the
explicit graph and implicit graph together. We conduct experiments on three
real-world datasets. Results show that the proposed RGSL model outperforms
existing graph forecasting algorithms by a notable margin, while learning
meaningful graph structure simultaneously. Our code and models are made
publicly available at https://github.com/alipay/RGSL.git.
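The two modules can be sketched in NumPy. This is a minimal illustration, not the paper's exact formulation: the embedding dimensions, temperature tau, mixing weight lam, and the random stand-in for the explicit prior graph are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax_edges(scores, tau=0.5):
    """Relaxed per-edge keep/drop sampling via the Gumbel-Softmax trick.

    scores: (N, N) edge logits from node-embedding similarity.
    Returns an (N, N) matrix of near-binary keep probabilities.
    """
    # Two-class logits per candidate edge: [keep_score, drop_score=0]
    logits = np.stack([scores, np.zeros_like(scores)], axis=-1)
    gumbel = -np.log(-np.log(rng.uniform(1e-9, 1.0 - 1e-9, size=logits.shape)))
    y = (logits + gumbel) / tau
    y = np.exp(y - y.max(axis=-1, keepdims=True))
    y = y / y.sum(axis=-1, keepdims=True)
    return y[..., 0]

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

# Implicit graph: dense similarity from node embeddings, then sparsified (RGG)
n, d = 5, 8
emb = rng.normal(size=(n, d))          # learnable node embeddings
sim = emb @ emb.T                      # dense implicit similarity matrix
adj_implicit = gumbel_softmax_edges(sim, tau=0.3)

# Explicit prior graph: a random binary stand-in for, e.g., a road network
adj_explicit = (rng.uniform(size=(n, n)) > 0.7).astype(float)

# LM3-style fusion: mix the two Laplacians with a weight lam (hypothetical)
lam = 0.5
l_mixed = lam * laplacian(adj_explicit) + (1.0 - lam) * laplacian(adj_implicit)
```

The mixed Laplacian `l_mixed` would then feed the graph-convolutional forecasting network in place of a single fixed Laplacian.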
Related papers
- GraphLSS: Integrating Lexical, Structural, and Semantic Features for Long Document Extractive Summarization [19.505955857963855]
We present GraphLSS, a heterogeneous graph construction for long document extractive summarization.
It defines two levels of information (words and sentences) and four types of edges (sentence semantic similarity, sentence occurrence order, word in sentence, and word semantic similarity) without any need for auxiliary learning models.
arXiv Detail & Related papers (2024-10-25T23:48:59Z) - GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z) - Sparsity exploitation via discovering graphical models in multi-variate
time-series forecasting [1.2762298148425795]
We propose a decoupled training method, which includes a graph generating module and a GNNs forecasting module.
First, we use Graphical Lasso (or GraphLASSO) to directly exploit the sparsity pattern from data to build graph structures.
Second, we fit these graph structures and the input data into a Graph Convolutional Recurrent Network (GCRN) to train a forecasting model.
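The first stage of that decoupled pipeline can be sketched with scikit-learn's `GraphicalLasso`. The penalty `alpha`, the threshold, and the toy data are illustrative assumptions, and the GCRN forecasting stage is omitted:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Toy multivariate series: 200 time steps, 6 variables,
# with one induced dependency between variables 0 and 1
x = rng.normal(size=(200, 6))
x[:, 1] += 0.8 * x[:, 0]

# Fit a sparse precision (inverse covariance) matrix
gl = GraphicalLasso(alpha=0.2).fit(x)
prec = gl.precision_

# Read the sparsity pattern off as an adjacency matrix
adj = (np.abs(prec) > 1e-4).astype(float)
np.fill_diagonal(adj, 0.0)
```

Nonzero off-diagonal entries of the precision matrix mark conditionally dependent variable pairs, so `adj` serves as the learned graph structure passed to the GNN forecaster.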
arXiv Detail & Related papers (2023-06-29T16:48:00Z) - GraphGLOW: Universal and Generalizable Structure Learning for Graph
Neural Networks [72.01829954658889]
This paper introduces a mathematical definition of this novel problem setting: learning a graph structure learner that generalizes to unseen graphs.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z) - GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase it models the distribution of features associated with the nodes of the given graph; in the second it generates the edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z) - Compositionality-Aware Graph2Seq Learning [2.127049691404299]
Compositionality in a graph can be associated with compositionality in the output sequence in many graph2seq tasks.
We adopt the multi-level attention pooling (MLAP) architecture, that can aggregate graph representations from multiple levels of information localities.
We demonstrate that a model with the MLAP architecture outperforms the previous state-of-the-art model while using more than seven times fewer parameters.
arXiv Detail & Related papers (2022-01-28T15:22:39Z) - Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object, and SPair-71k).
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z) - Discrete Graph Structure Learning for Forecasting Multiple Time Series [14.459541930646205]
Time series forecasting is an extensively studied subject in statistics, economics, and computer science.
In this work, we propose learning the structure simultaneously with a graph neural network (GNN) if the graph is unknown.
Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning.
arXiv Detail & Related papers (2021-01-18T03:36:33Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.