Graph-Free Learning in Graph-Structured Data: A More Efficient and
Accurate Spatiotemporal Learning Perspective
- URL: http://arxiv.org/abs/2301.11742v2
- Date: Mon, 30 Jan 2023 01:34:09 GMT
- Title: Graph-Free Learning in Graph-Structured Data: A More Efficient and
Accurate Spatiotemporal Learning Perspective
- Authors: Xu Wang, Pengfei Gu, Pengkun Wang, Binwu Wang, Zhengyang Zhou, Lei
Bai, Yang Wang
- Abstract summary: This paper proposes a Graph-Free Spatial (GFS) learning module based on layer normalization for capturing spatial correlations in spatiotemporal graph learning.
A rigorous theoretical proof demonstrates that its time complexity is significantly better than that of the graph convolution operation.
- Score: 11.301939428860404
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spatiotemporal learning, which aims at extracting spatiotemporal correlations
from collected spatiotemporal data, has been a research hotspot in recent years.
Considering the inherent graph structure of spatiotemporal data, recent
works focus on capturing spatial dependencies by utilizing Graph Convolutional
Networks (GCNs) to aggregate vertex features under the guidance of adjacency
matrices. In this paper, through extensive and in-depth experiments, we
comprehensively analyze existing spatiotemporal graph learning models and
reveal that extracting adjacency matrices with carefully designed strategies,
which is widely viewed as the key to enhancing performance in graph learning, is
largely ineffective. Based on these experiments, we also discover that the
aggregation itself matters more than the way vertices are aggregated. Motivated
by these findings, we propose a novel, efficient Graph-Free Spatial (GFS)
learning module based on layer normalization for capturing spatial correlations
in spatiotemporal graph learning. The proposed GFS module can be easily plugged
into existing models to replace all graph convolution components. A rigorous
theoretical proof demonstrates that the time complexity of GFS is significantly
better than that of the graph convolution operation. Extensive experiments
verify the superiority of GFS, in terms of both efficiency and learning effect,
in processing graph-structured data, especially extremely large-scale graph
data.
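The abstract does not spell out the GFS formulation, but the contrast it draws can be sketched: a graph convolution multiplies features by an N×N adjacency matrix, while a layer-normalization-based module operates per vertex with no adjacency at all. The sketch below is an illustrative assumption, not the paper's implementation (function names, shapes, and the exact normalization are made up); it shows why dropping the adjacency product removes the O(N²·d) term.

```python
import numpy as np

def gcn_layer(X, A, W):
    """Standard graph convolution ReLU(A X W): the dense A @ X product
    alone costs O(N^2 * d)."""
    return np.maximum(A @ X @ W, 0.0)

def graph_free_spatial(X, W, eps=1e-5):
    """A hypothetical GFS-style module: layer normalization over the
    feature dimension followed by a linear map. No adjacency matrix is
    involved, so the cost is linear in the number of vertices N."""
    mean = X.mean(axis=-1, keepdims=True)
    std = X.std(axis=-1, keepdims=True)
    X_norm = (X - mean) / (std + eps)
    return np.maximum(X_norm @ W, 0.0)

N, d = 207, 64                        # e.g. a 207-sensor traffic graph
X = np.random.randn(N, d)             # vertex features
W = np.random.randn(d, d) / np.sqrt(d)
A = np.eye(N)                         # placeholder adjacency matrix
assert gcn_layer(X, A, W).shape == graph_free_spatial(X, W).shape
```

Because the module is adjacency-free, it can replace graph convolution components without any change to how the rest of a spatiotemporal model consumes the (N, d) output.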
Related papers
- Time-aware Graph Structure Learning via Sequence Prediction on Temporal
Graphs [10.034072706245544]
We propose a Time-aware Graph Structure Learning (TGSL) approach via sequence prediction on temporal graphs.
In particular, it predicts a time-aware context embedding and uses Gumbel-Top-K to select the candidate edges closest to this context embedding.
Experiments on temporal link prediction benchmarks demonstrate that TGSL yields significant gains for the popular TGNs such as TGAT and GraphMixer.
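The Gumbel-Top-K trick mentioned above is a standard way to sample k items without replacement, with probabilities proportional to the softmax of their scores: add i.i.d. Gumbel noise to the scores and take the top k. A minimal sketch (the score values and k are made up; TGSL's actual edge-scoring network is not reproduced here):

```python
import numpy as np

def gumbel_top_k(scores, k, rng):
    """Sample k indices without replacement, with probability
    proportional to softmax(scores), via the Gumbel-Top-K trick."""
    gumbel = -np.log(-np.log(rng.uniform(size=scores.shape)))
    return np.argsort(scores + gumbel)[::-1][:k]  # descending order

rng = np.random.default_rng(0)
# Hypothetical similarity scores between a time-aware context embedding
# and 10 candidate edges (higher = closer to the context).
scores = np.array([2.0, 0.1, 1.5, -0.3, 0.8, 2.2, -1.0, 0.0, 1.1, 0.4])
chosen = gumbel_top_k(scores, k=3, rng=rng)
print(chosen)
```

Unlike a hard top-k, this keeps edge selection stochastic, which is what makes it usable inside a training loop (typically with a relaxed/differentiable variant).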
arXiv Detail & Related papers (2023-06-13T11:34:36Z) - Deep Temporal Graph Clustering [77.02070768950145]
We propose a general framework for deep Temporal Graph Clustering (TGC).
TGC introduces deep clustering techniques to suit the interaction-sequence-based batch-processing pattern of temporal graphs.
Our framework can effectively improve the performance of existing temporal graph learning methods.
arXiv Detail & Related papers (2023-05-18T06:17:50Z) - A Comprehensive Survey on Graph Summarization with Graph Neural Networks [21.337505372979066]
In the past, most graph summarization techniques sought to capture the most important part of a graph statistically.
Today, the high dimensionality and complexity of modern graph data are making deep learning techniques more popular.
Our investigation includes a review of the current state-of-the-art approaches, including recurrent GNNs, convolutional GNNs, graph autoencoders, and graph attention networks.
arXiv Detail & Related papers (2023-02-13T05:43:24Z) - Data Augmentation for Deep Graph Learning: A Survey [66.04015540536027]
We first propose a taxonomy for graph data augmentation and then provide a structured review by categorizing the related work based on the augmented information modalities.
Focusing on the two challenging problems in DGL (i.e., optimal graph learning and low-resource graph learning), we also discuss and review the existing learning paradigms which are based on graph data augmentation.
arXiv Detail & Related papers (2022-02-16T18:30:33Z) - Learning Sparse and Continuous Graph Structures for Multivariate Time
Series Forecasting [5.359968374560132]
Learning Sparse and Continuous Graphs for Forecasting (LSCGF) is a novel deep learning model that joins graph learning and forecasting.
In this paper, we propose a new method named Smooth Sparse Unit (SSU) to learn a sparse and continuous graph adjacency matrix.
Our model achieves state-of-the-art performance with few trainable parameters.
arXiv Detail & Related papers (2022-01-24T13:35:37Z) - Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
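The anchor-graph idea pairs each node's embedding from the anchor graph with its embedding from the learned graph as a positive pair, and treats other nodes as negatives. A common instantiation of such an agreement objective is an NT-Xent-style contrastive loss; the sketch below assumes that form (the paper's exact loss and encoders may differ):

```python
import numpy as np

def contrastive_agreement_loss(z_anchor, z_learned, tau=0.5):
    """NT-Xent-style loss: for each node, maximize similarity between
    its anchor-graph and learned-graph embeddings (the diagonal of the
    similarity matrix) relative to all other nodes."""
    def l2norm(z):
        return z / np.linalg.norm(z, axis=1, keepdims=True)
    za, zl = l2norm(z_anchor), l2norm(z_learned)
    sim = za @ zl.T / tau                        # (N, N) cosine / tau
    sim = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # positives on diagonal

rng = np.random.default_rng(0)
z_anchor = rng.normal(size=(8, 16))              # anchor-graph embeddings
z_learned = z_anchor + 0.1 * rng.normal(size=(8, 16))
loss = contrastive_agreement_loss(z_anchor, z_learned)
```

Minimizing this loss pulls each node's two views together, which is what lets the graph topology be optimized by the data itself, without external labels.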
arXiv Detail & Related papers (2022-01-17T11:57:29Z) - Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views via minimization of the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
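For reference, the Schatten p-norm of a matrix is the l_p norm of its singular values (p = 1 gives the nuclear norm, a standard low-rank surrogate). The paper applies a tensor generalization, but the matrix case conveys the idea:

```python
import numpy as np

def schatten_p_norm(M, p=1.0):
    """Schatten p-norm: the l_p norm of the singular values of M.
    p=1 is the nuclear norm; p=2 is the Frobenius norm."""
    s = np.linalg.svd(M, compute_uv=False)
    return float((s ** p).sum() ** (1.0 / p))

M = np.diag([3.0, 4.0])               # singular values are 4 and 3
print(schatten_p_norm(M, p=1.0))      # → 7.0 (nuclear norm)
print(schatten_p_norm(M, p=2.0))      # → 5.0 (Frobenius norm)
```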
arXiv Detail & Related papers (2021-08-15T13:14:28Z) - Hierarchical Adaptive Pooling by Capturing High-order Dependency for
Graph Representation Learning [18.423192209359158]
Graph neural networks (GNNs) have proven mature enough for handling graph-structured data in node-level graph representation learning tasks.
This paper proposes a hierarchical graph-level representation learning framework, which is adaptively sensitive to graph structures.
arXiv Detail & Related papers (2021-04-13T06:22:24Z) - Graph-Time Convolutional Neural Networks [9.137554315375919]
We represent spatial relationships through product graphs with a first-principles graph-time convolutional neural network (GTCNN).
We develop a graph-time convolutional filter by following the shift-and-sum temporal operator to learn higher-level features over the product graph.
We develop a zero-pad pooling that preserves the spatial graph while reducing the number of active nodes and the parameters.
arXiv Detail & Related papers (2021-03-02T14:03:44Z) - Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
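FLAG's inner loop follows the familiar projected-gradient-ascent recipe: repeatedly perturb node features along the sign of the loss gradient during training. A toy sketch (the step size, iteration count, and quadratic loss are illustrative, not the paper's settings):

```python
import numpy as np

def flag_perturb(X, grad_fn, step=1e-3, ascent_steps=3):
    """FLAG-style augmentation sketch: start from a small random
    perturbation of the node features, then take a few sign-gradient
    ascent steps on the training loss. grad_fn(X_pert) returns dL/dX."""
    delta = np.random.uniform(-step, step, size=X.shape)
    for _ in range(ascent_steps):
        g = grad_fn(X + delta)
        delta = delta + step * np.sign(g)   # ascend the loss
    return X + delta

# Toy quadratic loss L = 0.5 * ||X||^2, so dL/dX = X.
X = np.zeros((4, 3))                        # 4 nodes, 3 features
X_aug = flag_perturb(X, grad_fn=lambda Xp: Xp)
assert X_aug.shape == X.shape
```

In actual training, the model would be updated on the loss at each perturbed point, which is what makes the augmentation "free" relative to standard adversarial training.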
arXiv Detail & Related papers (2020-10-19T21:51:47Z) - Iterative Deep Graph Learning for Graph Neural Networks: Better and
Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.