Graph-Free Learning in Graph-Structured Data: A More Efficient and
Accurate Spatiotemporal Learning Perspective
- URL: http://arxiv.org/abs/2301.11742v2
- Date: Mon, 30 Jan 2023 01:34:09 GMT
- Authors: Xu Wang, Pengfei Gu, Pengkun Wang, Binwu Wang, Zhengyang Zhou, Lei
Bai, Yang Wang
- Abstract summary: This paper proposes a Graph-Free Spatial (GFS) learning module based on layer normalization for capturing spatial correlations in spatiotemporal learning.
Rigorous theoretical proof demonstrates that its time complexity is significantly better than that of the graph convolution operation.
- Score: 11.301939428860404
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spatiotemporal learning, which aims at extracting spatiotemporal correlations
from the collected spatiotemporal data, is a research hotspot in recent years.
And considering the inherent graph structure of spatiotemporal data, recent
works focus on capturing spatial dependencies by utilizing Graph Convolutional
Networks (GCNs) to aggregate vertex features with the guidance of adjacency
matrices. In this paper, with extensive and deep-going experiments, we
comprehensively analyze existing spatiotemporal graph learning models and
reveal that extracting adjacency matrices with carefully designed strategies,
which is viewed as the key to enhancing performance on graph learning, is
largely ineffective. Meanwhile, based on these experiments, we also discover
that the aggregation itself is more important than the way vertices are
aggregated. Building on these findings, we propose a novel efficient Graph-Free
Spatial (GFS) learning module based on layer normalization for capturing spatial
correlations in spatiotemporal graph learning. The proposed GFS module can be
easily plugged into existing models for replacing all graph convolution
components. Rigorous theoretical proof demonstrates that the time complexity of
GFS is significantly better than that of the graph convolution operation.
Extensive experiments verify the superiority of GFS in terms of both efficiency
and learning effect when processing graph-structured data, especially extremely
large-scale graph data.
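The abstract does not spell out the exact GFS formulation, so the following is only a minimal illustrative sketch. It contrasts standard graph-convolution aggregation, whose cost scales with the number of edges, against a hypothetical graph-free spatial step that applies layer normalization across the vertex axis (so each feature channel's statistics couple all vertices) and therefore scales only with the number of vertices. The function names and the choice of normalization axis are assumptions, not the paper's definition.

```python
import numpy as np

def gcn_aggregate(A, X):
    # Standard graph convolution aggregation: each vertex sums its
    # neighbors' features, guided by the adjacency matrix A.
    # Cost: O(|E| * d) for a sparse A, O(N^2 * d) for a dense one.
    return A @ X

def graph_free_spatial(X, eps=1e-5):
    # Hypothetical graph-free spatial step: layer normalization over the
    # vertex axis, needing no adjacency matrix at all. Cost: O(N * d).
    mean = X.mean(axis=0, keepdims=True)
    var = X.var(axis=0, keepdims=True)
    return (X - mean) / np.sqrt(var + eps)

N, d = 5, 4                                   # toy graph: 5 vertices, 4 features
rng = np.random.default_rng(0)
X = rng.random((N, d))                        # vertex feature matrix
A = (rng.random((N, N)) > 0.5).astype(float)  # random dense adjacency
out_gcn = gcn_aggregate(A, X)
out_gfs = graph_free_spatial(X)
assert out_gcn.shape == out_gfs.shape == (N, d)
```

Because the graph-free step never touches an adjacency matrix, it is a drop-in replacement for the aggregation component wherever per-vertex features are the only input, which is what makes the claimed speedup on extremely large graphs plausible.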
Related papers
- Learning From Graph-Structured Data: Addressing Design Issues and Exploring Practical Applications in Graph Representation Learning [2.492884361833709]
We present an exhaustive review of the latest advancements in graph representation learning and Graph Neural Networks (GNNs).
GNNs, tailored to handle graph-structured data, excel in deriving insights and predictions from intricate relational information.
Our work delves into the capabilities of GNNs, examining their foundational designs and their application in addressing real-world challenges.
arXiv Detail & Related papers (2024-11-09T19:10:33Z) - Amplify Graph Learning for Recommendation via Sparsity Completion [16.32861024767423]
Graph learning models have been widely deployed in collaborative filtering (CF) based recommendation systems.
Due to the issue of data sparsity, the graph structure of the original input lacks potential positive preference edges.
We propose an Amplify Graph Learning framework based on Sparsity Completion (called AGL-SC)
arXiv Detail & Related papers (2024-06-27T08:26:20Z) - Time-aware Graph Structure Learning via Sequence Prediction on Temporal
Graphs [10.034072706245544]
We propose a Time-aware Graph Structure Learning (TGSL) approach via sequence prediction on temporal graphs.
In particular, it predicts a time-aware context embedding and uses Gumbel-Top-K to select the candidate edges closest to this context embedding.
Experiments on temporal link prediction benchmarks demonstrate that TGSL yields significant gains for the popular TGNs such as TGAT and GraphMixer.
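The summary above names the Gumbel-Top-K trick without detail; the sketch below shows the generic version of that trick, sampling k items without replacement in proportion to their scores by perturbing log-scores with Gumbel noise and taking the k largest. The candidate-edge scores here are invented for illustration; TGSL's actual scoring against the context embedding is defined in the paper.

```python
import numpy as np

def gumbel_top_k(scores, k, rng=None):
    # Gumbel-Top-K: add i.i.d. Gumbel(0, 1) noise to the log-scores and
    # keep the indices of the k largest perturbed values. This samples k
    # distinct items with probabilities proportional to their scores.
    rng = np.random.default_rng() if rng is None else rng
    gumbel = -np.log(-np.log(rng.uniform(size=scores.shape)))
    return np.argsort(np.log(scores) + gumbel)[-k:][::-1]

# Hypothetical positive candidate-edge scores (e.g. similarities between
# candidate edges and a time-aware context embedding).
scores = np.array([0.1, 2.0, 0.5, 3.0, 0.2])
chosen = gumbel_top_k(scores, k=2, rng=np.random.default_rng(0))
assert len(chosen) == 2
```

Setting k=1 recovers the ordinary Gumbel-max trick; the top-k variant is what lets a structure learner pick several edges per step while keeping the selection stochastic.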
arXiv Detail & Related papers (2023-06-13T11:34:36Z) - Deep Temporal Graph Clustering [77.02070768950145]
We propose a general framework for deep Temporal Graph Clustering (TGC).
TGC introduces deep clustering techniques to suit the interaction sequence-based batch-processing pattern of temporal graphs.
Our framework can effectively improve the performance of existing temporal graph learning methods.
arXiv Detail & Related papers (2023-05-18T06:17:50Z) - A Comprehensive Survey on Graph Summarization with Graph Neural Networks [21.337505372979066]
In the past, most graph summarization techniques sought to capture the most important part of a graph statistically.
Today, the high dimensionality and complexity of modern graph data are making deep learning techniques more popular.
Our investigation includes a review of the current state-of-the-art approaches, including recurrent GNNs, convolutional GNNs, graph autoencoders, and graph attention networks.
arXiv Detail & Related papers (2023-02-13T05:43:24Z) - Data Augmentation for Deep Graph Learning: A Survey [66.04015540536027]
We first propose a taxonomy for graph data augmentation and then provide a structured review by categorizing the related work based on the augmented information modalities.
Focusing on the two challenging problems in DGL (i.e., optimal graph learning and low-resource graph learning), we also discuss and review the existing learning paradigms which are based on graph data augmentation.
arXiv Detail & Related papers (2022-02-16T18:30:33Z) - Learning Sparse and Continuous Graph Structures for Multivariate Time
Series Forecasting [5.359968374560132]
Learning Sparse and Continuous Graphs for Forecasting (LSCGF) is a novel deep learning model that joins graph learning and forecasting.
In this paper, we propose a brand new method named Smooth Sparse Unit (SSU) to learn sparse and continuous graph adjacency matrix.
Our model achieves state-of-the-art performance with few trainable parameters.
arXiv Detail & Related papers (2022-01-24T13:35:37Z) - Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z) - Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the view-similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z) - Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
arXiv Detail & Related papers (2020-10-19T21:51:47Z) - Iterative Deep Graph Learning for Graph Neural Networks: Better and
Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences.