Framelet Message Passing
- URL: http://arxiv.org/abs/2302.14806v1
- Date: Tue, 28 Feb 2023 17:56:19 GMT
- Title: Framelet Message Passing
- Authors: Xinliang Liu, Bingxin Zhou, Chutian Zhang, Yu Guang Wang
- Abstract summary: We propose a new message passing scheme based on multiscale framelet transforms, called Framelet Message Passing.
It integrates framelet representations of neighbor nodes from multiple hops away into the node message update.
We also propose a continuous message passing scheme using neural ODE solvers.
- Score: 2.479720095773358
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) have achieved remarkable success in a wide
range of applications. Neural message passing is a key module for feature
propagation, aggregating features from neighboring nodes. In this work, we propose
a new message passing scheme based on multiscale framelet transforms, called
Framelet Message Passing. Unlike traditional spatial methods, it integrates the
framelet representation of neighbor nodes from multiple hops away into the node
message update. We also propose a continuous message passing scheme using neural
ODE solvers. It turns out that both the discrete and continuous cases provably
achieve network stability and limit oversmoothing, owing to the multiscale property
of framelets. Numerical experiments on real graph datasets show that the continuous
version of framelet message passing significantly outperforms existing methods when
learning heterogeneous graphs, and achieves state-of-the-art performance on classic
node classification tasks with low computational costs.
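To make the idea concrete, below is a minimal, hedged sketch of a two-channel framelet-style update in Python. The Haar-type filter pair, the dense eigendecomposition, and the channel weights `theta_low`/`theta_high` are illustrative assumptions; the paper's transform is built from Chebyshev-approximated multiscale framelets rather than an exact spectral decomposition.

```python
# A minimal, illustrative sketch of framelet-style message passing.
# Assumptions (not from the paper): a toy two-channel Haar-type filter
# pair and a dense eigendecomposition of the normalized Laplacian.
import numpy as np

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2} for adjacency matrix A."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(d)  # assumes no isolated nodes
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def framelet_message_passing(A, X, theta_low=1.0, theta_high=0.5):
    """One update: filter node features X through a low-pass and a
    high-pass spectral channel, reweight each channel, and recombine."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)                # spectrum of L lies in [0, 2]
    xi = np.pi * lam / 2.0                    # rescale eigenvalues to [0, pi]
    g_low, g_high = np.cos(xi / 2.0), np.sin(xi / 2.0)  # Haar-type pair
    spec = U.T @ X                            # graph Fourier coefficients
    low = U @ (g_low[:, None] ** 2 * spec)    # low-pass reconstruction
    high = U @ (g_high[:, None] ** 2 * spec)  # high-pass reconstruction
    return theta_low * low + theta_high * high

# Toy usage: a 4-node path graph with 2-d features.
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
X = np.random.randn(4, 2)
X_new = framelet_message_passing(A, X)
```

Note that the squared filters sum to one (cos^2 + sin^2 = 1), so with unit weights the update reproduces the input exactly; this tight-frame property is what multiscale framelet constructions rely on for the stability behaviour the abstract mentions.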
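The continuous variant can be read as integrating the same filtered aggregation in time. The sketch below replaces a full neural ODE solver with fixed-step forward Euler and reuses `framelet_message_passing` from the sketch above; the right-hand side F(X) - X, the horizon, and the step count are all assumptions for illustration.

```python
def continuous_framelet_mp(A, X0, t_end=1.0, n_steps=20):
    """Integrate dX/dt = F(X) - X with forward Euler, where F is the
    framelet message-passing operator sketched above. A real neural ODE
    would use an adaptive solver (e.g. Dormand-Prince) instead."""
    X, dt = X0.copy(), t_end / n_steps
    for _ in range(n_steps):
        X = X + dt * (framelet_message_passing(A, X) - X)
    return X
```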
Related papers
- Towards Dynamic Message Passing on Graphs [104.06474765596687]
We propose a novel dynamic message-passing mechanism for graph neural networks (GNNs).
It projects graph nodes and learnable pseudo nodes into a common space with measurable spatial relations between them.
With nodes moving in the space, their evolving relations facilitate flexible pathway construction for a dynamic message-passing process.
arXiv Detail & Related papers (2024-10-31T07:20:40Z)
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks (a toy sampler is sketched below).
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
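As a toy illustration of the random-walk context sampling that the GSPT summary above describes (the walk length, dead-end handling, and function name are assumptions, not the paper's actual sampler):

```python
import random

def sample_walk_context(adj, start, walk_length=8, rng=random):
    """Sample one random-walk node context from an adjacency list.
    `adj` maps each node id to a list of neighbor ids."""
    walk = [start]
    for _ in range(walk_length - 1):
        nbrs = adj[walk[-1]]
        if not nbrs:          # dead end: stop the walk early
            break
        walk.append(rng.choice(nbrs))
    return walk  # the sequence fed to the Transformer as a node context

# Toy usage on a 4-node path graph 0-1-2-3.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(sample_walk_context(adj, start=0))
```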
- Multi-Scene Generalized Trajectory Global Graph Solver with Composite Nodes for Multiple Object Tracking [61.69892497726235]
Composite Node Message Passing Network (CoNo-Link) is a framework for modeling information over ultra-long frame sequences for association.
In addition to treating objects as nodes, as in previous methods, the network also treats object trajectories as nodes for information interaction.
By adding composite nodes, our model can learn better predictions on longer time scales.
arXiv Detail & Related papers (2023-12-14T14:00:30Z)
- Half-Hop: A graph upsampling approach for slowing down message passing [31.26080679115766]
We introduce a framework for improving learning in message passing neural networks.
Our approach essentially upsamples edges in the original graph by adding "slow nodes" at each edge; a minimal version of this rewiring is sketched below.
Our method only modifies the input graph, making it plug-and-play and easy to use with existing models.
arXiv Detail & Related papers (2023-08-17T22:24:15Z)
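The Half-Hop rewiring above is concrete enough to sketch: each original edge (u, v) becomes a two-hop path u - s - v through a fresh "slow node" s. A minimal version over an edge list follows; the paper's method additionally defines slow-node features and can apply the rewiring selectively, which this sketch omits.

```python
def half_hop_upsample(num_nodes, edges):
    """Insert a slow node on every edge: (u, v) becomes (u, s) and (s, v).
    Returns the new node count and the rewired edge list."""
    new_edges, next_id = [], num_nodes
    for u, v in edges:
        s = next_id            # fresh slow node for this edge
        next_id += 1
        new_edges += [(u, s), (s, v)]
    return next_id, new_edges

# Toy usage: a triangle gains one slow node per edge.
n, e = half_hop_upsample(3, [(0, 1), (1, 2), (2, 0)])
print(n, e)  # 6 nodes; each original edge is now a 2-hop path
```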
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization [70.8567058758375]
VQ-GNN is a universal framework to scale up any convolution-based GNNs using Vector Quantization (VQ) without compromising the performance.
Our framework avoids the "neighbor explosion" problem of GNNs by using quantized representations combined with a low-rank version of the graph convolution matrix; the quantization step is sketched below.
arXiv Detail & Related papers (2021-10-27T11:48:50Z)
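A minimal illustration of the vector-quantization half of the VQ-GNN idea above: each node message is replaced by its nearest codeword from a small codebook, so downstream aggregation only has to distinguish K codewords. The codebook size and plain nearest-neighbour assignment are assumptions; the actual framework also couples this with a low-rank graph convolution matrix.

```python
import numpy as np

def quantize_messages(X, codebook):
    """Assign each node feature to its nearest codeword (L2 distance)
    and return the quantized features plus the assignment indices."""
    d2 = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (N, K)
    idx = d2.argmin(axis=1)
    return codebook[idx], idx

# Toy usage: 100 nodes, 8-d features, a codebook of K=4 codewords.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
codebook = rng.normal(size=(4, 8))
Xq, idx = quantize_messages(X, codebook)
```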
- DPGNN: Dual-Perception Graph Neural Network for Representation Learning [21.432960458513826]
Graph neural networks (GNNs) have drawn increasing attention in recent years and achieved remarkable performance in many graph-based tasks.
Most existing GNNs are based on the message-passing paradigm to iteratively aggregate neighborhood information in a single topology space.
We present a novel message-passing paradigm, based on the properties of multi-step message source, node-specific message output, and multi-space message interaction.
arXiv Detail & Related papers (2021-10-15T05:47:26Z)
- GMLP: Building Scalable and Flexible Graph Neural Networks with Feature-Message Passing [16.683813354137254]
Graph Multi-layer Perceptron (GMLP) separates the neural update from the message passing; this decoupling is sketched below.
We conduct extensive evaluations on 11 benchmark datasets.
arXiv Detail & Related papers (2021-04-20T10:19:21Z)
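The feature-message-passing decoupling in the GMLP entry above can be sketched as precomputing propagated features once, after which any plain MLP can be trained without touching the graph. The hop count and row normalization below are assumptions mirroring the general decoupled-GNN recipe rather than GMLP's exact design.

```python
import numpy as np

def precompute_propagation(A, X, hops=3):
    """Feature-message passing done once, up front: repeatedly smooth X
    with the row-normalized adjacency, keeping each hop's output."""
    P = A / A.sum(axis=1, keepdims=True)  # row-stochastic; assumes no isolated nodes
    feats, cur = [X], X
    for _ in range(hops):
        cur = P @ cur
        feats.append(cur)
    return np.concatenate(feats, axis=1)  # the MLP's input; no graph needed after this

# After this, any off-the-shelf MLP classifier is trained on the
# concatenated features; the graph is never touched again at train time.
```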