GIPA++: A General Information Propagation Algorithm for Graph Learning
- URL: http://arxiv.org/abs/2301.08209v1
- Date: Thu, 19 Jan 2023 18:00:51 GMT
- Title: GIPA++: A General Information Propagation Algorithm for Graph Learning
- Authors: Houyi Li, Zhihong Chen, Zhao Li, Qinkai Zheng, Peng Zhang, Shuigeng
Zhou
- Abstract summary: Graph neural networks (GNNs) have been widely used in graph-structured data computation.
We propose a General Information Propagation Algorithm (GIPA) to exploit more fine-grained information fusion.
GIPA exploits bit-wise and feature-wise correlations based on edge features in its propagation.
- Score: 34.0393139910052
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have been widely used in graph-structured data
computation, showing promising performance in various applications such as node
classification, link prediction, and network recommendation. Existing works
mainly focus on node-wise correlation when doing weighted aggregation of
neighboring nodes based on attention, such as the dot product of the dense
vectors of two nodes. This may propagate conflicting noise from nodes during
information propagation. To solve this problem, we propose a General
Information Propagation Algorithm (GIPA for short), which exploits more
fine-grained information fusion, including bit-wise and feature-wise
correlations based on edge features, in its propagation. Specifically, the
bit-wise correlation calculates element-wise attention weights through a
multi-layer perceptron (MLP) based on the dense representations of two nodes
and their edge; the feature-wise correlation is based on the one-hot
representations of node attribute features for feature selection. We evaluate
the performance of GIPA on the Open Graph Benchmark proteins (OGBN-proteins for
short) dataset and the Alipay dataset of Alibaba. Experimental results reveal
that GIPA outperforms the state-of-the-art models in terms of prediction
accuracy, e.g., GIPA achieves an average ROC-AUC of $0.8901\pm 0.0011$, which
is better than that of all the existing methods listed in the OGBN-proteins
leaderboard.
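The bit-wise correlation described above can be illustrated with a small sketch: an MLP maps the concatenated source-node, destination-node, and edge representations to one attention weight per embedding dimension, so each "bit" of the propagated message is scaled individually. This is a minimal NumPy illustration under assumed shapes and a two-layer MLP; the function names (`mlp`, `bitwise_attention`) and the dimensions are hypothetical and do not reproduce the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    # Two-layer perceptron with a ReLU hidden activation.
    h = np.maximum(x @ w1 + b1, 0.0)
    return h @ w2 + b2

def bitwise_attention(h_src, h_dst, h_edge, params):
    # Element-wise ("bit-wise") attention: the MLP maps the concatenated
    # source, destination, and edge representations to one weight per
    # dimension of the message; a sigmoid squashes each weight into (0, 1).
    z = np.concatenate([h_src, h_dst, h_edge])
    logits = mlp(z, *params)
    return 1.0 / (1.0 + np.exp(-logits))

d = 4        # embedding dimension (illustrative)
hidden = 8   # MLP hidden width (illustrative)
params = (
    rng.normal(size=(3 * d, hidden)), np.zeros(hidden),
    rng.normal(size=(hidden, d)), np.zeros(d),
)

h_src = rng.normal(size=d)
h_dst = rng.normal(size=d)
h_edge = rng.normal(size=d)

w = bitwise_attention(h_src, h_dst, h_edge, params)
message = w * h_src  # each dimension of the message is weighted separately
```

In contrast to node-wise dot-product attention, which produces a single scalar per neighbor, this produces a vector of per-dimension weights, which is what allows conflicting components of a neighbor's representation to be suppressed individually.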
Related papers
- Cluster-based Graph Collaborative Filtering [55.929052969825825]
Graph Convolution Networks (GCNs) have succeeded in learning user and item representations for recommendation systems.
Most existing GCN-based methods overlook the multiple interests of users while performing high-order graph convolution.
We propose a novel GCN-based recommendation model, termed Cluster-based Graph Collaborative Filtering (ClusterGCF)
arXiv Detail & Related papers (2024-04-16T07:05:16Z) - Distributed Learning over Networks with Graph-Attention-Based
Personalization [49.90052709285814]
We propose a graph-based personalized algorithm (GATTA) for distributed deep learning.
In particular, the personalized model in each agent is composed of a global part and a node-specific part.
By treating each agent as one node in a graph and the node-specific parameters as its features, the benefits of the graph attention mechanism can be inherited.
arXiv Detail & Related papers (2023-05-22T13:48:30Z) - Graph Ordering Attention Networks [22.468776559433614]
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data.
We introduce the Graph Ordering Attention (GOAT) layer, a novel GNN component that captures interactions between nodes in a neighborhood.
The GOAT layer demonstrates increased performance in modeling graph metrics that capture complex information.
arXiv Detail & Related papers (2022-04-11T18:13:19Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with
Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised
Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve the performance of semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z) - On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC) metric, a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z) - GIPA: General Information Propagation Algorithm for Graph Learning [3.228614352581043]
We present a new graph attention neural network, namely GIPA, for attributed graph data learning.
GIPA consists of three key components: attention, feature propagation and aggregation.
We evaluate the performance of GIPA using the Open Graph Benchmark proteins dataset.
arXiv Detail & Related papers (2021-05-13T01:50:43Z) - Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between LPA and GCN in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models.
arXiv Detail & Related papers (2020-02-17T03:23:13Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.