CoRGi: Content-Rich Graph Neural Networks with Attention
- URL: http://arxiv.org/abs/2110.04866v1
- Date: Sun, 10 Oct 2021 17:54:30 GMT
- Title: CoRGi: Content-Rich Graph Neural Networks with Attention
- Authors: Jooyeon Kim, Angus Lamb, Simon Woodhead, Simon Peyton Jones, Cheng
Zheng, Miltiadis Allamanis
- Abstract summary: We present CoRGi, a graph neural network that considers the rich data within nodes in the context of their neighbors.
This is achieved by endowing CoRGi's message passing with a personalized attention mechanism over the content of each node.
We show that CoRGi makes better edge-value predictions than existing methods, especially on sparse regions of the graph.
- Score: 12.339385456449659
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph representations of a target domain often project it to a set of
entities (nodes) and their relations (edges). However, such projections often
miss important and rich information. For example, in graph representations used
in missing value imputation, items - represented as nodes - may contain rich
textual information. However, when processing graphs with graph neural networks
(GNNs), such information is either ignored or summarized into a single vector
representation used to initialize the GNN. Towards addressing this, we present
CoRGi, a GNN that considers the rich data within nodes in the context of their
neighbors. This is achieved by endowing CoRGi's message passing with a
personalized attention mechanism over the content of each node. This way, CoRGi
assigns user-item-specific attention scores with respect to the words that
appear in an item's content. We evaluate CoRGi on two edge-value prediction
tasks and show that CoRGi makes better edge-value predictions than existing
methods, especially on sparse regions of the graph.
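The abstract describes message passing equipped with a personalized, user-item-specific attention over the words of an item's content. Below is a minimal, hypothetical sketch of such a content-attention step in PyTorch; the function name, the projection matrices, and the scaled dot-product form are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def content_attention_message(user_vec, item_word_vecs, W_q, W_k):
    """Sketch of a user-item-specific attention over an item's content words.

    user_vec:       (d,)   embedding of the user node receiving the message
    item_word_vecs: (n, d) embeddings of the n words in the item's content
    W_q, W_k:       (d, d) projection matrices (assumed here for illustration)
    """
    query = user_vec @ W_q                          # (d,)
    keys = item_word_vecs @ W_k                     # (n, d)
    scores = keys @ query / keys.shape[-1] ** 0.5   # (n,) one score per word, specific to this user
    attn = F.softmax(scores, dim=-1)                # attention distribution over the item's words
    return attn @ item_word_vecs                    # (d,) content-aware message sent to the user

# toy usage
d, n = 16, 5
user, words = torch.randn(d), torch.randn(n, d)
Wq, Wk = torch.randn(d, d), torch.randn(d, d)
print(content_attention_message(user, words, Wq, Wk).shape)  # torch.Size([16])
```

Because the attention distribution depends on the user's embedding, two different users connected to the same item aggregate its content differently, which is what makes the scores personalized.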
Related papers
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Structure Enhanced Graph Neural Networks for Link Prediction [6.872826041648584]
We propose Structure Enhanced Graph neural network (SEG) for link prediction.
SEG incorporates surrounding topological information of target nodes into an ordinary GNN model.
Experiments on the OGB link prediction datasets demonstrate that SEG achieves state-of-the-art results.
arXiv Detail & Related papers (2022-01-14T03:49:30Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs the edges may be directed, and whether to treat the edges as directed or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Hierarchical graph neural nets can capture long-range interactions [8.067880298298185]
We study hierarchical message passing models that leverage a multi-resolution representation of a given graph.
This facilitates learning of features that span large receptive fields without loss of local information.
We introduce Hierarchical Graph Net (HGNet), which guarantees, for any two connected nodes, the existence of message-passing paths of at most logarithmic length.
arXiv Detail & Related papers (2021-07-15T16:24:22Z) - Graph Attention Networks with Positional Embeddings [7.552100672006174]
Graph Neural Networks (GNNs) are deep learning methods that provide the current state-of-the-art performance in node classification tasks.
We propose a framework, termed Graph Attentional Networks with Positional Embeddings (GAT-POS), to enhance GATs with positional embeddings.
We show that GAT-POS achieves remarkable improvements over strong GNN baselines and recent structural-embedding-enhanced GNNs on non-homophilic graphs.
arXiv Detail & Related papers (2021-05-09T22:13:46Z) - Factorizable Graph Convolutional Networks [90.59836684458905]
We introduce FactorGCN, a graph convolutional network (GCN) that explicitly disentangles the intertwined relations encoded in a graph.
FactorGCN takes a simple graph as input and disentangles it into several factorized graphs.
We evaluate the proposed FactorGCN both qualitatively and quantitatively on synthetic and real-world datasets.
arXiv Detail & Related papers (2020-10-12T03:01:40Z) - Higher-Order Explanations of Graph Neural Networks via Relevant Walks [3.1510406584101776]
Graph Neural Networks (GNNs) are a popular approach for predicting graph structured data.
In this paper, we show that GNNs can in fact be naturally explained using higher-order expansions.
We extract practically relevant insights on sentiment analysis of text data, structure-property relationships in quantum chemistry, and image classification.
arXiv Detail & Related papers (2020-06-05T17:59:14Z) - Iterative Context-Aware Graph Inference for Visual Dialog [126.016187323249]
We propose a novel Context-Aware Graph (CAG) neural network.
Each node in the graph corresponds to a joint semantic feature, including both object-based (visual) and history-related (textual) context representations.
arXiv Detail & Related papers (2020-04-05T13:09:37Z) - Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z) - EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs); a minimal sketch of this edge-varying operation is given below.
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
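To make the EdgeNet summary above concrete, the following is a hypothetical PyTorch illustration (not the paper's code) of one edge-varying aggregation step in which every edge carries its own parameter, so each node weighs each neighbor differently.

```python
import torch

def edge_varying_aggregation(x, edge_index, edge_weight):
    """One edge-varying (EdgeNet-style) aggregation step.

    x:           (N, d) node features
    edge_index:  (2, E) source/target node indices of directed edges
    edge_weight: (E,)   one parameter per edge (assumed scalar here for simplicity)
    """
    src, dst = edge_index
    messages = edge_weight.unsqueeze(-1) * x[src]   # scale each neighbor's features by that edge's own weight
    out = torch.zeros_like(x)
    out.index_add_(0, dst, messages)                # sum the weighted messages arriving at each node
    return out

# toy usage: 3 nodes, 3 directed edges forming a cycle
x = torch.randn(3, 4)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])
edge_weight = torch.nn.Parameter(torch.randn(3))
h = edge_varying_aggregation(x, edge_index, edge_weight)
```

If the per-edge weights are fixed to normalized adjacency entries this reduces to an ordinary graph convolution, and if they are produced by an attention function of the endpoint features it resembles a GAT layer, which is the sense in which the EdgeNet formulation encompasses both.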
This list is automatically generated from the titles and abstracts of the papers listed on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.