Neighbour-level Message Interaction Encoding for Improved Representation Learning on Graphs
- URL: http://arxiv.org/abs/2404.09809v1
- Date: Mon, 15 Apr 2024 14:07:33 GMT
- Title: Neighbour-level Message Interaction Encoding for Improved Representation Learning on Graphs
- Authors: Haimin Zhang, Min Xu
- Abstract summary: We propose a neighbour-level message interaction information encoding method for improving graph representation learning.
The proposed encoding method is a generic method which can be integrated into message-passing graph convolutional networks.
- Score: 13.83680253264399
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Message passing has become the dominant framework in graph representation learning. The essential idea of the message-passing framework is to update node embeddings based on the information aggregated from local neighbours. However, most existing aggregation methods do not encode neighbour-level message interactions into the aggregated message, resulting in a loss of information during embedding generation. This information loss can accumulate and becomes more severe as more layers are added to the graph network model. To address this issue, we propose a neighbour-level message interaction information encoding method for improving graph representation learning. For the messages aggregated at a node, we explicitly generate an encoding between each message and the remaining messages using an encoding function. We then aggregate these learned encodings and take the sum of the aggregated encoding and the aggregated message to update the embedding for the node. In this way, neighbour-level message interaction information is integrated into the generated node embeddings. The proposed encoding method is generic and can be integrated into message-passing graph convolutional networks. Extensive experiments are conducted on six popular benchmark datasets across four widely studied tasks. The results show that integrating neighbour-level message interactions improves the performance of the base models, advancing the state of the art for representation learning over graphs.
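To make the aggregation step concrete, here is a minimal sketch of the encoding described in the abstract, assuming a sum aggregator and a small MLP as the encoding function; the class, method, and parameter names are illustrative assumptions, not taken from the authors' released code.

```python
import torch
import torch.nn as nn

class NeighbourInteractionEncoding(nn.Module):
    # Sketch of neighbour-level message interaction encoding: for each
    # incoming message m_i, an encoding is produced between m_i and the
    # rest of the messages, and the aggregated encodings are added to
    # the aggregated message. The MLP encoder and sum aggregator are
    # assumptions for illustration.
    def __init__(self, dim: int):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Linear(2 * dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, messages: torch.Tensor) -> torch.Tensor:
        # messages: (k, dim) tensor of the k messages arriving at one node.
        aggregated = messages.sum(dim=0)            # standard aggregated message
        rest = aggregated.unsqueeze(0) - messages   # sum of the other k-1 messages
        pair = torch.cat([messages, rest], dim=-1)  # (m_i, rest_i) pairs
        encodings = self.encode(pair)               # per-message interaction encodings
        # Node update: aggregated message + aggregated interaction encoding.
        return aggregated + encodings.sum(dim=0)
```

In a full model this module would replace the plain sum inside each message-passing layer. Representing the "rest" of the messages as the total minus each individual message keeps the cost linear in the number of neighbours rather than quadratic, which is one reasonable way to realise the pairwise encoding the abstract describes.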
Related papers
- Differential Encoding for Improved Representation Learning over Graphs [15.791455338513815]
Both the message-passing paradigm and the global attention mechanism generate node embeddings.
It is unknown whether the dominant information comes from a node itself or from the node's neighbours.
We present a differential encoding method to address this issue of information loss.
arXiv Detail & Related papers (2024-07-03T02:23:33Z)
- Bridging Local Details and Global Context in Text-Attributed Graphs [62.522550655068336]
GraphBridge is a framework that bridges local and global perspectives by leveraging contextual textual information.
Our method achieves state-of-the-art performance, while our graph-aware token reduction module significantly enhances efficiency and solves scalability issues.
arXiv Detail & Related papers (2024-06-18T13:35:25Z)
- Hierarchical Compression of Text-Rich Graphs via Large Language Models [63.75293588479027]
Text-rich graphs are prevalent in data mining contexts like e-commerce and academic graphs.
This paper introduces "Hierarchical Compression" (HiCom), a novel method to align the capabilities of LLMs with the structure of text-rich graphs.
HiCom can outperform both GNNs and LLM backbones for node classification on e-commerce and citation graphs.
arXiv Detail & Related papers (2024-06-13T07:24:46Z)
- GGNNs: Generalizing GNNs using Residual Connections and Weighted Message Passing [0.0]
GNNs excel at capturing relationships and patterns within graphs, enabling effective learning and prediction tasks.
The generalizing power of GNNs is commonly attributed to the message-passing mechanism between layers.
Our technique builds on these results, modifying the message-passing mechanism in two further ways: by weighting the messages before accumulating them at each node, and by adding residual connections (a minimal sketch of these two modifications follows the related-papers list below).
arXiv Detail & Related papers (2023-11-26T22:22:38Z)
- Conversational Semantic Parsing using Dynamic Context Graphs [68.72121830563906]
We consider the task of conversational semantic parsing over general-purpose knowledge graphs (KGs) with millions of entities and thousands of relation types.
We focus on models which are capable of interactively mapping user utterances into executable logical forms.
arXiv Detail & Related papers (2023-05-04T16:04:41Z)
- GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph [53.70520466556453]
We propose GraphFormers, where layerwise GNN components are nested alongside the transformer blocks of language models.
With the proposed architecture, the text encoding and the graph aggregation are fused into an iterative workflow.
In addition, a progressive learning strategy is introduced, where the model is successively trained on manipulated data and original data to reinforce its capability of integrating information on the graph.
arXiv Detail & Related papers (2021-05-06T12:20:41Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks for learning over raw text with guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- Modeling Global and Local Node Contexts for Text Generation from Knowledge Graphs [63.12058935995516]
Recent graph-to-text models generate text from graph-based data using either global or local aggregation.
We propose novel neural models which encode an input graph combining both global and local node contexts.
Our approaches lead to significant improvements on two graph-to-text datasets.
arXiv Detail & Related papers (2020-01-29T18:24:14Z)
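As referenced in the GGNNs entry above, the following is a minimal sketch of weighted message passing with a residual connection. The sigmoid-gated per-edge weight, sum aggregation, and all names here are assumptions made for illustration, not the GGNNs paper's actual implementation.

```python
import torch
import torch.nn as nn

class WeightedResidualLayer(nn.Module):
    # Sketch of the two modifications summarised in the GGNNs entry:
    # messages are weighted before accumulation at each node, and a
    # residual connection is added around the aggregation.
    def __init__(self, dim: int):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        # Hypothetical scalar edge weight from sender/receiver features.
        self.weight = nn.Linear(2 * dim, 1)

    def forward(self, h: torch.Tensor, edges: torch.Tensor) -> torch.Tensor:
        # h: (n, dim) node features; edges: (m, 2) long tensor of [src, dst] pairs.
        src, dst = edges[:, 0], edges[:, 1]
        msg = self.transform(h[src])                            # raw messages
        w = torch.sigmoid(self.weight(torch.cat([h[src], h[dst]], dim=-1)))
        agg = torch.zeros_like(h).index_add_(0, dst, w * msg)   # weighted accumulation
        return h + torch.relu(agg)                              # residual connection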