Differential Encoding for Improved Representation Learning over Graphs
- URL: http://arxiv.org/abs/2407.02758v1
- Date: Wed, 3 Jul 2024 02:23:33 GMT
- Title: Differential Encoding for Improved Representation Learning over Graphs
- Authors: Haimin Zhang, Jiahao Xia, Min Xu
- Abstract summary: The message-passing paradigm and the global attention mechanism fundamentally generate node embeddings from aggregated information.
With sum aggregation, it is unknown whether the dominant information comes from a node itself or from the node's neighbours.
We present a differential encoding method to address this issue of information loss.
- Score: 15.791455338513815
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Combining the message-passing paradigm with the global attention mechanism has emerged as an effective framework for learning over graphs. Both the message-passing paradigm and the global attention mechanism fundamentally generate node embeddings from information aggregated over a node's local neighbourhood or over the whole graph. The most basic and commonly used aggregation approach is to take the sum of the information from a node's local neighbourhood or from the whole graph. However, it is unknown whether the dominant information comes from the node itself or from its neighbours (or the rest of the graph nodes). Therefore, information is lost at each layer of embedding generation, and this loss can accumulate and become more severe as more layers are used in the model. In this paper, we present a differential encoding method to address this issue of information loss. The idea of our method is to encode the differential representation between the information from a node's neighbours (or from the rest of the graph nodes) and the information from the node itself. The obtained differential encoding is then combined with the original aggregated local or global representation to generate the updated node embedding. By integrating differential encodings, the representational ability of the generated node embeddings is improved. The differential encoding method is empirically evaluated on different graph tasks across seven benchmark datasets. The results show that it is a general method that improves both the message-passing update and the global-attention update, advancing the state-of-the-art performance for graph representation learning on these datasets.
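The abstract describes the update concretely enough for a rough rendering: aggregate neighbour information, encode the difference between that aggregate and the node's own representation, and fold the encoding back into the update. Below is a minimal PyTorch sketch of that reading; the two-layer MLP encoder, the concatenation-based update, and the additive combination are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn


class DifferentialEncodingLayer(nn.Module):
    """Message-passing layer augmented with a differential encoding.

    The sum-aggregated neighbour message is compared with the node's own
    representation; their difference is encoded by a small MLP and added
    back into the update (a sketch of the idea in the abstract, not the
    authors' exact architecture).
    """

    def __init__(self, dim: int):
        super().__init__()
        self.update = nn.Linear(2 * dim, dim)  # combines self + aggregate
        self.diff_encoder = nn.Sequential(     # encodes (aggregate - self)
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim) features; adj: (num_nodes, num_nodes) 0/1 matrix
        agg = adj @ x                          # sum over each node's neighbours
        diff = self.diff_encoder(agg - x)      # the differential encoding
        return torch.relu(self.update(torch.cat([x, agg], dim=-1)) + diff)


# Toy usage: a 4-node path graph with 8-dimensional features.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
layer = DifferentialEncodingLayer(dim=8)
out = layer(torch.randn(4, 8), adj)            # -> shape (4, 8)
```

Per the abstract, the same idea applies to the global-attention update by replacing the neighbour aggregate with an attention-weighted aggregate over all nodes.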
Related papers
- Federated Graph Semantic and Structural Learning [54.97668931176513]
This paper reveals that local client distortion arises from both node-level semantics and graph-level structure.
We postulate that a well-structured graph neural network produces similar representations for neighbouring nodes due to the inherent adjacency relationships.
We transform the adjacency relationships into a similarity distribution and leverage the global model to distill this relational knowledge into the local model (one plausible reading is sketched below).
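The summary only gestures at the mechanism, so the snippet below is speculative: each node's pairwise similarities are normalised into a distribution, and the local (client) model is penalised for diverging from the frozen global model's distribution. The cosine similarity, temperature, and KL direction are all assumptions.

```python
import torch
import torch.nn.functional as F


def relation_distillation_loss(local_emb: torch.Tensor,
                               global_emb: torch.Tensor,
                               temperature: float = 1.0) -> torch.Tensor:
    """KL distillation of pairwise node-similarity distributions.

    A speculative reading of "transform the adjacency relationships into
    the similarity distribution": each node's row of cosine similarities
    is softmax-normalised, and the local model is pulled toward the
    (frozen) global model's distribution.
    """
    def sim_rows(emb):
        z = F.normalize(emb, dim=-1)
        return (z @ z.t()) / temperature           # pairwise cosine similarity

    p_global = F.softmax(sim_rows(global_emb).detach(), dim=-1)  # teacher
    log_q_local = F.log_softmax(sim_rows(local_emb), dim=-1)     # student
    return F.kl_div(log_q_local, p_global, reduction="batchmean")
```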
arXiv Detail & Related papers (2024-06-27T07:08:28Z)
- Neighbour-level Message Interaction Encoding for Improved Representation Learning on Graphs [13.83680253264399]
We propose a neighbour-level message interaction information encoding method for improving graph representation learning.
The proposed encoding method is generic and can be integrated into message-passing graph convolutional networks.
arXiv Detail & Related papers (2024-04-15T14:07:33Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z)
- Local Structure-aware Graph Contrastive Representation Learning [12.554113138406688]
We propose a Local Structure-aware Graph Contrastive representation Learning method (LS-GCL) to model the structural information of nodes from multiple views.
For the local view, the semantic subgraph of each target node is input into a shared GNN encoder to obtain the target node embeddings at the subgraph-level.
For the global view, since the original graph preserves indispensable semantic information about nodes, we leverage the shared GNN encoder to learn the target node embeddings at the global graph level (a contrastive pairing of the two views is sketched below).
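The two views suggest a standard contrastive pairing of each target node's subgraph-level and graph-level embeddings. The InfoNCE-style loss below is an assumed form; the summary states only that both views come from a shared GNN encoder.

```python
import torch
import torch.nn.functional as F


def multiview_contrastive_loss(z_local: torch.Tensor,
                               z_global: torch.Tensor,
                               temperature: float = 0.5) -> torch.Tensor:
    """InfoNCE-style loss pairing each node's subgraph-level embedding with
    its graph-level embedding (the loss form is an assumption, not taken
    from the paper).
    """
    z1 = F.normalize(z_local, dim=-1)
    z2 = F.normalize(z_global, dim=-1)
    logits = (z1 @ z2.t()) / temperature   # similarities of all node pairs
    labels = torch.arange(z1.size(0))      # matching pairs on the diagonal
    return F.cross_entropy(logits, labels)
```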
arXiv Detail & Related papers (2023-08-07T03:23:46Z)
- StarGraph: A Coarse-to-Fine Representation Method for Large-Scale Knowledge Graph [0.6445605125467573]
We propose a method named StarGraph, which gives a novel way to utilize the neighborhood information for large-scale knowledge graphs.
The proposed method achieves the best results on the ogbl-wikikg2 dataset, validating its effectiveness.
arXiv Detail & Related papers (2022-05-27T19:32:45Z)
- LiftPool: Lifting-based Graph Pooling for Hierarchical Graph Representation Learning [53.176603566951016]
We propose an enhanced three-stage method via lifting, named LiftPool, to improve hierarchical graph representation.
For each node to be removed, its local information is obtained by subtracting the global information aggregated from its preserved neighbouring nodes (see the sketch below).
Evaluations on benchmark graph datasets show that LiftPool substantially outperforms the state-of-the-art graph pooling methods in the task of graph classification.
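The subtraction step described above is concrete enough for a small sketch; mean aggregation over the preserved neighbours is an assumption, since the summary does not name the aggregator.

```python
import torch


def lift_local_information(x: torch.Tensor,
                           adj: torch.Tensor,
                           preserved: torch.Tensor) -> torch.Tensor:
    """Lifting step sketch: for each node, subtract the information
    aggregated from its preserved neighbours to isolate its local
    component (mean aggregation is an assumption here).

    x: (N, d) node features; adj: (N, N) 0/1 adjacency;
    preserved: (N,) boolean mask of nodes kept by the pooling step.
    """
    adj_preserved = adj * preserved.float().unsqueeze(0)  # edges to kept nodes
    deg = adj_preserved.sum(dim=-1, keepdim=True).clamp(min=1.0)
    aggregated = (adj_preserved @ x) / deg   # mean over preserved neighbours
    return x - aggregated                    # local (detail) information
```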
arXiv Detail & Related papers (2022-04-27T12:38:02Z)
- Node-wise Localization of Graph Neural Networks [52.04194209002702]
Graph neural networks (GNNs) have emerged as a powerful family of representation learning models on graphs.
We propose a node-wise localization of GNNs by accounting for both global and local aspects of the graph.
We conduct extensive experiments on four benchmark graphs, and consistently obtain promising performance surpassing the state-of-the-art GNNs.
arXiv Detail & Related papers (2021-10-27T10:02:03Z)
- Locality Preserving Dense Graph Convolutional Networks with Graph Context-Aware Node Representations [19.623379678611744]
Graph convolutional networks (GCNs) have been widely used for representation learning on graph data.
In many graph classification applications, GCN-based approaches have outperformed traditional methods.
We propose a locality-preserving dense GCN with graph context-aware node representations.
arXiv Detail & Related papers (2020-10-12T02:12:27Z)
- Graph InfoClust: Leveraging cluster-level node information for unsupervised graph representation learning [12.592903558338444]
We propose a graph representation learning method called Graph InfoClust.
It additionally seeks to capture cluster-level information content.
This optimization leads the node representations to capture richer information and nodal interactions, which improves their quality.
arXiv Detail & Related papers (2020-09-15T09:33:20Z)
- Modeling Global and Local Node Contexts for Text Generation from Knowledge Graphs [63.12058935995516]
Recent graph-to-text models generate text from graph-based data using either global or local aggregation.
We propose novel neural models which encode an input graph combining both global and local node contexts.
Our approaches lead to significant improvements on two graph-to-text datasets.
arXiv Detail & Related papers (2020-01-29T18:24:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.