Two-view Graph Neural Networks for Knowledge Graph Completion
- URL: http://arxiv.org/abs/2112.09231v1
- Date: Thu, 16 Dec 2021 22:36:17 GMT
- Title: Two-view Graph Neural Networks for Knowledge Graph Completion
- Authors: Vinh Tong and Dai Quoc Nguyen and Dinh Phung and Dat Quoc Nguyen
- Abstract summary: We introduce a novel GNN-based knowledge graph embedding model, named WGE, to capture entity-focused graph structure and relation-focused graph structure.
WGE obtains state-of-the-art performance on the three new and challenging CoDEx benchmark datasets for knowledge graph completion.
- Score: 13.934907240846197
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we introduce a novel GNN-based knowledge graph embedding model, named WGE, to capture both entity-focused graph structure and relation-focused graph structure. In particular, given a knowledge graph, WGE builds a single undirected entity-focused graph that views entities as nodes. In addition, WGE constructs another single undirected graph from relation-focused constraints, which views both entities and relations as nodes. WGE then employs a new architecture that applies two vanilla GNNs directly to these two single graphs to better update the vector representations of entities and relations, followed by a weighted score function that returns the triple scores. Experimental results show that WGE obtains state-of-the-art performance on the three new and challenging CoDEx benchmark datasets for knowledge graph completion.
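A minimal sketch of this two-view encoder-decoder pattern is given below. The GCN-style layer, the DistMult-style per-view decoder, the learned mixing weight `alpha`, and the adjacency inputs `a_ent` / `a_joint` are all illustrative assumptions; they stand in for, but do not reproduce, WGE's exact graph construction and weighted score function.

```python
# Hedged sketch of a two-view GNN knowledge graph embedding model.
# Layer choices, decoder, and weighting are illustrative assumptions, not WGE's exact design.
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """Vanilla GCN-style layer: H' = tanh(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return torch.tanh(self.linear(a_hat @ h))

class TwoViewKGE(nn.Module):
    """Two GNN views (entity-focused and relation-focused) plus a weighted decoder."""
    def __init__(self, num_entities, num_relations, dim):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)      # entity embeddings
        self.rel = nn.Embedding(num_relations, dim)     # relation embeddings
        self.gnn_ent_view = SimpleGCNLayer(dim, dim)    # GNN on the entity-focused graph
        self.gnn_rel_view = SimpleGCNLayer(dim, dim)    # GNN on the relation-focused graph
        self.alpha = nn.Parameter(torch.tensor(0.5))    # weight combining the two views

    def forward(self, a_ent, a_joint, heads, rels, tails):
        # View 1: a single undirected graph whose nodes are entities only.
        e1 = self.gnn_ent_view(a_ent, self.ent.weight)
        # View 2: a single undirected graph whose nodes are entities AND relations.
        joint = torch.cat([self.ent.weight, self.rel.weight], dim=0)
        h2 = self.gnn_rel_view(a_joint, joint)
        e2, r2 = h2[: self.ent.num_embeddings], h2[self.ent.num_embeddings :]
        # Stand-in DistMult-style decoder per view, combined by a learned weight.
        s1 = (e1[heads] * self.rel(rels) * e1[tails]).sum(-1)
        s2 = (e2[heads] * r2[rels] * e2[tails]).sum(-1)
        return self.alpha * s1 + (1.0 - self.alpha) * s2
```

Here `a_ent` and `a_joint` would be normalized adjacency matrices built from the training triples; how WGE actually derives them from entity-focused and relation-focused constraints is specified in the paper.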
Related papers
- Graph External Attention Enhanced Transformer [20.44782028691701]
We propose Graph External Attention (GEA) -- a novel attention mechanism that leverages multiple external node/edge key-value units to capture inter-graph correlations implicitly.
On this basis, we design an effective architecture called the Graph External Attention Enhanced Transformer (GEAET).
Experiments on benchmark datasets demonstrate that GEAET achieves state-of-the-art empirical performance.
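The "external key-value units" mentioned above suggest attention over a memory shared across graphs; the sketch below is a hedged, minimal rendering of that idea (unit count, scaling, and residual update are assumptions, not the GEAET design).

```python
# Hedged sketch: attention over a learnable key-value memory shared across graphs
# (an illustrative assumption, not the actual GEA/GEAET implementation).
import torch
import torch.nn as nn

class ExternalAttention(nn.Module):
    """Attention over a learnable key-value memory shared across all graphs."""
    def __init__(self, dim, num_units=64):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(num_units, dim))    # external keys
        self.values = nn.Parameter(torch.randn(num_units, dim))  # external values

    def forward(self, x):
        # x: (num_nodes, dim) node features of one graph in the batch.
        attn = torch.softmax(x @ self.keys.t() / x.shape[-1] ** 0.5, dim=-1)
        # Because the keys/values are shared across graphs, the update can carry
        # implicit inter-graph correlations back into each graph's nodes.
        return x + attn @ self.values
```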
arXiv Detail & Related papers (2024-05-31T17:50:27Z)
- Relating-Up: Advancing Graph Neural Networks through Inter-Graph Relationships [17.978546172777342]
Graph Neural Networks (GNNs) have excelled in learning from graph-structured data.
Despite their successes, GNNs are limited by neglecting the context of relationships across graphs.
We introduce Relating-Up, a plug-and-play module that enhances GNNs by exploiting inter-graph relationships.
arXiv Detail & Related papers (2024-05-07T02:16:54Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
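The encoder described above (graph convolutions plus self-attention inside a Transformer) can be sketched as a block that sums a local, adjacency-driven update with a global all-pairs attention update; the fusion-by-addition and layer sizes here are assumptions, not the published GTGAN layer.

```python
# Hedged sketch: one block mixing a graph convolution (local) with self-attention (global).
# Illustrative combination only; not the actual GTGAN encoder.
import torch
import torch.nn as nn

class ConvAttentionBlock(nn.Module):
    """Sums a graph-convolution update (local) with self-attention (global)."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.conv = nn.Linear(dim, dim)                    # weight for A_hat @ H @ W
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, a_hat, h):
        # a_hat: (N, N) normalized adjacency, h: (N, dim) node features.
        local = torch.relu(self.conv(a_hat @ h))           # structure-aware messages
        glob, _ = self.attn(h[None], h[None], h[None])     # all-pairs interactions
        return self.norm(h + local + glob.squeeze(0))      # fuse local and global updates
```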
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- Graph Transformer GANs for Graph-Constrained House Generation [223.739067413952]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The GTGAN learns effective graph node relations in an end-to-end fashion for the challenging graph-constrained house generation task.
arXiv Detail & Related papers (2023-03-14T20:35:45Z)
- GraphDCA -- a Framework for Node Distribution Comparison in Real and Synthetic Graphs [72.51835626235368]
We argue that when comparing two graphs, the distribution of node structural features is more informative than global graph statistics.
We present GraphDCA - a framework for evaluating similarity between graphs based on the alignment of their respective node representation sets.
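As a toy illustration of comparing graphs through their node representation sets rather than global statistics, the snippet below uses a symmetric nearest-neighbour (Chamfer-style) distance between two sets of per-node descriptors; GraphDCA's actual alignment procedure is not reproduced here.

```python
# Hedged sketch: compare two graphs via their node representation sets
# (greedy nearest-neighbour matching is an illustrative stand-in, not GraphDCA's method).
import numpy as np

def node_set_distance(x_a: np.ndarray, x_b: np.ndarray) -> float:
    """x_a: (n_a, d) and x_b: (n_b, d) per-node structural descriptors."""
    # Pairwise Euclidean distances between the two node sets.
    d = np.linalg.norm(x_a[:, None, :] - x_b[None, :, :], axis=-1)
    # Average distance to the nearest counterpart, taken in both directions.
    return float(d.min(axis=1).mean() + d.min(axis=0).mean())
```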
arXiv Detail & Related papers (2022-02-08T14:19:19Z)
- Node Co-occurrence based Graph Neural Networks for Knowledge Graph Link Prediction [13.934907240846197]
NoGE integrates co-occurrence among entities and relations into graph neural networks to improve knowledge graph completion.
NoGE obtains state-of-the-art results on the three new and challenging CoDEx benchmark datasets for knowledge graph completion.
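A minimal sketch of one way to turn triple-level co-occurrence into edge weights for a single graph over entity and relation nodes is shown below; the counting and normalization are illustrative assumptions and do not reproduce the paper's exact weighting scheme.

```python
# Hedged sketch: derive co-occurrence-based edge weights from triples for a graph
# whose nodes are entities and relations (illustrative only; not the paper's scheme).
from collections import Counter

def cooccurrence_weights(triples):
    """triples: iterable of (head, relation, tail) ids; returns edge -> weight.
    Assumes relation ids are disjoint from entity ids (e.g., offset by #entities)."""
    counts = Counter()
    for h, r, t in triples:
        counts[(h, t)] += 1   # entity-entity co-occurrence within a triple
        counts[(h, r)] += 1   # entity-relation co-occurrence
        counts[(r, t)] += 1
    total = sum(counts.values())
    # Normalized counts can serve as weights of an undirected adjacency over
    # entity and relation nodes, fed to a GNN encoder.
    return {pair: c / total for pair, c in counts.items()}
```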
arXiv Detail & Related papers (2021-04-15T11:51:52Z)
- Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB) principle, the Graph Information Bottleneck (GIB) aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
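The "minimal sufficient representation" phrasing above corresponds to the standard information-bottleneck trade-off; the schematic objective below uses generic notation (input graph data $D$, representation $Z$, target $Y$, trade-off weight $\beta$) and is not the paper's exact variational formulation.

```latex
% Schematic IB-style objective: keep information about the target Y while
% compressing away the rest of the input graph data D (structure + features).
\max_{p(Z \mid D)} \; I(Z; Y) - \beta \, I(Z; D)
```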
arXiv Detail & Related papers (2020-10-24T07:13:00Z)
- Bridging Knowledge Graphs to Generate Scene Graphs [49.69377653925448]
We propose a novel graph-based neural network that iteratively propagates information between the two graphs, as well as within each of them.
Our Graph Bridging Network, GB-Net, successively infers edges and nodes, allowing it to simultaneously exploit and refine the rich, heterogeneous structure of the interconnected scene and commonsense graphs.
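A schematic sketch of iterative propagation within and between two graphs connected by bridge edges is given below; the shared linear message functions and tanh updates are placeholders, not GB-Net's actual update equations.

```python
# Hedged sketch: alternate within-graph and cross-graph (bridge) message passing
# between a scene graph and a commonsense graph (schematic; not GB-Net's updates).
import torch
import torch.nn as nn

class CrossGraphPropagation(nn.Module):
    """Alternates within-graph and cross-graph message passing for a few steps."""
    def __init__(self, dim, steps=3):
        super().__init__()
        self.steps = steps
        self.within = nn.Linear(dim, dim)   # messages along each graph's own edges
        self.across = nn.Linear(dim, dim)   # messages along bridge edges

    def forward(self, a_scene, a_common, bridge, h_scene, h_common):
        # a_scene: (Ns, Ns), a_common: (Nc, Nc) adjacencies; bridge: (Ns, Nc) links.
        for _ in range(self.steps):
            h_scene = torch.tanh(self.within(a_scene @ h_scene)
                                 + self.across(bridge @ h_common))
            h_common = torch.tanh(self.within(a_common @ h_common)
                                  + self.across(bridge.t() @ h_scene))
        return h_scene, h_common
```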
arXiv Detail & Related papers (2020-01-07T23:35:52Z)