ConstGCN: Constrained Transmission-based Graph Convolutional Networks
for Document-level Relation Extraction
- URL: http://arxiv.org/abs/2210.03949v1
- Date: Sat, 8 Oct 2022 07:36:04 GMT
- Title: ConstGCN: Constrained Transmission-based Graph Convolutional Networks
for Document-level Relation Extraction
- Authors: Ji Qi, Bin Xu, Kaisheng Zeng, Jinxin Liu, Jifan Yu, Qi Gao, Juanzi Li,
Lei Hou
- Abstract summary: Document-level relation extraction with graph neural networks faces a fundamental graph construction gap between training and inference.
We propose $\textbf{ConstGCN}$, a novel graph convolutional network which performs knowledge-based information propagation between entities.
Experimental results show that our method outperforms the previous state-of-the-art (SOTA) approaches on the DocRE dataset.
- Score: 24.970508961370548
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Document-level relation extraction with graph neural networks faces a
fundamental graph construction gap between training and inference: the gold
graph structure is available only during training, which forces most methods to
adopt heuristic or syntactic rules to construct a prior graph as a pseudo
proxy. In this paper, we propose $\textbf{ConstGCN}$, a novel graph
convolutional network that performs knowledge-based information propagation
between entities along all relation-specific spaces without any prior
graph construction. Specifically, it updates each entity representation by
aggregating information from all other entities along each relation space,
thus modeling relation-aware spatial information. To control the
information flow passing through the indeterminate relation spaces, we
constrain the propagation using transmitting scores learned with Noise
Contrastive Estimation over fact triples. Experimental results show that our
method outperforms the previous state-of-the-art (SOTA) approaches on the DocRE
dataset.
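The propagation described in the abstract can be pictured with a short sketch. The PyTorch snippet below is only an illustration of the idea, not the authors' implementation: the bilinear triple scorer standing in for the NCE-trained transmitting scores, the per-relation transforms, and all names and dimensions are assumptions.

```python
# Minimal sketch of constrained, relation-aware propagation (assumptions noted inline).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConstrainedRelPropagation(nn.Module):
    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        # One linear map per relation space (assumption: a simple per-relation transform).
        self.rel_transform = nn.Parameter(torch.empty(num_relations, hidden_dim, hidden_dim))
        # Bilinear scorer estimating how strongly a fact (head, relation, tail) holds;
        # in the paper such scores are learned with noise-contrastive estimation.
        self.rel_scorer = nn.Parameter(torch.empty(num_relations, hidden_dim, hidden_dim))
        nn.init.xavier_uniform_(self.rel_transform)
        nn.init.xavier_uniform_(self.rel_scorer)

    def transmitting_scores(self, h: torch.Tensor) -> torch.Tensor:
        # h: (num_entities, hidden_dim) entity representations.
        # Returns (num_relations, num_entities, num_entities) scores in (0, 1)
        # saying how much information may flow from entity j to entity i in
        # relation space r -- a stand-in for the NCE-trained triple scores.
        logits = torch.einsum("id,rdk,jk->rij", h, self.rel_scorer, h)
        return torch.sigmoid(logits)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        scores = self.transmitting_scores(h)                            # (R, N, N)
        messages = torch.einsum("rdk,jk->rjd", self.rel_transform, h)   # (R, N, D)
        # Aggregate from *all* other entities in every relation space, weighted
        # (i.e. constrained) by the transmitting scores instead of a prior graph.
        agg = torch.einsum("rij,rjd->id", scores, messages) / scores.size(0)
        return F.relu(h + agg)                                          # residual update


if __name__ == "__main__":
    layer = ConstrainedRelPropagation(hidden_dim=32, num_relations=4)
    entity_reprs = torch.randn(6, 32)
    print(layer(entity_reprs).shape)  # torch.Size([6, 32])
```

In this sketch the transmitting scores act as a soft, relation-specific adjacency tensor, so no prior graph has to be constructed before propagation.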
Related papers
- DGNN: Decoupled Graph Neural Networks with Structural Consistency
between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - You Only Transfer What You Share: Intersection-Induced Graph Transfer
Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z) - Knowledge Graph Embedding using Graph Convolutional Networks with
Relation-Aware Attention [3.803929794912623]
Knowledge graph embedding methods learn embeddings of entities and relations in a low dimensional space.
Various graph convolutional network methods have been proposed which use different types of information to learn the features of entities and relations.
We propose a relation-aware graph attention model that leverages relation information to assign different weights to neighboring nodes when learning embeddings of entities and relations.
arXiv Detail & Related papers (2021-02-14T17:19:44Z) - Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z) - Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB), the Graph Information Bottleneck (GIB) aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
arXiv Detail & Related papers (2020-10-24T07:13:00Z) - Graph Fairing Convolutional Networks for Anomaly Detection [7.070726553564701]
We introduce a graph convolutional network with skip connections for semi-supervised anomaly detection.
The effectiveness of our model is demonstrated through extensive experiments on five benchmark datasets.
arXiv Detail & Related papers (2020-10-20T13:45:47Z) - Contrastive and Generative Graph Convolutional Networks for Graph-based
Semi-Supervised Learning [64.98816284854067]
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
arXiv Detail & Related papers (2020-09-15T13:59:28Z) - Affinity Graph Supervision for Visual Recognition [35.35959846458965]
We propose a principled method to supervise the learning of weights in affinity graphs.
Our affinity supervision improves relationship recovery between objects, even without manually annotated relationship labels.
We show that affinity learning can also be applied to graphs built from mini-batches, for neural network training.
arXiv Detail & Related papers (2020-03-19T23:52:51Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)