Efficient long-distance relation extraction with DG-SpanBERT
- URL: http://arxiv.org/abs/2004.03636v1
- Date: Tue, 7 Apr 2020 18:21:47 GMT
- Title: Efficient long-distance relation extraction with DG-SpanBERT
- Authors: Jun Chen, Robert Hoehndorf, Mohamed Elhoseiny and Xiangliang Zhang
- Abstract summary: In natural language processing, relation extraction seeks to rationally understand unstructured text.
We propose a novel SpanBERT-based graph convolutional network (DG-SpanBERT) that extracts semantic features from a raw sentence.
Our model outperforms other existing dependency-based and sequence-based models and achieves a state-of-the-art performance on the TACRED dataset.
- Score: 46.07868542443406
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In natural language processing, relation extraction seeks to rationally
understand unstructured text. Here, we propose a novel SpanBERT-based graph
convolutional network (DG-SpanBERT) that extracts semantic features from a raw
sentence using the pre-trained language model SpanBERT and a graph
convolutional network to pool latent features. Our DG-SpanBERT model inherits
SpanBERT's advantage of learning rich lexical features from large-scale
corpora. It can also capture long-range relations between entities thanks to
the GCN applied over the dependency tree. The experimental results
show that our model outperforms other existing dependency-based and
sequence-based models and achieves a state-of-the-art performance on the TACRED
dataset.
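The core mechanism described in the abstract (contextual token embeddings pooled through a graph convolution over the sentence's dependency tree) can be sketched as a single GCN layer. This is a minimal illustration, not the paper's implementation: the edge list and dimensions are toy values, and the random matrix `H` stands in for the SpanBERT embeddings the model would actually use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 5-token sentence; (head, dependent) pairs standing in for a real
# dependency parse -- illustrative only.
edges = [(1, 0), (1, 2), (1, 4), (4, 3)]
n, d_in, d_out = 5, 8, 4

# Stand-in for contextual token embeddings (SpanBERT would supply these).
H = rng.standard_normal((n, d_in))

# Symmetric adjacency with self-loops, normalized as D^{-1/2} A D^{-1/2}.
A = np.eye(n)
for head, dep in edges:
    A[head, dep] = A[dep, head] = 1.0
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

# One GCN layer: H' = ReLU(A_hat @ H @ W)
W = rng.standard_normal((d_in, d_out))
H_out = np.maximum(A_hat @ H @ W, 0.0)

# Max-pool over tokens to obtain a sentence-level latent feature vector,
# which a classifier head would then score for each relation type.
sentence_repr = H_out.max(axis=0)
```

Because messages only flow along dependency edges, two entities that are far apart in surface order but close in the tree exchange information in few layers, which is the source of the long-range capability claimed above.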
Related papers
- GraphER: A Structure-aware Text-to-Graph Model for Entity and Relation Extraction [3.579132482505273]
Information extraction is an important task in Natural Language Processing (NLP)
We propose a novel approach to this task by formulating it as graph structure learning (GSL)
This formulation allows for better interaction and structure-informed decisions for entity and relation prediction.
arXiv Detail & Related papers (2024-04-18T20:09:37Z) - Relational Graph Convolutional Networks for Sentiment Analysis [0.0]
Relational Graph Convolutional Networks (RGCNs) offer interpretability and flexibility by capturing dependencies between data points represented as nodes in a graph.
We demonstrate the effectiveness of our approach by using pre-trained language models such as BERT and RoBERTa with RGCN architecture on product reviews from Amazon and Digikala datasets.
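The distinguishing feature of an R-GCN layer, as opposed to a plain GCN, is a separate weight matrix per relation type. A minimal sketch under toy assumptions (two hypothetical relation types, random features standing in for BERT/RoBERTa outputs):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 4, 6

# Two hypothetical relation types, each with its own adjacency matrix.
A = {0: np.zeros((n, n)), 1: np.zeros((n, n))}
A[0][0, 1] = A[0][1, 0] = 1.0
A[1][2, 3] = A[1][3, 2] = 1.0

H = rng.standard_normal((n, d))       # stand-in for BERT/RoBERTa features
W_self = rng.standard_normal((d, d))  # self-loop transform
W_rel = {r: rng.standard_normal((d, d)) for r in A}  # one W_r per relation

# R-GCN layer:
#   h_i' = ReLU(W_self h_i + sum_r sum_{j in N_r(i)} (1 / c_{i,r}) W_r h_j)
msg = H @ W_self
for r, A_r in A.items():
    deg = np.maximum(A_r.sum(axis=1, keepdims=True), 1.0)  # c_{i,r}
    msg += (A_r / deg) @ H @ W_rel[r]
H_out = np.maximum(msg, 0.0)
```

The per-relation matrices `W_r` let the model treat, say, syntactic and sentiment-bearing edges differently, which is what makes the relational variant suited to this task.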
arXiv Detail & Related papers (2024-04-16T07:27:49Z) - Relational Extraction on Wikipedia Tables using Convolutional and Memory Networks [6.200672130699805]
Relation extraction (RE) is the task of extracting relations between entities in text.
We introduce a new model consisting of Convolutional Neural Network (CNN) and Bidirectional-Long Short Term Memory (BiLSTM) network to encode entities.
arXiv Detail & Related papers (2023-07-11T22:36:47Z) - Ordinal Graph Gamma Belief Network for Social Recommender Systems [54.9487910312535]
We develop a hierarchical Bayesian model termed ordinal graph factor analysis (OGFA), which jointly models user-item and user-user interactions.
OGFA not only achieves good recommendation performance, but also extracts interpretable latent factors corresponding to representative user preferences.
We extend OGFA to ordinal graph gamma belief network, which is a multi-stochastic-layer deep probabilistic model.
arXiv Detail & Related papers (2022-09-12T09:19:22Z) - EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN)
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z) - MrGCN: Mirror Graph Convolution Network for Relation Extraction with Long-Term Dependencies [32.27755470353054]
In relation extraction, dependency trees that contain rich syntactic clues have been widely used to help capture long-term dependencies in text.
We propose the Mirror Graph Convolution Network (MrGCN), a GNN model with pooling-unpooling structures tailored to relation extraction.
Experiments on two datasets demonstrate the effectiveness of our method, showing significant improvements over previous results.
arXiv Detail & Related papers (2021-01-01T00:52:53Z) - Deep Reinforcement Learning of Graph Matching [63.469961545293756]
Graph matching (GM) under node and pairwise constraints has been a building block in areas from optimization to computer vision.
We present a reinforcement learning solver for GM, i.e., RGM, that seeks the node correspondence between pairwise graphs.
Our method differs from previous deep graph matching models, which focus on front-end feature extraction and affinity function learning.
arXiv Detail & Related papers (2020-12-16T13:48:48Z) - Coreferential Reasoning Learning for Language Representation [88.14248323659267]
We present CorefBERT, a novel language representation model that can capture the coreferential relations in context.
The experimental results show that, compared with existing baseline models, CorefBERT can achieve significant improvements consistently on various downstream NLP tasks.
arXiv Detail & Related papers (2020-04-15T03:57:45Z) - A Dependency Syntactic Knowledge Augmented Interactive Architecture for End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model is capable of fully exploiting the syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn)
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-04T14:59:32Z) - Stochastic Natural Language Generation Using Dependency Information [0.7995360025953929]
This article presents a corpus-based model for generating natural language text.
Our model encodes dependency relations from training data through a feature set, then produces a new dependency tree for a given meaning representation.
We show that our model produces high-quality utterances in aspects of informativeness and naturalness as well as quality.
arXiv Detail & Related papers (2020-01-12T09:40:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.