SIRE: Separate Intra- and Inter-sentential Reasoning for Document-level
Relation Extraction
- URL: http://arxiv.org/abs/2106.01709v1
- Date: Thu, 3 Jun 2021 09:25:44 GMT
- Title: SIRE: Separate Intra- and Inter-sentential Reasoning for Document-level
Relation Extraction
- Authors: Shuang Zeng, Yuting Wu and Baobao Chang
- Abstract summary: This paper proposes an effective architecture, SIRE, to represent intra- and inter-sentential relations in different ways.
Experiments on the public datasets show SIRE outperforms the previous state-of-the-art methods.
- Score: 16.106186007445146
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Document-level relation extraction has attracted much attention in recent
years. It is usually formulated as a classification problem that predicts
relations for all entity pairs in the document. However, previous works
indiscriminately represent intra- and inter-sentential relations in the same
way, confounding the different patterns for predicting them. Besides, they
create a document graph and use paths between entities on the graph as clues
for logical reasoning. However, not all entity pairs can be connected by a path
on the graph, and the paths that do exist are not always correct logical
reasoning chains, so many cases of logical reasoning cannot be covered. This paper proposes an effective
architecture, SIRE, to represent intra- and inter-sentential relations in
different ways. We design a new and straightforward form of logical reasoning
module that can cover more logical reasoning chains. Experiments on the public
datasets show SIRE outperforms the previous state-of-the-art methods. Further
analysis shows that our predictions are reliable and explainable. Our code is
available at https://github.com/DreamInvoker/SIRE.
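As a rough, hypothetical sketch of the two ideas in the abstract (not the authors' released implementation; see the GitHub link above for that), the snippet below scores intra- and inter-sentential entity pairs with different functions and covers additional logical reasoning chains by composing pairwise scores through bridge entities. The scorers and data layout are made-up stand-ins.

```python
def share_sentence(sents_a, sents_b):
    """True if the two entities are mentioned in at least one common sentence."""
    return bool(sents_a & sents_b)


def pair_score(head, tail, entity_sents, intra_scorer, inter_scorer):
    """Intra- and inter-sentential pairs are scored by different functions."""
    if share_sentence(entity_sents[head], entity_sents[tail]):
        return intra_scorer(head, tail)
    return inter_scorer(head, tail)


def chain_score(head, tail, entity_sents, intra_scorer, inter_scorer):
    """Combine the direct score with two-hop chains head -> bridge -> tail,
    taking the weakest link of each chain and the strongest overall evidence."""
    direct = pair_score(head, tail, entity_sents, intra_scorer, inter_scorer)
    chains = [
        min(pair_score(head, b, entity_sents, intra_scorer, inter_scorer),
            pair_score(b, tail, entity_sents, intra_scorer, inter_scorer))
        for b in entity_sents if b not in (head, tail)
    ]
    return max([direct] + chains)


# Toy usage with made-up scorers and sentence positions.
entity_sents = {"e1": {0}, "e2": {0, 1}, "e3": {2}}
intra = lambda h, t: 0.9   # stand-in for a sentence-level scorer
inter = lambda h, t: 0.4   # stand-in for a document-level scorer
print(chain_score("e1", "e3", entity_sents, intra, inter))
```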
Related papers
- Coreference Graph Guidance for Mind-Map Generation [5.289044688419791]
Recently, a state-of-the-art method encodes the sentences of a document sequentially and converts them to a relation graph via a sequence-to-graph method.
We propose a coreference-guided mind-map generation network (CMGN) to incorporate external structure knowledge.
arXiv Detail & Related papers (2023-12-19T09:39:27Z)
- Relation-Aware Question Answering for Heterogeneous Knowledge Graphs [37.38138785470231]
Existing retrieval-based approaches solve this task by concentrating on the specific relation at different hops.
We claim they fail to utilize information from head-tail entities and the semantic connection between relations to enhance the current relation representation.
Our approach achieves a significant performance gain over the prior state-of-the-art.
arXiv Detail & Related papers (2023-12-19T08:01:48Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- Discourse-Aware Graph Networks for Textual Logical Reasoning [142.0097357999134]
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve logical reasoning QA and introduce discourse-aware graph networks (DAGNs).
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations by end-to-end evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
arXiv Detail & Related papers (2022-07-04T14:38:49Z)
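To make the graph-construction step in the DAGN entry above more concrete, here is a small, hypothetical sketch: it splits a passage into discourse units at a hand-picked list of connectives and links adjacent units with an edge labeled by the connective. The connective list and data structures are illustrative assumptions, not the paper's pipeline (which also uses generic logic theories and learns edge representations end to end).

```python
import re

# Illustrative connective list (an assumption, not the paper's inventory).
CONNECTIVES = ["because", "therefore", "however", "if", "unless"]


def build_logic_graph(passage):
    """Split a passage into discourse units at connectives and link adjacent
    units with an edge labeled by the connective separating them."""
    pattern = r"\b(" + "|".join(CONNECTIVES) + r")\b"
    parts = re.split(pattern, passage.lower())
    units, edges, pending = [], [], None
    for part in (p.strip(" ,.") for p in parts):
        if not part:
            continue
        if part in CONNECTIVES:
            pending = part
        else:
            units.append(part)
            if pending is not None and len(units) >= 2:
                edges.append((len(units) - 2, len(units) - 1, pending))
            pending = None
    return units, edges


units, edges = build_logic_graph(
    "The museum stayed open because attendance rose, therefore funding was renewed."
)
print(units)   # discourse units (graph nodes)
print(edges)   # (source_unit, target_unit, connective) triples (typed edges)
```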
- Graph Collaborative Reasoning [18.45161138837384]
Graph Collaborative Reasoning (GCR) can use neighbor link information for relational reasoning on graphs from a logical reasoning perspective.
We provide a simple approach to translate a graph structure into logical expressions, so that the link prediction task can be converted into a neural logic reasoning problem.
To show the effectiveness of our work, we conduct experiments on graph-related tasks such as link prediction and recommendation based on commonly used benchmark datasets.
arXiv Detail & Related papers (2021-12-27T14:27:58Z)
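The translation step in the Graph Collaborative Reasoning entry above, turning neighbor links into a logical expression that scores a candidate link, can be sketched with soft logic operators as below. The predicate and the fixed t-norm/noisy-OR operators are illustrative stand-ins; the paper learns neural logic modules rather than using fixed formulas.

```python
def soft_and(a, b):
    """Product t-norm as a soft conjunction."""
    return a * b


def soft_or(values):
    """Noisy-OR style soft disjunction."""
    prod = 1.0
    for v in values:
        prod *= 1.0 - v
    return 1.0 - prod


def gcr_style_score(u, v, neighbors, atom_score):
    """Score a candidate link (u, v) via the logical expression
    OR over shared neighbors w of [ link(u, w) AND link(w, v) ],
    where atom_score(x, y) in [0, 1] plays the role of a learned predicate."""
    shared = neighbors[u] & neighbors[v]
    clauses = [soft_and(atom_score(u, w), atom_score(w, v)) for w in shared]
    return soft_or(clauses) if clauses else 0.0


# Toy usage: adjacency sets and a trivial predicate that trusts observed links.
neighbors = {"a": {"x", "y"}, "b": {"y", "z"}, "x": {"a"}, "y": {"a", "b"}, "z": {"b"}}
observed = {("a", "y"), ("y", "a"), ("y", "b"), ("b", "y")}
atom = lambda s, t: 0.9 if (s, t) in observed else 0.1
print(gcr_style_score("a", "b", neighbors, atom))
```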
- Discriminative Reasoning for Document-level Relation Extraction [28.593318203728963]
Document-level relation extraction (DocRE) models implicitly model the reasoning skill related to the relation between one entity pair in a document.
We propose a novel discriminative reasoning framework to explicitly model the paths of these reasoning skills between each entity pair in this document.
Experimental results show that our method outperforms the previous state-of-the-art performance on the large-scale DocRE dataset.
arXiv Detail & Related papers (2021-06-03T03:09:38Z)
- ExplaGraphs: An Explanation Graph Generation Task for Structured Commonsense Reasoning [65.15423587105472]
We present a new generative and structured commonsense-reasoning task (and an associated dataset) of explanation graph generation for stance prediction.
Specifically, given a belief and an argument, a model has to predict whether the argument supports or counters the belief and also generate a commonsense-augmented graph that serves as a non-trivial, complete, and unambiguous explanation for the predicted stance.
A significant 83% of our graphs contain external commonsense nodes with diverse structures and reasoning depths.
arXiv Detail & Related papers (2021-04-15T17:51:36Z)
- Double Graph Based Reasoning for Document-level Relation Extraction [29.19714611415326]
Document-level relation extraction aims to extract relations among entities within a document.
We propose Graph Aggregation-and-Inference Network (GAIN) featuring double graphs.
Experiments on the public dataset, DocRED, show GAIN achieves a significant performance improvement (2.85 on F1) over the previous state-of-the-art.
arXiv Detail & Related papers (2020-09-29T03:41:01Z)
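As a rough illustration of the "double graphs" in the GAIN entry above, the snippet below builds a mention-level graph (mentions linked if they share a sentence or refer to the same entity) and an entity-level graph obtained by merging mentions. The layout and edge definitions are simplified assumptions, not GAIN's exact construction.

```python
# Each mention is (entity_id, sentence_id); the layout is illustrative only.
mentions = [("e1", 0), ("e2", 0), ("e1", 1), ("e3", 1), ("e2", 2), ("e3", 2)]

# Mention-level graph: link mentions that share a sentence or co-refer to the
# same entity (a simplified stand-in for GAIN's mention-graph edge types).
mention_edges = set()
for i, (ei, si) in enumerate(mentions):
    for j, (ej, sj) in enumerate(mentions):
        if i < j and (si == sj or ei == ej):
            mention_edges.add((i, j))

# Entity-level graph: merge mentions into entity nodes; two entities are linked
# if any of their mentions co-occur in a sentence.
entity_edges = set()
for i, j in mention_edges:
    ei, ej = mentions[i][0], mentions[j][0]
    if ei != ej:
        entity_edges.add(tuple(sorted((ei, ej))))

print(sorted(mention_edges))
print(sorted(entity_edges))
```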
- A Simple Approach to Case-Based Reasoning in Knowledge Bases [56.661396189466664]
We present a surprisingly simple yet accurate approach to reasoning in knowledge graphs (KGs) that requires no training and is reminiscent of case-based reasoning in classical artificial intelligence (AI).
Consider the task of finding a target entity given a source entity and a binary relation.
Our non-parametric approach derives crisp logical rules for each query by finding multiple graph path patterns that connect similar source entities through the given relation.
arXiv Detail & Related papers (2020-06-25T06:28:09Z)
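The case-based procedure in the entry above can be sketched on a toy knowledge graph: gather the relation paths that connect similar source entities to their known answers for the query relation, then reapply the most frequent path to the query entity. The miniature KG, the restriction to two-hop paths, and the helper names are all illustrative assumptions.

```python
from collections import Counter

# A toy knowledge graph of (head, relation, tail) triples -- illustrative only.
triples = [
    ("paris", "capital_of", "france"), ("france", "in_continent", "europe"),
    ("paris", "city_in_continent", "europe"),
    ("berlin", "capital_of", "germany"), ("germany", "in_continent", "europe"),
    ("berlin", "city_in_continent", "europe"),
    ("tokyo", "capital_of", "japan"), ("japan", "in_continent", "asia"),
]


def follow(entity, relation):
    return [t for h, r, t in triples if h == entity and r == relation]


def two_hop_paths_to(entity, targets):
    """All (r1, r2) relation patterns leading from `entity` to any of `targets`."""
    return [(r1, r2)
            for h1, r1, mid in triples if h1 == entity
            for h2, r2, t in triples if h2 == mid and t in targets]


def cbr_answer(query_entity, query_relation, similar_entities):
    # 1. Collect path patterns that explain the relation for similar entities.
    patterns = Counter()
    for e in similar_entities:
        patterns.update(two_hop_paths_to(e, follow(e, query_relation)))
    if not patterns:
        return []
    # 2. Reapply the most frequent pattern starting from the query entity.
    r1, r2 = patterns.most_common(1)[0][0]
    return [t for mid in follow(query_entity, r1) for t in follow(mid, r2)]


print(cbr_answer("tokyo", "city_in_continent", ["paris", "berlin"]))  # ['asia']
```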
- Evaluating Logical Generalization in Graph Neural Networks [59.70452462833374]
We study the task of logical generalization using graph neural networks (GNNs).
Our benchmark suite, GraphLog, requires that learning algorithms perform rule induction in different synthetic logics.
We find that the ability for models to generalize and adapt is strongly determined by the diversity of the logical rules they encounter during training.
arXiv Detail & Related papers (2020-03-14T05:45:55Z)
- Relational Message Passing for Knowledge Graph Completion [78.47976646383222]
We propose a relational message passing method for knowledge graph completion.
It passes relational messages among edges iteratively to aggregate neighborhood information.
Results show our method outperforms state-of-the-art knowledge graph completion methods by a large margin.
arXiv Detail & Related papers (2020-02-17T03:33:41Z)
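Finally, the edge-to-edge aggregation described in the last entry can be caricatured as follows: each edge repeatedly averages its feature vector with those of edges that share an endpoint. The actual method's message functions, edge typing, and relation scoring are richer; this sketch only shows the aggregation pattern, with made-up features.

```python
import numpy as np

# Toy graph: edges as (head, tail) with one feature vector per edge
# (e.g. a relation-type embedding). Values are illustrative.
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "c")]
feats = np.random.default_rng(0).normal(size=(len(edges), 4))

# Two edges are neighbors if they share an endpoint.
neighbors = [
    [j for j, f in enumerate(edges) if j != i and set(e) & set(f)]
    for i, e in enumerate(edges)
]


def relational_message_pass(feats, neighbors, steps=2):
    """Iteratively replace each edge feature with the mean of its own feature
    and its neighboring edges' features (a simple aggregation stand-in)."""
    h = feats.copy()
    for _ in range(steps):
        h = np.stack([
            np.mean(np.vstack([h[i:i + 1], h[nbrs]]), axis=0) if nbrs else h[i]
            for i, nbrs in enumerate(neighbors)
        ])
    return h


print(relational_message_pass(feats, neighbors).shape)  # (4, 4)
```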