Entailment Graph Learning with Textual Entailment and Soft Transitivity
- URL: http://arxiv.org/abs/2204.03286v1
- Date: Thu, 7 Apr 2022 08:33:06 GMT
- Title: Entailment Graph Learning with Textual Entailment and Soft Transitivity
- Authors: Zhibin Chen, Yansong Feng and Dongyan Zhao
- Abstract summary: We propose a two-stage method, Entailment Graph with Textual Entailment and Transitivity (EGT2).
EGT2 learns local entailment relations by recognizing possible textual entailment between template sentences formed by CCG-parsed predicates.
Based on the generated local graph, EGT2 then uses three novel soft transitivity constraints to consider the logical transitivity in entailment structures.
- Score: 69.91691115264132
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Typed entailment graphs try to learn the entailment relations between
predicates from text and model them as edges between predicate nodes. The
construction of entailment graphs usually suffers from severe sparsity and
unreliability of distributional similarity. We propose a two-stage method,
Entailment Graph with Textual Entailment and Transitivity (EGT2). EGT2 learns
local entailment relations by recognizing possible textual entailment between
template sentences formed by typed CCG-parsed predicates. Based on the
generated local graph, EGT2 then uses three novel soft transitivity constraints
to consider the logical transitivity in entailment structures. Experiments on
benchmark datasets show that EGT2 can effectively model the transitivity in
entailment graphs to alleviate the sparsity issue, and leads to significant
improvements over current state-of-the-art methods.
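As a rough illustration of the two stages described above, the sketch below scores local entailment between template sentences with an off-the-shelf NLI model and then applies one plausible soft transitivity penalty to the resulting score matrix. The template scheme, the choice of roberta-large-mnli, and the exact penalty form are illustrative assumptions; the paper uses its own typed templates and defines three distinct constraints.

```python
# Minimal sketch of the two EGT2 stages under illustrative assumptions:
# the template scheme, the NLI model, and the penalty form are NOT the
# paper's exact definitions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

NLI_NAME = "roberta-large-mnli"  # assumed stand-in for the local entailment scorer
tokenizer = AutoTokenizer.from_pretrained(NLI_NAME)
nli_model = AutoModelForSequenceClassification.from_pretrained(NLI_NAME).eval()

def to_template(predicate: str, subj_type: str, obj_type: str) -> str:
    """Turn a typed predicate (e.g. 'defeated') into a template sentence
    such as 'The team defeated the team.' (hypothetical template scheme)."""
    return f"The {subj_type} {predicate} the {obj_type}."

@torch.no_grad()
def local_score(premise_pred: str, hypo_pred: str, subj_type: str, obj_type: str) -> float:
    """Stage 1: probability that the premise template entails the hypothesis template."""
    enc = tokenizer(
        to_template(premise_pred, subj_type, obj_type),
        to_template(hypo_pred, subj_type, obj_type),
        return_tensors="pt",
    )
    probs = nli_model(**enc).logits.softmax(dim=-1)
    return probs[0, 2].item()  # index 2 is "entailment" for roberta-large-mnli

def soft_transitivity_penalty(w: torch.Tensor) -> torch.Tensor:
    """Stage 2 (one plausible constraint, not the paper's three): if edges
    i->j and j->k are strong, then edge i->k should not be much weaker."""
    chained = torch.einsum("ij,jk->ik", w, w) / w.shape[0]  # averaged two-hop evidence
    return torch.relu(chained - w).sum()
```

In such a setup, the local scores would seed an n x n matrix over typed predicates, and the penalty would plausibly be minimized alongside a fidelity term that keeps the refined graph close to the local one.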
Related papers
- EntailE: Introducing Textual Entailment in Commonsense Knowledge Graph Completion [54.12709176438264]
Commonsense knowledge graphs (CSKGs) utilize free-form text to represent named entities, short phrases, and events as their nodes.
Current methods leverage semantic similarities to increase the graph density, but the semantic plausibility of the nodes and their relations is under-explored.
We propose to adopt textual entailment to find implicit entailment relations between CSKG nodes, to effectively densify the subgraph connecting nodes within the same conceptual class.
arXiv Detail & Related papers (2024-02-15T02:27:23Z)
- From the One, Judge of the Whole: Typed Entailment Graph Construction with Predicate Generation [69.91691115264132]
Entailment Graphs (EGs) are constructed to indicate context-independent entailment relations in natural languages.
In this paper, we propose a multi-stage method, Typed Predicate-Entailment Graph Generator (TP-EGG) to tackle this problem.
Experiments on benchmark datasets show that TP-EGG can generate high-quality and scale-controllable entailment graphs.
arXiv Detail & Related papers (2023-06-07T05:46:19Z)
- Explanation Graph Generation via Generative Pre-training over Synthetic Graphs [6.25568933262682]
Explanation graph generation is a significant task that aims to produce such graphs in response to user input.
Current research commonly fine-tunes a text-based pre-trained language model on a small downstream dataset that is annotated with labeled graphs.
We propose EG3P (Explanation Graph Generation via Generative Pre-training over synthetic graphs), a novel pre-trained framework for this task.
arXiv Detail & Related papers (2023-06-01T13:20:22Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it complements the edge features conditionally on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- GraphDCA -- a Framework for Node Distribution Comparison in Real and Synthetic Graphs [72.51835626235368]
We argue that when comparing two graphs, the distribution of node structural features is more informative than global graph statistics.
We present GraphDCA - a framework for evaluating similarity between graphs based on the alignment of their respective node representation sets.
arXiv Detail & Related papers (2022-02-08T14:19:19Z)
- JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs [44.06715423776722]
We propose a graph-text joint representation learning model called JointGT.
During encoding, we devise a structure-aware semantic aggregation module which is plugged into each Transformer layer.
We show that JointGT obtains new state-of-the-art performance on various KG-to-text datasets.
arXiv Detail & Related papers (2021-06-19T14:10:10Z)
- GTN-ED: Event Detection Using Graph Transformer Networks [12.96137943176861]
We propose a novel framework for incorporating both dependencies and their labels using a recently proposed technique called Graph Transformer Networks (GTN).
We integrate GTNs to leverage dependency relations on two existing homogeneous-graph-based models, and demonstrate an improvement in the F1 score on the ACE dataset.
arXiv Detail & Related papers (2021-04-30T16:35:29Z)
- Inducing Alignment Structure with Gated Graph Attention Networks for Sentence Matching [24.02847802702168]
This paper proposes a graph-based approach for sentence matching.
We represent a sentence pair as a graph with several carefully designed strategies.
We then employ a novel gated graph attention network to encode the constructed graph for sentence matching.
arXiv Detail & Related papers (2020-10-15T11:25:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.