From the One, Judge of the Whole: Typed Entailment Graph Construction
with Predicate Generation
- URL: http://arxiv.org/abs/2306.04170v1
- Date: Wed, 7 Jun 2023 05:46:19 GMT
- Authors: Zhibin Chen, Yansong Feng, Dongyan Zhao
- Abstract summary: Entailment Graphs (EGs) are constructed to indicate context-independent entailment relations in natural languages.
In this paper, we propose a multi-stage method, Typed Predicate-Entailment Graph Generator (TP-EGG) to tackle this problem.
Experiments on benchmark datasets show that TP-EGG can generate high-quality and scale-controllable entailment graphs.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Entailment Graphs (EGs) have been constructed based on extracted corpora as a
strong and explainable form to indicate context-independent entailment
relations in natural languages. However, EGs built by previous methods often
suffer from severe sparsity issues, due to the limited corpora available and
the long-tail phenomenon of predicate distributions. In this paper, we propose
a multi-stage method, Typed Predicate-Entailment Graph Generator (TP-EGG), to
tackle this problem. Given several seed predicates, TP-EGG builds the graphs by
generating new predicates and detecting entailment relations among them. The
generative nature of TP-EGG helps us leverage the recent advances from large
pretrained language models (PLMs), while avoiding the reliance on carefully
prepared corpora. Experiments on benchmark datasets show that TP-EGG can
generate high-quality and scale-controllable entailment graphs, achieving
significant in-domain improvement over state-of-the-art EGs and boosting the
performance of downstream inference tasks.
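The abstract's generate-then-detect pipeline can be sketched as below. This is a minimal illustration of the idea only: the expansion table, scores, and threshold are hypothetical stand-ins, not TP-EGG's actual PLM-based components.

```python
from itertools import permutations

def generate_predicates(seeds, per_seed=2):
    """Expand each seed predicate into related candidates (stub)."""
    # Hypothetical expansions keyed by seed; a PLM would generate these.
    expansions = {
        "buy": ["purchase", "acquire"],
        "own": ["possess", "hold"],
    }
    candidates = set(seeds)
    for s in seeds:
        candidates.update(expansions.get(s, [])[:per_seed])
    return sorted(candidates)

def entailment_score(p, q):
    """Score whether predicate p entails predicate q (stub)."""
    # Hypothetical local scores; a PLM-based detector would produce these.
    scores = {("buy", "own"): 0.9, ("purchase", "own"): 0.85,
              ("acquire", "own"): 0.8, ("purchase", "buy"): 0.7}
    return scores.get((p, q), 0.0)

def build_entailment_graph(seeds, threshold=0.5):
    """Stage 1: generate predicates; stage 2: keep confident edges."""
    predicates = generate_predicates(seeds)
    edges = [(p, q, entailment_score(p, q))
             for p, q in permutations(predicates, 2)
             if entailment_score(p, q) >= threshold]
    return predicates, edges

nodes, edges = build_entailment_graph(["buy", "own"])
print(nodes)  # generated predicate set
print(edges)  # directed entailment edges above threshold
```

Raising or lowering `threshold` (or `per_seed`) is what makes the resulting graph scale-controllable in spirit: the node and edge counts are knobs rather than corpus artifacts.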
Related papers
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
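The random-walk context sampling that the GSPT summary describes could look roughly like this sketch; the toy graph, walk length, and walks-per-node values are illustrative choices, not GSPT's actual configuration.

```python
import random

def random_walk(adj, start, length, rng):
    """Sample one random walk of up to `length` steps from `start`."""
    walk = [start]
    for _ in range(length):
        neighbors = adj[walk[-1]]
        if not neighbors:
            break
        walk.append(rng.choice(neighbors))
    return walk

def node_contexts(adj, walk_length=4, walks_per_node=2, seed=0):
    """Collect walk-based context sequences for every node."""
    rng = random.Random(seed)
    return {v: [random_walk(adj, v, walk_length, rng)
                for _ in range(walks_per_node)]
            for v in adj}

# Tiny undirected graph as an adjacency list.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
contexts = node_contexts(adj)
print(contexts[0])  # two walk sequences starting at node 0
```

Each sampled walk plays the role of a token sequence, which is what lets a plain Transformer consume graph structure without message passing.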
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- Robust Stochastic Graph Generator for Counterfactual Explanations [8.82587501822953]
Graph Counterfactual Explanation (GCE) techniques have garnered attention as a means to provide insights to users engaging with AI systems.
GCEs generate a new graph similar to the original one, with a different outcome grounded in the underlying predictive model.
Among these GCE techniques, those rooted in generative mechanisms have received relatively limited investigation despite impressive accomplishments in other domains.
arXiv Detail & Related papers (2023-12-18T23:16:28Z)
- Graph-level Representation Learning with Joint-Embedding Predictive Architectures [43.89120279424267]
Joint-Embedding Predictive Architectures (JEPAs) have emerged as a novel and powerful technique for self-supervised representation learning.
We show that graph-level representations can be effectively modeled using this paradigm by proposing a Graph Joint-Embedding Predictive Architecture (Graph-JEPA).
In particular, we employ masked modeling and focus on predicting the latent representations of masked subgraphs starting from the latent representation of a context subgraph.
arXiv Detail & Related papers (2023-09-27T20:42:02Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) of a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM.
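A minimal sketch of the "node embeddings from last hidden states" step, assuming mean pooling over non-padding tokens; the arrays below are made-up stand-ins for a fine-tuned LM's real outputs, and SimTeG's exact pooling choice may differ.

```python
import numpy as np

def node_embedding(hidden_states, attention_mask):
    """Mean-pool the LM's last hidden states into one node vector.

    hidden_states: (seq_len, dim) array from the LM's last layer.
    attention_mask: (seq_len,) 1 for real tokens, 0 for padding.
    """
    mask = attention_mask[:, None].astype(float)
    return (hidden_states * mask).sum(axis=0) / mask.sum()

# Fake last-layer states for a 4-token sequence (dim=3); last token is padding.
h = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [0., 0., 1.],
              [9., 9., 9.]])
mask = np.array([1, 1, 1, 0])
emb = node_embedding(h, mask)
print(emb)  # [0.3333... 0.3333... 0.3333...]
```

The resulting vectors would then serve as node features for a downstream GNN or linear head.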
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
- Explanation Graph Generation via Generative Pre-training over Synthetic Graphs [6.25568933262682]
Explanation graph generation is a significant task that aims to produce explanation graphs in response to user input.
Current research commonly fine-tunes a text-based pre-trained language model on a small downstream dataset that is annotated with labeled graphs.
We propose EG3P (Explanation Graph Generation via Generative Pre-training over synthetic graphs), a novel pre-trained framework for the explanation graph generation task.
arXiv Detail & Related papers (2023-06-01T13:20:22Z)
- SCGG: A Deep Structure-Conditioned Graph Generative Model [9.046174529859524]
SCGG, a conditional deep graph generation method, considers a particular type of structural condition.
The architecture of SCGG consists of a graph representation learning network and an autoregressive generative model, which is trained end-to-end.
Experimental results on both synthetic and real-world datasets demonstrate the superiority of our method compared with state-of-the-art baselines.
arXiv Detail & Related papers (2022-09-20T12:33:50Z)
- Entailment Graph Learning with Textual Entailment and Soft Transitivity [69.91691115264132]
We propose a two-stage method, Entailment Graph with Textual Entailment and Transitivity (EGT2).
EGT2 learns local entailment relations by recognizing possible textual entailment between template sentences formed by CCG-parsed predicates.
Based on the generated local graph, EGT2 then uses three novel soft transitivity constraints to consider the logical transitivity in entailment structures.
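One way a soft transitivity constraint can be expressed is as a hinge penalty over score triples: if a→b and b→c are confident, a→c should be too. This is an illustrative form only, not necessarily one of EGT2's three constraints, and the example scores are hypothetical.

```python
def soft_transitivity_penalty(scores):
    """Sum hinge violations of transitivity over all ordered triples.

    scores: dict mapping (premise, hypothesis) pairs to probabilities
    in [0, 1]. Penalizes score(a, c) falling below score(a, b) * score(b, c).
    """
    nodes = {n for pair in scores for n in pair}
    penalty = 0.0
    for a in nodes:
        for b in nodes:
            for c in nodes:
                if len({a, b, c}) < 3:
                    continue  # only distinct triples
                s_ab = scores.get((a, b), 0.0)
                s_bc = scores.get((b, c), 0.0)
                s_ac = scores.get((a, c), 0.0)
                penalty += max(0.0, s_ab * s_bc - s_ac)
    return penalty

# Local graph violating transitivity: buy->purchase and purchase->own
# are strong, but buy->own is missing entirely.
scores = {("buy", "purchase"): 0.9, ("purchase", "own"): 0.8}
print(round(soft_transitivity_penalty(scores), 2))  # 0.72
```

Adding such a penalty to the training loss nudges the local scores toward a globally consistent graph instead of hard-filtering edges after the fact.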
arXiv Detail & Related papers (2022-04-07T08:33:06Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
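Dyadic independence means every edge is drawn independently with its own probability, which makes sampling trivial; the block-structured probability matrix below is a made-up example, not one fitted by the paper's method.

```python
import random

def sample_edge_independent(p, seed=0):
    """Sample an undirected graph where each edge (i, j), i < j,
    appears independently with probability p[i][j] (dyadic independence)."""
    rng = random.Random(seed)
    n = len(p)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p[i][j]]

# Two-block probability matrix: dense within blocks, sparse across them.
p = [[0.0, 0.9, 0.1, 0.1],
     [0.9, 0.0, 0.1, 0.1],
     [0.1, 0.1, 0.0, 0.9],
     [0.1, 0.1, 0.9, 0.0]]
edges = sample_edge_independent(p)
print(edges)  # [(0, 1), (2, 3)]
```

Because each edge is an independent Bernoulli draw, sampling (and likelihood evaluation) is linear in the number of candidate edges, which is what makes such approximations scale to sparse graphs with millions of nodes.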
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.