KG-BART: Knowledge Graph-Augmented BART for Generative Commonsense Reasoning
- URL: http://arxiv.org/abs/2009.12677v2
- Date: Thu, 21 Jan 2021 06:02:59 GMT
- Title: KG-BART: Knowledge Graph-Augmented BART for Generative Commonsense Reasoning
- Authors: Ye Liu, Yao Wan, Lifang He, Hao Peng, Philip S. Yu
- Abstract summary: We propose a novel knowledge graph augmented pre-trained language generation model KG-BART.
KG-BART encompasses the complex relations of concepts through the knowledge graph and produces more logical and natural sentences as output.
- Score: 78.81080813406177
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative commonsense reasoning, which aims to empower machines to
generate sentences while reasoning over a set of concepts, is a critical
bottleneck for text generation. Even state-of-the-art pre-trained language
generation models struggle at this task and often produce implausible and
anomalous sentences. One reason is that they rarely consider incorporating
knowledge graphs, which can provide rich relational information among
commonsense concepts. To promote commonsense reasoning in text generation, we
propose KG-BART, a novel knowledge-graph-augmented pre-trained language
generation model that encompasses the complex relations among concepts through
the knowledge graph and produces more logical and natural sentences as output.
Moreover, KG-BART leverages graph attention to aggregate rich concept
semantics, which enhances the model's generalization to unseen concept sets.
Experiments on the CommonGen benchmark verify the effectiveness of our approach
against several strong pre-trained language generation models; in particular,
KG-BART outperforms BART by 5.80 and 4.60 points on BLEU-3 and BLEU-4,
respectively. We also show that the context generated by our model can serve as
background scenarios that benefit downstream commonsense QA tasks.
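The graph-attention aggregation mentioned in the abstract can be illustrated with a minimal single-layer sketch in PyTorch; this is not KG-BART's actual architecture, and all tensor names are assumptions.

```python
import torch
import torch.nn.functional as F

def concept_graph_attention(h, adj, W, a):
    """One illustrative graph-attention step over concept embeddings.

    h:   (N, d) concept node embeddings
    adj: (N, N) 0/1 adjacency from the KG (assumed to include self-loops)
    W:   (d, d) shared projection; a: (2d,) attention vector
    """
    z = h @ W                                    # project node features
    N = z.size(0)
    # pairwise logits e_ij = LeakyReLU(a^T [z_i ; z_j])
    pairs = torch.cat([z.unsqueeze(1).expand(N, N, -1),
                       z.unsqueeze(0).expand(N, N, -1)], dim=-1)
    e = F.leaky_relu(pairs @ a)                  # (N, N)
    e = e.masked_fill(adj == 0, float("-inf"))   # attend only to KG neighbors
    alpha = torch.softmax(e, dim=-1)             # normalize over neighbors
    return alpha @ z                             # aggregate neighbor semantics
```

Masking the attention logits with the adjacency matrix is what lets the KG's relational structure decide which concepts inform each other during aggregation.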
Related papers
- A Prompt-Based Knowledge Graph Foundation Model for Universal In-Context Reasoning [17.676185326247946]
We propose a prompt-based KG foundation model via in-context learning, namely KG-ICL, to achieve a universal reasoning ability.
To encode prompt graphs so that they generalize to entities and relations unseen in queries, we first propose a unified tokenizer.
Then, we propose two message passing neural networks to perform prompt encoding and KG reasoning, respectively.
arXiv Detail & Related papers (2024-10-16T06:47:18Z)
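One way to read the "unified tokenizer" idea is that entities in a prompt graph are labeled by their graph role (here, distance from the query entity) rather than by identity, so unseen entities still map into a fixed token vocabulary. The sketch below is a guess at that flavor, not the paper's actual tokenizer, and all token names are invented.

```python
from collections import deque

def tokenize_prompt_graph(edges, query_entity):
    """Map entities to distance-based role tokens (illustrative assumption).

    edges: list of (head, relation, tail) triples of a prompt graph
    """
    # build an undirected adjacency for BFS distances
    adj = {}
    for h, r, t in edges:
        adj.setdefault(h, set()).add(t)
        adj.setdefault(t, set()).add(h)
    # BFS distances from the query entity
    dist = {query_entity: 0}
    q = deque([query_entity])
    while q:
        u = q.popleft()
        for v in adj.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    # entity token = its BFS distance; relation kept as its surface id
    return [(f"[ENT_D{dist.get(h, 'INF')}]", r, f"[ENT_D{dist.get(t, 'INF')}]")
            for h, r, t in edges]
```

For example, `tokenize_prompt_graph([("a", "likes", "b"), ("b", "knows", "c")], "a")` yields role tokens `[ENT_D0]`, `[ENT_D1]`, `[ENT_D2]` in place of the entity names, so the same prompt encoding applies to any graph with this shape.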
- Knowledge Graph-Augmented Language Models for Knowledge-Grounded Dialogue Generation [58.65698688443091]
We propose SUbgraph Retrieval-augmented GEneration (SURGE), a framework for generating context-relevant and knowledge-grounded dialogues with Knowledge Graphs (KGs).
Our framework first retrieves the relevant subgraph from the KG and then enforces consistency across facts by perturbing their word embeddings conditioned on the retrieved subgraph.
We validate our SURGE framework on the OpendialKG and KOMODIS datasets, showing that it generates high-quality dialogues that faithfully reflect the knowledge from the KG.
arXiv Detail & Related papers (2023-05-30T08:36:45Z)
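SURGE's first stage, subgraph retrieval, can be sketched as below. The paper scores candidate subgraphs with a learned retriever; the simple entity string-matching here is a stand-in, and every name is illustrative.

```python
def retrieve_subgraph(kg_triples, dialogue_history, max_triples=10):
    """Illustrative subgraph retrieval: keep triples whose head or tail
    entity is mentioned in the dialogue history. SURGE itself uses a
    learned retriever; this string match is a simplification.

    kg_triples: iterable of (head, relation, tail) strings
    """
    context = dialogue_history.lower()
    subgraph = [(h, r, t) for h, r, t in kg_triples
                if h.lower() in context or t.lower() in context]
    return subgraph[:max_triples]
```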
- CADGE: Context-Aware Dialogue Generation Enhanced with Graph-Structured Knowledge Aggregation [25.56539617837482]
A novel context-aware graph-attention model (Context-aware GAT) is proposed.
It assimilates global features from relevant knowledge graphs through a context-enhanced knowledge aggregation mechanism.
Empirical results demonstrate that our framework outperforms conventional GNN-based language models.
arXiv Detail & Related papers (2023-05-10T16:31:35Z)
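The context-enhanced aggregation step could look roughly like the following: knowledge node features are pooled with weights given by their relevance to the encoded dialogue context. This is a minimal sketch, not CADGE's exact mechanism.

```python
import torch

def context_aware_aggregate(context_vec, node_embs):
    """Pool KG node features weighted by their relevance to the dialogue
    context (illustrative; not CADGE's actual design).

    context_vec: (d,) encoded dialogue context
    node_embs:   (N, d) knowledge node embeddings
    """
    scores = node_embs @ context_vec         # (N,) relevance logits
    weights = torch.softmax(scores, dim=0)   # normalize over nodes
    return weights @ node_embs               # (d,) context-aware summary
```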
- Contextualized Scene Imagination for Generative Commonsense Reasoning [35.03682416576795]
Generative commonsense reasoning skills are lacking in state-of-the-art text generation methods.
We propose an Imagine-and-Verbalize (I&V) method, which learns to imagine a relational scene knowledge graph and then verbalize it.
Experiments demonstrate the effectiveness of I&V in improving language models on both concept-to-sentence and concept-to-story generation tasks.
arXiv Detail & Related papers (2021-12-12T20:38:08Z)
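The two-stage I&V pipeline can be outlined as below; `imagination_model` and `verbalization_model` are hypothetical callables standing in for the paper's trained components, and the prompt format is an assumption.

```python
def imagine_and_verbalize(concepts, imagination_model, verbalization_model):
    """Two-stage generation in the spirit of I&V. Both model arguments
    are hypothetical callables, not the paper's released interfaces.

    concepts: list of input concept strings, e.g. ["dog", "frisbee", "catch"]
    """
    # Stage 1: "imagine" a relational scene graph over the concepts,
    # e.g. [("dog", "capable_of", "catch"), ("frisbee", "used_for", "catch")]
    scene_graph = imagination_model(concepts)
    # Stage 2: verbalize the concepts constrained by the imagined graph
    graph_text = " ".join(f"{h} {r} {t}." for h, r, t in scene_graph)
    return verbalization_model(f"{graph_text} concepts: {', '.join(concepts)}")
```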
- Generated Knowledge Prompting for Commonsense Reasoning [53.88983683513114]
We propose generating knowledge statements directly from a language model with a generic prompt format.
This approach improves the performance of both off-the-shelf and fine-tuned language models on four commonsense reasoning tasks.
Notably, we find that a model's predictions can improve when using its own generated knowledge.
arXiv Detail & Related papers (2021-10-15T21:58:03Z)
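The two-step recipe, generating knowledge statements and then answering conditioned on them, can be sketched as follows. The prompt wording and the `lm_generate` callable are assumptions; the paper uses task-specific few-shot demonstrations.

```python
def generated_knowledge_answer(question, lm_generate, num_statements=3):
    """Generated knowledge prompting, sketched with a hypothetical
    `lm_generate(prompt) -> str` callable.
    """
    # Step 1: elicit knowledge statements from the model itself
    knowledge = [lm_generate(f"Generate some knowledge about: {question}")
                 for _ in range(num_statements)]
    # Step 2: condition an answer on each statement; the paper keeps the
    # highest-confidence prediction (confidence handling omitted here)
    prompts = [f"{k}\nQuestion: {question}\nAnswer:" for k in knowledge]
    return [lm_generate(p) for p in prompts]
```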
- KELM: Knowledge Enhanced Pre-Trained Language Representations with Message Passing on Hierarchical Relational Graphs [26.557447199727758]
We propose a novel knowledge-aware language model framework based on the fine-tuning process.
Our model can efficiently incorporate world knowledge from KGs into existing language models such as BERT.
arXiv Detail & Related papers (2021-09-09T12:39:17Z)
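A single message-passing step on a relational graph, of the kind the title refers to, might look like the sketch below; KELM's hierarchical design is more involved, and the additive relation composition here is an assumption.

```python
import torch

def relational_message_pass(node_embs, edges, rel_embs):
    """One illustrative message-passing step on a relational graph.

    node_embs: (N, d) entity states
    edges:     list of (src, rel_id, dst) index triples
    rel_embs:  (R, d) relation embeddings
    """
    messages = torch.zeros_like(node_embs)
    counts = torch.zeros(node_embs.size(0), 1)
    for src, rel, dst in edges:
        # message = neighbor state composed with the relation embedding
        messages[dst] += node_embs[src] + rel_embs[rel]
        counts[dst] += 1
    # mean-aggregated residual update of each node's state
    return node_embs + messages / counts.clamp(min=1)
```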
- Language Generation with Multi-Hop Reasoning on Commonsense Knowledge Graph [124.45799297285083]
We argue that exploiting both the structural and semantic information of the knowledge graph facilitates commonsense-aware text generation.
We propose Generation with Multi-Hop Reasoning Flow (GRF) that enables pre-trained models with dynamic multi-hop reasoning on multi-relational paths extracted from the external commonsense knowledge graph.
arXiv Detail & Related papers (2020-09-24T13:55:32Z)
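GRF's reliance on multi-relational paths can be illustrated by a small path extractor; this breadth-first enumeration is a stand-in for the paper's path construction over the external commonsense KG, and the data layout is assumed.

```python
from collections import deque

def multi_hop_paths(kg, sources, targets, max_hops=2):
    """Collect multi-relational paths linking source concepts to target
    concepts (illustrative stand-in for GRF's path extraction).

    kg: dict mapping entity -> list of (relation, neighbor)
    """
    targets, paths = set(targets), []
    for s in sources:
        queue = deque([(s, [])])             # (current node, path so far)
        while queue:
            node, path = queue.popleft()
            if node in targets and path:
                paths.append(path)           # found a source-to-target path
            if len(path) < max_hops:
                for rel, nxt in kg.get(node, ()):
                    queue.append((nxt, path + [(node, rel, nxt)]))
    return paths
```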
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks that learn over raw text with guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
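The entity masking scheme plausibly masks whole KG-linked entity mentions instead of random subwords; the sketch below illustrates that idea, with the masking rate and span format as assumptions.

```python
import random

def entity_masking(tokens, entity_spans, mask_token="[MASK]", p=0.5):
    """Mask whole entity mentions rather than random subwords, in the
    spirit of an entity masking scheme (parameters are illustrative).

    tokens:       list of tokens
    entity_spans: list of (start, end) index pairs for KG-linked entities
    """
    out = list(tokens)
    for start, end in entity_spans:
        if random.random() < p:              # mask the full entity span
            for i in range(start, end):
                out[i] = mask_token
    return out
```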
- On the Role of Conceptualization in Commonsense Knowledge Graph Construction [59.39512925793171]
Commonsense knowledge graphs (CKGs) like Atomic and ASER are substantially different from conventional KGs.
We introduce conceptualization into CKG construction methods, viewing entities mentioned in text as instances of specific concepts, or vice versa.
Our methods can effectively identify plausible triples and expand the KG with highly diverse and novel triples involving both new nodes and new edges.
arXiv Detail & Related papers (2020-03-06T14:35:20Z)
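Conceptualization, as described here, abstracts instance-level triples to the concept level so that new plausible triples can be instantiated; the sketch below assumes a Probase-style `isa` map, and all names are illustrative.

```python
def conceptualize_triples(triples, isa):
    """Abstract instance-level triples to concept-level ones.

    triples: list of (head, relation, tail)
    isa:     dict mapping an entity to its concepts, e.g.
             {"coffee": ["beverage"], "tea": ["beverage"]}
    """
    abstracted = set()
    for h, r, t in triples:
        for concept in isa.get(h, []):
            abstracted.add((concept, r, t))  # view head as a concept instance
    return abstracted
```

Re-instantiating an abstracted triple with a sibling instance, e.g. replacing "coffee" with "tea" under the concept "beverage", is what yields novel nodes and edges for the expanded KG.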