Knowledge Graph-Augmented Language Models for Knowledge-Grounded Dialogue Generation
- URL: http://arxiv.org/abs/2305.18846v1
- Date: Tue, 30 May 2023 08:36:45 GMT
- Title: Knowledge Graph-Augmented Language Models for Knowledge-Grounded Dialogue Generation
- Authors: Minki Kang, Jin Myung Kwak, Jinheon Baek, Sung Ju Hwang
- Abstract summary: We propose SUbgraph Retrieval-augmented GEneration (SURGE), a framework for generating context-relevant and knowledge-grounded dialogues with Knowledge Graphs (KGs).
Our framework first retrieves the relevant subgraph from the KG, and then enforces consistency across facts by perturbing their word embeddings conditioned on the retrieved subgraph.
We validate our SURGE framework on the OpendialKG and KOMODIS datasets, showing that it generates high-quality dialogues that faithfully reflect the knowledge from the KG.
- Score: 58.65698688443091
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Language models have achieved impressive performance on dialogue generation tasks. However, when generating responses for a conversation that requires factual knowledge, they are far from perfect, due to the absence of mechanisms to retrieve, encode, and reflect that knowledge in the generated responses. Some knowledge-grounded dialogue generation methods tackle this problem by leveraging facts from Knowledge Graphs (KGs); however, they do not guarantee that the model utilizes a relevant piece of knowledge from the KG. To overcome this limitation, we propose SUbgraph Retrieval-augmented GEneration (SURGE), a framework for generating context-relevant and knowledge-grounded dialogues with the KG. Specifically, our SURGE framework first retrieves the relevant subgraph from the KG, and then enforces consistency across facts by perturbing their word embeddings conditioned on the retrieved subgraph. Then, we utilize contrastive learning to ensure that the generated texts have high similarity to the retrieved subgraphs. We validate our SURGE framework on the OpendialKG and KOMODIS datasets, showing that it generates high-quality dialogues that faithfully reflect the knowledge from the KG.
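As a rough illustration of the pipeline the abstract describes (a sketch, not the authors' released code), the snippet below scores candidate KG triples against the dialogue context, keeps the top-k as the retrieved subgraph, and applies an InfoNCE-style contrastive term that pulls the response embedding toward the retrieved subgraph. All function names, shapes, and the choice of dot-product retrieval are assumptions.

```python
# Hedged sketch of a SURGE-style pipeline; names and shapes are illustrative.
import torch
import torch.nn.functional as F

def retrieve_subgraph(ctx_emb: torch.Tensor, triple_embs: torch.Tensor, k: int = 4):
    """ctx_emb: (d,); triple_embs: (n, d) -> indices of the k best triples."""
    scores = triple_embs @ ctx_emb          # dot-product relevance to the context
    return scores.topk(min(k, len(triple_embs))).indices

def subgraph_contrastive_loss(resp_emb, pos_emb, neg_embs, tau: float = 0.1):
    """resp_emb, pos_emb: (d,); neg_embs: (m, d). Positive sits at index 0."""
    pos = F.cosine_similarity(resp_emb, pos_emb, dim=0).unsqueeze(0)
    neg = F.cosine_similarity(resp_emb.unsqueeze(0), neg_embs, dim=1)
    logits = torch.cat([pos, neg]) / tau
    return F.cross_entropy(logits.unsqueeze(0), torch.zeros(1, dtype=torch.long))
```

In the paper the retriever and generator are trained jointly; they are decoupled here purely for readability.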
Related papers
- Improving the Robustness of Knowledge-Grounded Dialogue via Contrastive Learning [71.8876256714229] (2024-01-09)
We propose an entity-based contrastive learning framework for improving the robustness of knowledge-grounded dialogue systems.
Our method achieves new state-of-the-art performance in terms of automatic evaluation scores.
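A minimal sketch of how entity-based negatives for such a contrastive objective might be built (the helper and its inputs are illustrative, not the paper's code):

```python
# Hedged sketch: swap the gold entity for a distractor to create a hard
# negative that stays fluent but becomes factually wrong.
import random

def entity_swap_negative(response: str, gold_entity: str,
                         distractors: list[str]) -> str:
    return response.replace(gold_entity, random.choice(distractors))

# Example negative for contrastive training.
neg = entity_swap_negative("The film was directed by Bong Joon-ho.",
                           "Bong Joon-ho",
                           ["Park Chan-wook", "Hirokazu Kore-eda"])
```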
- Knowledge Graphs and Pre-trained Language Models enhanced Representation Learning for Conversational Recommender Systems [58.561904356651276] (2023-12-18)
We introduce the Knowledge-Enhanced Entity Representation Learning (KERL) framework, which uses a knowledge graph together with a pre-trained language model to improve the semantic understanding of entities in conversational recommender systems.
KERL achieves state-of-the-art results in both recommendation and response generation tasks.
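As a generic sketch of combining KG entity vectors with pre-trained language model vectors (a common gated-fusion pattern; KERL's exact architecture may differ, and all names are assumptions):

```python
# Hedged sketch: gated fusion of a KG entity embedding into the PLM
# embedding of the entity's mention.
import torch
import torch.nn as nn

class GatedEntityFusion(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, text_emb: torch.Tensor, kg_emb: torch.Tensor):
        # Learn how much KG signal to mix into each mention.
        g = torch.sigmoid(self.gate(torch.cat([text_emb, kg_emb], dim=-1)))
        return g * kg_emb + (1 - g) * text_emb
```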
- Variational Reasoning over Incomplete Knowledge Graphs for Conversational Recommendation [48.70062671767362] (2022-12-22)
We propose the Variational Reasoning over Incomplete KGs Conversational Recommender (VRICR).
Our key idea is to incorporate the large dialogue corpora that naturally accompany CRSs to enhance the incomplete KGs.
We also treat the dialogue-specific subgraphs of the KGs as latent variables with categorical priors, so the knowledge graphs adapt to each dialogue.
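One standard way to realize "subgraphs as latent variables with categorical priors" is a Gumbel-softmax relaxation, sketched below; VRICR's actual estimator may differ, and the function name is an assumption.

```python
# Hedged sketch: sample a candidate triple as a categorical latent
# variable, differentiably, via the straight-through Gumbel-softmax.
import torch
import torch.nn.functional as F

def sample_latent_subgraph(triple_logits: torch.Tensor, tau: float = 0.5):
    """triple_logits: (n,) unnormalized scores over candidate triples.
    Returns a one-hot selection that still passes gradients."""
    return F.gumbel_softmax(triple_logits, tau=tau, hard=True)
```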
- RHO ($\rho$): Reducing Hallucination in Open-domain Dialogues with Knowledge Grounding [57.46495388734495] (2022-12-03)
This paper presents RHO ($\rho$), which utilizes the representations of linked entities and relation predicates from a knowledge graph (KG).
We propose (1) local knowledge grounding to combine textual embeddings with the corresponding KG embeddings; and (2) global knowledge grounding to equip RHO with multi-hop reasoning abilities via the attention mechanism.
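A minimal sketch of the "global" grounding idea, aggregating KG node embeddings by attention from the dialogue state (generic scaled dot-product attention, not RHO's exact formulation; names are assumptions):

```python
# Hedged sketch: attention-pool KG node embeddings into one summary vector.
import torch
import torch.nn.functional as F

def attend_kg(query: torch.Tensor, node_embs: torch.Tensor) -> torch.Tensor:
    """query: (d,) dialogue state; node_embs: (n, d) -> (d,) KG summary."""
    attn = F.softmax(node_embs @ query / query.shape[-1] ** 0.5, dim=0)
    return attn @ node_embs
```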
- KELM: Knowledge Enhanced Pre-Trained Language Representations with Message Passing on Hierarchical Relational Graphs [26.557447199727758] (2021-09-09)
We propose a novel knowledge-aware language model framework based on the fine-tuning process.
Our model can efficiently incorporate world knowledge from KGs into existing language models such as BERT.
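As an illustration of relation-typed message passing (an R-GCN-style step; KELM's hierarchical variant adds structure not shown here, and the edge-list format is an assumption):

```python
# Hedged sketch: one step of relation-typed message passing over
# (src, rel, dst) index triples.
import torch
import torch.nn as nn

class RelationalMessagePassing(nn.Module):
    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.rel_w = nn.Parameter(torch.randn(num_relations, dim, dim) * 0.02)

    def forward(self, h: torch.Tensor, edges: list[tuple[int, int, int]]):
        out = h.clone()
        for src, rel, dst in edges:
            out[dst] = out[dst] + h[src] @ self.rel_w[rel]  # typed message
        return torch.relu(out)
```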
- Grounding Dialogue Systems via Knowledge Graph Aware Decoding with Pre-trained Transformers [3.477557431978457] (2021-03-30)
Knowledge Graphs can potentially help a dialogue system produce knowledge-grounded responses.
This paper proposes a novel architecture for integrating KGs into the response generation process by training a BERT model.
The k-hop subgraph of the KG is incorporated into the model during training and inference using the graph Laplacian.
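For reference, the symmetric normalized graph Laplacian of a subgraph's adjacency matrix can be computed as below; this is the standard definition, while how the paper injects it into BERT is not shown here.

```python
# Hedged sketch: L = I - D^{-1/2} A D^{-1/2} for a symmetric adjacency A.
import numpy as np

def normalized_laplacian(adj: np.ndarray) -> np.ndarray:
    """adj: (n, n) symmetric adjacency -> normalized Laplacian (n, n)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg, dtype=float)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5   # isolated nodes contribute zero
    return np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
```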
- KG-BART: Knowledge Graph-Augmented BART for Generative Commonsense Reasoning [78.81080813406177] (2020-09-26)
We propose KG-BART, a novel knowledge graph-augmented pre-trained language generation model.
KG-BART encompasses the complex relations of concepts through the knowledge graph and produces more logical and natural sentences as output.
- Knowledge-graph based Proactive Dialogue Generation with Improved Meta-Learning [0.0] (2020-04-19)
We propose a knowledge graph based proactive dialogue generation model (KgDg) with three components.
For knowledge triplet embedding and selection, we formulate the task as a sentence embedding problem to better capture semantic information.
Our improved MAML algorithm is capable of learning general features from a limited number of knowledge graphs.
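A hedged sketch of the meta-learning loop such a model might use, here plain first-order MAML over (support, query) splits, one task per knowledge graph; KgDg's "improved MAML" adds refinements not shown, and all names are assumptions.

```python
# Hedged sketch: first-order MAML outer-loop gradients.
import torch

def fomaml_grads(params, tasks, loss_fn, inner_lr: float = 1e-2):
    """params: list of tensors with requires_grad=True.
    tasks: list of (support, query) batches, one per knowledge graph.
    Returns outer-loop gradients averaged over tasks."""
    outer = [torch.zeros_like(p) for p in params]
    for support, query in tasks:
        # Inner loop: one adaptation step on the support set.
        g = torch.autograd.grad(loss_fn(params, support), params)
        fast = [p - inner_lr * gi for p, gi in zip(params, g)]
        # Outer loop: evaluate the adapted weights on the query set.
        gq = torch.autograd.grad(loss_fn(fast, query), fast)
        outer = [o + gi / len(tasks) for o, gi in zip(outer, gq)]
    return outer
```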