Grounding Dialogue Systems via Knowledge Graph Aware Decoding with
Pre-trained Transformers
- URL: http://arxiv.org/abs/2103.16289v1
- Date: Tue, 30 Mar 2021 12:36:00 GMT
- Title: Grounding Dialogue Systems via Knowledge Graph Aware Decoding with
Pre-trained Transformers
- Authors: Debanjan Chaudhuri, Md Rashad Al Hasan Rony, Jens Lehmann
- Abstract summary: Knowledge Graphs can potentially facilitate a dialogue system to produce knowledge grounded responses.
This paper proposes a novel architecture for integrating KGs into the response generation process by training a BERT model.
The k-hop subgraph of the KG is incorporated into the model during training and inference using the Graph Laplacian.
- Score: 3.477557431978457
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generating knowledge grounded responses in both goal and non-goal oriented
dialogue systems is an important research challenge. Knowledge Graphs (KG) can
be viewed as an abstraction of the real world, which can potentially facilitate
a dialogue system to produce knowledge grounded responses. However, integrating
KGs into the dialogue generation process in an end-to-end manner is a
non-trivial task. This paper proposes a novel architecture for integrating KGs
into the response generation process by training a BERT model that learns to
answer using the elements of the KG (entities and relations) in a multi-task,
end-to-end setting. The k-hop subgraph of the KG is incorporated into the model
during training and inference using the Graph Laplacian. Empirical evaluation
suggests that the model achieves better knowledge groundedness (measured via
Entity F1 score) compared to other state-of-the-art models for both goal and
non-goal oriented dialogues.
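To make the core idea concrete, here is a minimal sketch (assuming a toy adjacency matrix, random entity embeddings, and a simple Laplacian-smoothing step; none of the names or choices below are the authors' exact formulation) of extracting a k-hop subgraph and folding its normalized graph Laplacian into the entity representations that a decoder would consume:

```python
# Illustrative sketch only; not the paper's actual implementation.
import numpy as np

def khop_subgraph(adj, seed_nodes, k):
    """Return the set of nodes reachable from seed_nodes within k hops."""
    nodes = set(seed_nodes)
    frontier = set(seed_nodes)
    for _ in range(k):
        nxt = {j for i in frontier for j in np.nonzero(adj[i])[0]}
        frontier = nxt - nodes
        nodes |= nxt
    return sorted(nodes)

def normalized_laplacian(adj):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    return np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt

# Toy KG adjacency (symmetric) and entity embeddings (e.g., from BERT).
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
emb = np.random.randn(4, 8)                      # 4 entities, 8-dim embeddings

sub = khop_subgraph(adj, seed_nodes=[0], k=2)    # entities near the dialogue context
L = normalized_laplacian(adj[np.ix_(sub, sub)])  # Laplacian of the k-hop subgraph

# One illustrative option: Laplacian smoothing, so each entity vector
# reflects its k-hop neighbourhood before being fed to the decoder.
alpha = 0.5
structured_emb = (np.eye(len(sub)) - alpha * L) @ emb[sub]
print(structured_emb.shape)
```

The smoothing coefficient and the way the smoothed vectors are consumed downstream are stand-ins; the point is only that the subgraph structure enters the model through a Laplacian operator rather than through raw triples.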
Related papers
- Improving the Robustness of Knowledge-Grounded Dialogue via Contrastive Learning [71.8876256714229]
We propose an entity-based contrastive learning framework for improving the robustness of knowledge-grounded dialogue systems.
Our method achieves new state-of-the-art performance in terms of automatic evaluation scores.
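A generic sketch of an entity-based contrastive objective (InfoNCE-style; the sampling scheme, temperature, and vector names are illustrative assumptions, not the paper's exact loss):

```python
# Illustrative sketch only; not the paper's actual loss.
import numpy as np

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE loss for one anchor entity representation.
    anchor, positive: (d,) vectors; negatives: (n, d) matrix."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, neg) for neg in negatives]) / temperature
    logits -= logits.max()                       # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])                     # the positive sits at index 0

rng = np.random.default_rng(0)
anchor = rng.normal(size=16)                     # gold entity representation in context
positive = anchor + 0.1 * rng.normal(size=16)    # perturbed/paraphrased mention
negatives = rng.normal(size=(5, 16))             # sampled distractor entities
print(float(info_nce(anchor, positive, negatives)))
```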
arXiv Detail & Related papers (2024-01-09T05:16:52Z) - Knowledge Graph-Augmented Language Models for Knowledge-Grounded
Dialogue Generation [58.65698688443091]
We propose SUbgraph Retrieval-augmented GEneration (SURGE), a framework for generating context-relevant and knowledge-grounded dialogues with Knowledge Graphs (KGs).
Our framework first retrieves the relevant subgraph from the KG, and then enforces consistency across facts by perturbing their word embeddings conditioned by the retrieved subgraph.
We validate our SURGE framework on OpendialKG and KOMODIS datasets, showing that it generates high-quality dialogues that faithfully reflect the knowledge from KG.
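A minimal sketch of the subgraph-retrieval step as generic similarity ranking of triples against an encoded dialogue context (the encoders, toy triples, and top-k cutoff are assumptions, not SURGE's actual retriever):

```python
# Illustrative sketch only; not SURGE's actual retriever.
import numpy as np

def retrieve_subgraph(context_vec, triple_vecs, triples, top_k=3):
    """Rank candidate triples by similarity to the dialogue context
    and return the highest-scoring ones as the retrieved subgraph."""
    sims = triple_vecs @ context_vec / (
        np.linalg.norm(triple_vecs, axis=1) * np.linalg.norm(context_vec) + 1e-12)
    best = np.argsort(-sims)[:top_k]
    return [triples[i] for i in best]

rng = np.random.default_rng(1)
triples = [("Inception", "directed_by", "Christopher Nolan"),
           ("Inception", "genre", "sci-fi"),
           ("Titanic", "directed_by", "James Cameron"),
           ("Titanic", "release_year", "1997")]
context_vec = rng.normal(size=32)                    # encoded dialogue history
triple_vecs = rng.normal(size=(len(triples), 32))    # encoded candidate triples
print(retrieve_subgraph(context_vec, triple_vecs, triples, top_k=2))
```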
arXiv Detail & Related papers (2023-05-30T08:36:45Z) - PK-Chat: Pointer Network Guided Knowledge Driven Generative Dialogue
Model [79.64376762489164]
PK-Chat is a Pointer network guided generative dialogue model, incorporating a unified pretrained language model and a pointer network over knowledge graphs.
The words generated by PK-Chat in a dialogue are drawn either from a predicted word list or directly from the external knowledge graph.
Based on the PK-Chat, a dialogue system is built for academic scenarios in the case of geosciences.
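A hedged sketch of a pointer-style mixture between a word-list distribution and a distribution over KG entity labels (the gating value p_gen and the toy vocabulary are assumptions, not PK-Chat's exact parameterization):

```python
# Illustrative sketch only; not PK-Chat's actual decoder.
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def mix_vocab_and_kg(vocab_logits, kg_logits, p_gen, vocab, kg_entities):
    """Pointer-style mixture: with probability p_gen emit from the word list,
    otherwise copy an entity label from the knowledge graph."""
    p_vocab = p_gen * softmax(vocab_logits)
    p_kg = (1.0 - p_gen) * softmax(kg_logits)
    tokens = list(vocab) + list(kg_entities)
    probs = np.concatenate([p_vocab, p_kg])
    return tokens[int(np.argmax(probs))], probs

vocab = ["the", "movie", "was", "directed", "by"]
kg_entities = ["Christopher Nolan", "Inception"]
rng = np.random.default_rng(2)
token, _ = mix_vocab_and_kg(rng.normal(size=len(vocab)),
                            rng.normal(size=len(kg_entities)),
                            p_gen=0.3, vocab=vocab, kg_entities=kg_entities)
print(token)
```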
arXiv Detail & Related papers (2023-04-02T18:23:13Z) - RHO ($\rho$): Reducing Hallucination in Open-domain Dialogues with
Knowledge Grounding [57.46495388734495]
This paper presents RHO ($\rho$), which utilizes the representations of linked entities and relation predicates from a knowledge graph (KG).
We propose (1) local knowledge grounding to combine textual embeddings with the corresponding KG embeddings; and (2) global knowledge grounding to equip RHO with multi-hop reasoning abilities via the attention mechanism.
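A rough sketch of the two grounding ideas under simple assumptions: local grounding as a learned fusion of a token embedding with its linked KG embedding, and global grounding as attention over multi-hop neighbour embeddings (function names and dimensions are illustrative, not RHO's implementation):

```python
# Illustrative sketch only; not RHO's actual architecture.
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def local_grounding(token_emb, kg_emb, W):
    """Fuse a token embedding with the KG embedding of its linked entity/relation."""
    return np.tanh(W @ np.concatenate([token_emb, kg_emb]))

def global_grounding(query, neighbour_embs):
    """Attend over multi-hop neighbour embeddings to gather graph context."""
    scores = softmax(neighbour_embs @ query / np.sqrt(len(query)))
    return scores @ neighbour_embs

rng = np.random.default_rng(3)
d = 8
token_emb = rng.normal(size=d)           # contextual embedding of an entity mention
entity_emb = rng.normal(size=d)          # its pretrained KG embedding
W = rng.normal(size=(d, 2 * d))
fused = local_grounding(token_emb, entity_emb, W)
neighbours = rng.normal(size=(6, d))     # embeddings of 1- and 2-hop neighbours
context = global_grounding(fused, neighbours)
print(fused.shape, context.shape)
```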
arXiv Detail & Related papers (2022-12-03T10:36:34Z) - Building Knowledge-Grounded Dialogue Systems with Graph-Based Semantic Modeling [43.0554223015728]
The knowledge-grounded dialogue task aims to generate responses that convey information from given knowledge documents.
We propose a novel graph structure, Grounded Graph, that models the semantic structure of both dialogue and knowledge.
We also propose a Grounded Graph Aware Transformer to enhance knowledge-grounded response generation.
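One common way to make a Transformer layer "graph aware" is to mask self-attention to graph neighbours; the sketch below illustrates only that generic idea and is not the paper's Grounded Graph Aware Transformer:

```python
# Illustrative sketch only; a generic graph-masked attention layer.
import numpy as np

def graph_masked_attention(H, adj):
    """Self-attention in which each node attends only to itself and its
    graph neighbours."""
    d = H.shape[1]
    scores = H @ H.T / np.sqrt(d)
    mask = adj + np.eye(len(adj))               # allow self-loops
    scores = np.where(mask > 0, scores, -1e9)   # block non-edges
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ H

rng = np.random.default_rng(4)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 0],
                [0, 1, 0, 0]], dtype=float)     # toy grounded graph
H = rng.normal(size=(4, 8))                     # node states (dialogue + knowledge nodes)
print(graph_masked_attention(H, adj).shape)
```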
arXiv Detail & Related papers (2022-04-27T03:31:46Z) - Towards Large-Scale Interpretable Knowledge Graph Reasoning for Dialogue
Systems [109.16553492049441]
We propose a novel method to incorporate the knowledge reasoning capability into dialogue systems in a more scalable and generalizable manner.
To the best of our knowledge, this is the first work to have transformer models generate responses by reasoning over differentiable knowledge graphs.
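Differentiable KG reasoning is often sketched as propagating a soft distribution over entities through relation-specific adjacency matrices; the toy example below shows that generic formulation, not the paper's actual model:

```python
# Illustrative sketch only; generic soft relation-following over a toy KG.
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def soft_hop(entity_dist, relation_adjs, relation_logits):
    """One differentiable reasoning step: mix relation-specific transitions
    by predicted relation weights and propagate the entity distribution."""
    rel_weights = softmax(relation_logits)                # which relation to follow
    transition = sum(w * A for w, A in zip(rel_weights, relation_adjs))
    nxt = entity_dist @ transition
    return nxt / max(nxt.sum(), 1e-12)                    # renormalise

# Toy KG with 4 entities and 2 relations, each as an adjacency matrix.
A_directed_by = np.array([[0, 1, 0, 0]] + [[0] * 4] * 3, dtype=float)
A_genre       = np.array([[0, 0, 1, 0]] + [[0] * 4] * 3, dtype=float)
start = np.array([1.0, 0.0, 0.0, 0.0])                    # attention starts on entity 0
rng = np.random.default_rng(5)
print(soft_hop(start, [A_directed_by, A_genre], relation_logits=rng.normal(size=2)))
```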
arXiv Detail & Related papers (2022-03-20T17:51:49Z) - KELM: Knowledge Enhanced Pre-Trained Language Representations with
Message Passing on Hierarchical Relational Graphs [26.557447199727758]
We propose a novel knowledge-aware language model framework based on a fine-tuning process.
Our model can efficiently incorporate world knowledge from KGs into existing language models such as BERT.
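A generic relational message-passing (R-GCN-style) layer gives the flavour of injecting KG structure into token/entity states; the weights, mean aggregation, and toy graph below are assumptions, not KELM's exact hierarchical architecture:

```python
# Illustrative sketch only; a generic relational message-passing layer.
import numpy as np

def relational_message_passing(H, rel_adjs, rel_weights, self_weight):
    """One R-GCN-style layer: aggregate neighbour messages per relation type,
    transform them with relation-specific weights, and add a self term."""
    out = H @ self_weight
    for A, W in zip(rel_adjs, rel_weights):
        deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)  # mean aggregation
        out += (A @ H / deg) @ W
    return np.tanh(out)

rng = np.random.default_rng(6)
n, d = 5, 8
H = rng.normal(size=(n, d))                                  # entity/token node states
rel_adjs = [rng.integers(0, 2, size=(n, n)).astype(float) for _ in range(2)]
rel_weights = [rng.normal(size=(d, d)) * 0.1 for _ in range(2)]
self_weight = np.eye(d)
print(relational_message_passing(H, rel_adjs, rel_weights, self_weight).shape)
```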
arXiv Detail & Related papers (2021-09-09T12:39:17Z) - GRADE: Automatic Graph-Enhanced Coherence Metric for Evaluating
Open-Domain Dialogue Systems [133.13117064357425]
We propose a new evaluation metric GRADE, which stands for Graph-enhanced Representations for Automatic Dialogue Evaluation.
Specifically, GRADE incorporates both coarse-grained utterance-level contextualized representations and fine-grained topic-level graph representations to evaluate dialogue coherence.
Experimental results show that our GRADE significantly outperforms other state-of-the-art metrics on measuring diverse dialogue models.
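A minimal illustration of combining a coarse utterance-level similarity with a fine-grained topic-graph similarity into one coherence score (the equal weighting and cosine features are assumptions, not GRADE's learned metric):

```python
# Illustrative sketch only; not GRADE's actual metric.
import numpy as np

def coherence_score(utt_ctx, utt_resp, topic_ctx, topic_resp, w=0.5):
    """Blend utterance-level and topic-graph-level similarity into [0, 1]."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    utterance_part = (cos(utt_ctx, utt_resp) + 1) / 2
    topic_part = (cos(topic_ctx, topic_resp) + 1) / 2
    return w * utterance_part + (1 - w) * topic_part

rng = np.random.default_rng(7)
print(coherence_score(rng.normal(size=16), rng.normal(size=16),
                      rng.normal(size=8), rng.normal(size=8)))
```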
arXiv Detail & Related papers (2020-10-08T14:07:32Z) - GraphDialog: Integrating Graph Knowledge into End-to-End Task-Oriented
Dialogue Systems [9.560436630775762]
End-to-end task-oriented dialogue systems aim to generate system responses directly from plain text inputs, which raises two challenges.
One is how to effectively incorporate external knowledge bases (KBs) into the learning framework; the other is how to accurately capture the semantics of the dialogue history.
We address these two challenges by exploiting the graph structural information in the knowledge base and in the dependency parsing tree of the dialogue.
arXiv Detail & Related papers (2020-10-04T00:04:40Z) - Incorporating Joint Embeddings into Goal-Oriented Dialogues with
Multi-Task Learning [8.662586355051014]
We propose an RNN-based end-to-end encoder-decoder architecture which is trained with joint embeddings of the knowledge graph and the corpus as input.
The model provides an additional integration of user intent along with text generation, trained with a multi-task learning paradigm.
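A small sketch of the multi-task objective as a weighted sum of a response-generation cross-entropy and an intent-classification cross-entropy (the weighting lam and toy shapes are assumptions, not the paper's exact setup):

```python
# Illustrative sketch only; a generic multi-task training objective.
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def multitask_loss(decoder_logits, target_ids, intent_logits, intent_id, lam=0.5):
    """Token-level cross-entropy for response generation plus cross-entropy
    for user-intent classification, weighted by lam."""
    gen_loss = 0.0
    for logits, tgt in zip(decoder_logits, target_ids):
        gen_loss += -np.log(softmax(logits)[tgt] + 1e-12)
    gen_loss /= len(target_ids)
    intent_loss = -np.log(softmax(intent_logits)[intent_id] + 1e-12)
    return gen_loss + lam * intent_loss

rng = np.random.default_rng(8)
vocab_size, num_intents, steps = 20, 4, 3
decoder_logits = rng.normal(size=(steps, vocab_size))  # one row per decoding step
target_ids = rng.integers(0, vocab_size, size=steps)
print(multitask_loss(decoder_logits, target_ids,
                     rng.normal(size=num_intents), intent_id=1))
```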
arXiv Detail & Related papers (2020-01-28T17:15:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.