RHO ($\rho$): Reducing Hallucination in Open-domain Dialogues with
Knowledge Grounding
- URL: http://arxiv.org/abs/2212.01588v2
- Date: Fri, 12 May 2023 04:52:23 GMT
- Authors: Ziwei Ji, Zihan Liu, Nayeon Lee, Tiezheng Yu, Bryan Wilie, Min Zeng,
Pascale Fung
- Abstract summary: This paper presents RHO ($\rho$), which utilizes the representations of linked entities and relation predicates from a knowledge graph (KG).
We propose (1) local knowledge grounding to combine textual embeddings with the corresponding KG embeddings; and (2) global knowledge grounding to equip RHO with multi-hop reasoning abilities via the attention mechanism.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dialogue systems can leverage large pre-trained language models and knowledge
to generate fluent and informative responses. However, these models are still
prone to producing hallucinated responses unsupported by the input source,
which greatly hinders their application. The heterogeneity between external
knowledge and dialogue context challenges representation learning and source
integration, and further contributes to unfaithfulness. To handle this
challenge and generate more faithful responses, this paper presents RHO
($\rho$) utilizing the representations of linked entities and relation
predicates from a knowledge graph (KG). We propose (1) local knowledge
grounding to combine textual embeddings with the corresponding KG embeddings;
and (2) global knowledge grounding to equip RHO with multi-hop reasoning
abilities via the attention mechanism. In addition, we devise a response
re-ranking technique based on walks over KG sub-graphs for better
conversational reasoning. Experimental results on OpenDialKG show that our
approach significantly outperforms state-of-the-art methods on both automatic
and human evaluation by a large margin, especially in hallucination reduction
(17.54% in FeQA).
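The local knowledge grounding described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: names such as `combine_local` and the toy embedding tables are hypothetical, and the combination is shown as simple vector addition of a token's textual embedding with the KG embedding of its linked entity.

```python
# Hedged sketch of "local knowledge grounding": each token embedding is
# combined with the KG embedding of its linked entity or relation, so the
# encoder sees both textual and graph signals. All names are illustrative.

def combine_local(token_embs, links, kg_embs):
    """Add the linked KG embedding to each token embedding.

    token_embs: list of vectors, one per token
    links:      list of KG ids (or None) from entity linking, one per token
    kg_embs:    dict mapping KG id -> embedding vector
    """
    out = []
    for vec, kid in zip(token_embs, links):
        if kid is not None and kid in kg_embs:
            # Token is linked to a KG node: fuse text and KG embeddings.
            out.append([t + k for t, k in zip(vec, kg_embs[kid])])
        else:
            # Unlinked token: keep the textual embedding unchanged.
            out.append(list(vec))
    return out

tokens = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]   # textual embeddings
links = [None, "Q_entity", None]                # entity-linking result
kg = {"Q_entity": [0.25, -0.25]}                # toy KG embedding table
print(combine_local(tokens, links, kg))
# -> [[1.0, 0.0], [0.75, 0.25], [0.0, 1.0]]
```

Only the linked token changes; the global grounding and re-ranking components of RHO operate on top of such fused representations.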
Related papers
- Improving the Robustness of Knowledge-Grounded Dialogue via Contrastive
Learning [71.8876256714229]
We propose an entity-based contrastive learning framework for improving the robustness of knowledge-grounded dialogue systems.
Our method achieves new state-of-the-art performance in terms of automatic evaluation scores.
arXiv Detail & Related papers (2024-01-09T05:16:52Z)
- PICK: Polished & Informed Candidate Scoring for Knowledge-Grounded
Dialogue Systems [59.1250765143521]
Current knowledge-grounded dialogue systems often fail to align the generated responses with human-preferred qualities.
We propose Polished & Informed Candidate Scoring (PICK), a generation re-scoring framework.
We demonstrate the effectiveness of PICK in generating responses that are more faithful while keeping them relevant to the dialogue history.
arXiv Detail & Related papers (2023-09-19T08:27:09Z)
- A Bipartite Graph is All We Need for Enhancing Emotional Reasoning with
Commonsense Knowledge [16.410940528107115]
We propose a Bipartite Heterogeneous Graph (BHG) method for enhancing emotional reasoning with commonsense knowledge.
BHG-based knowledge infusion can be directly generalized to multi-type and multi-grained knowledge sources.
arXiv Detail & Related papers (2023-08-09T09:09:17Z)
- Knowledge Graph-Augmented Language Models for Knowledge-Grounded
Dialogue Generation [58.65698688443091]
We propose SUbgraph Retrieval-augmented GEneration (SURGE), a framework for generating context-relevant and knowledge-grounded dialogues with Knowledge Graphs (KGs).
Our framework first retrieves the relevant subgraph from the KG, and then enforces consistency across facts by perturbing their word embeddings conditioned by the retrieved subgraph.
We validate our SURGE framework on OpendialKG and KOMODIS datasets, showing that it generates high-quality dialogues that faithfully reflect the knowledge from KG.
arXiv Detail & Related papers (2023-05-30T08:36:45Z)
- ComFact: A Benchmark for Linking Contextual Commonsense Knowledge [31.19689856957576]
We propose the new task of commonsense fact linking, where models are given contexts and trained to identify situationally-relevant commonsense knowledge from KGs.
Our novel benchmark, ComFact, contains 293k in-context relevance annotations for commonsense across four stylistically diverse datasets.
arXiv Detail & Related papers (2022-10-23T09:30:39Z)
- Knowledge Graph Augmented Network Towards Multiview Representation
Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- Neural Path Hunter: Reducing Hallucination in Dialogue Systems via Path
Grounding [15.62141731259161]
We focus on the task of improving the faithfulness of Neural Dialogue Systems to known facts supplied by a Knowledge Graph (KG).
We propose Neural Path Hunter which follows a generate-then-refine strategy whereby a generated response is amended using the k-hop subgraph of a KG.
Our proposed model can easily be applied to any generated dialogue response without retraining the model.
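The generate-then-refine idea can be illustrated with a faithfulness check against a k-hop KG subgraph. This is a hedged sketch under stated assumptions, not the Neural Path Hunter code: the function names and toy triples are hypothetical, and only the checking step (not the amendment step) is shown.

```python
# Hedged sketch: a generated response is considered faithful only if every
# entity transition it mentions exists as an edge in the k-hop subgraph of
# the KG around the dialogue's seed entities. All names are illustrative.

def k_hop_subgraph(edges, seeds, k):
    """Collect (head, relation, tail) triples reachable within k hops."""
    nodes, sub = set(seeds), set()
    for _ in range(k):
        frontier = {(h, r, t) for (h, r, t) in edges if h in nodes}
        sub |= frontier
        nodes |= {t for (_, _, t) in frontier}
    return sub

def faithful(response_path, sub):
    """True if each consecutive entity pair is linked in the subgraph."""
    pairs = zip(response_path, response_path[1:])
    return all(any(h == a and t == b for (h, _, t) in sub) for a, b in pairs)

kg = [("Inception", "directed_by", "Nolan"),
      ("Nolan", "born_in", "London")]
sub = k_hop_subgraph(kg, {"Inception"}, k=2)
print(faithful(["Inception", "Nolan", "London"], sub))  # -> True
print(faithful(["Inception", "London"], sub))           # -> False
```

In the refine stage, an entity whose transition fails this check would be replaced by a candidate drawn from the same subgraph.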
arXiv Detail & Related papers (2021-04-17T05:23:44Z)
- Retrieval Augmentation Reduces Hallucination in Conversation [49.35235945543833]
We explore the use of neural-retrieval-in-the-loop architectures for knowledge-grounded dialogue.
We show that our best models obtain state-of-the-art performance on two knowledge-grounded conversational tasks.
arXiv Detail & Related papers (2021-04-15T16:24:43Z)
- Grounding Dialogue Systems via Knowledge Graph Aware Decoding with
Pre-trained Transformers [3.477557431978457]
Knowledge Graphs can potentially facilitate a dialogue system to produce knowledge grounded responses.
This paper proposes a novel architecture for integrating KGs into the response generation process by training a BERT model.
The k-hop subgraph of the KG is incorporated into the model during training and inference using a Graph Laplacian.
arXiv Detail & Related papers (2021-03-30T12:36:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.