Response-Anticipated Memory for On-Demand Knowledge Integration in
Response Generation
- URL: http://arxiv.org/abs/2005.06128v1
- Date: Wed, 13 May 2020 03:09:58 GMT
- Title: Response-Anticipated Memory for On-Demand Knowledge Integration in
Response Generation
- Authors: Zhiliang Tian, Wei Bi, Dongkyu Lee, Lanqing Xue, Yiping Song,
Xiaojiang Liu, Nevin L. Zhang
- Abstract summary: We propose to create a document memory with anticipated responses in mind.
This is achieved using a teacher-student framework.
Our model outperforms the previous state-of-the-art for the CbR task.
- Score: 45.53516539705227
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural conversation models are known to generate appropriate but
non-informative responses in general. A scenario where informativeness can be
significantly enhanced is Conversing by Reading (CbR), where conversations take
place with respect to a given external document. In previous work, the external
document is utilized by (1) creating a context-aware document memory that
integrates information from the document and the conversational context, and
then (2) generating responses referring to the memory. In this paper, we
propose to create the document memory with some anticipated responses in mind.
This is achieved using a teacher-student framework. The teacher is given the
external document, the context, and the ground-truth response, and learns how
to build a response-aware document memory from three sources of information.
The student learns to construct a response-anticipated document memory from the
first two sources, and the teacher's insight on memory creation. Empirical
results show that our model outperforms the previous state-of-the-art for the
CbR task.
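The teacher-student setup described in the abstract can be illustrated with a toy sketch: the teacher's attention over document tokens is response-aware because it sees the ground-truth response, and the student is trained to imitate that attention from the document and context alone. The code below is an assumption-laden illustration (bag-of-words overlap as the relevance signal, mean squared error as the distillation loss), not the paper's actual model.

```python
import math

def softmax(scores):
    mx = max(scores)
    exps = [math.exp(s - mx) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def overlap(token, tokens):
    # toy relevance signal: 1.0 if the document token appears in the other text
    return 1.0 if token in tokens else 0.0

def teacher_weights(doc, context, response):
    # the teacher sees the ground-truth response, so its memory is response-aware
    scores = [overlap(t, context) + 2.0 * overlap(t, response) for t in doc]
    return softmax(scores)

def student_weights(doc, context, learned_bias):
    # the student sees only the document and context and must anticipate the response
    scores = [overlap(t, context) + learned_bias.get(t, 0.0) for t in doc]
    return softmax(scores)

def distillation_loss(student, teacher):
    # the student is trained to imitate the teacher's memory-creation "insight"
    return sum((s - t) ** 2 for s, t in zip(student, teacher)) / len(student)

doc = ["paris", "is", "the", "capital", "and", "largest", "city", "of", "france"]
context = ["what", "is", "the", "capital", "of", "france"]
response = ["the", "capital", "of", "france", "is", "paris"]

t_w = teacher_weights(doc, context, response)
s_w = student_weights(doc, context, learned_bias={})
print("distillation loss before training:", round(distillation_loss(s_w, t_w), 4))
```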
Related papers
- Memorizing Documents with Guidance in Large Language Models [21.919661430250798]
We propose document-wise memory architecture to track document memories in training.
We show that the proposed methods provide different memory entries for documents and high recall of document-related content in generation with trained document-wise memories.
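The summary only names the idea; a hypothetical sketch of "different memory entries per document" is a table keyed by document id, updated during training and queried at generation time. The structure and update rule below are assumptions, not the paper's method.

```python
from collections import Counter, defaultdict

# hypothetical per-document memory table: one content profile per document id
doc_memory = defaultdict(Counter)

def memorize(doc_id, text):
    # training-time update: accumulate content statistics for this document
    doc_memory[doc_id].update(text.lower().split())

def recall(doc_id, top_k=3):
    # generation-time lookup: return the most salient content for this document
    return [w for w, _ in doc_memory[doc_id].most_common(top_k)]

memorize("doc-1", "The Nile is the longest river in Africa")
memorize("doc-1", "The Nile flows north")
print(recall("doc-1"))   # -> ['the', 'nile', 'is'] for this toy profile
```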
arXiv Detail & Related papers (2024-06-23T03:12:03Z)
- Awakening Augmented Generation: Learning to Awaken Internal Knowledge of Large Language Models for Question Answering [30.409828862670764]
A novel knowledge-augmented framework, Awakening-Augmented-Generation (AAG), is proposed.
Explicit awakening fine-tunes a context generator to create a synthetic, compressed document that functions as symbolic context.
Implicit awakening utilizes a hypernetwork to generate adapters based on the question and synthetic document, which are inserted into Large Language Models.
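A rough sketch of the two branches follows; the generator and the "hypernetwork" here are stand-ins (a canned text function and a toy linear map), not the paper's architecture.

```python
def explicit_awakening(question, generate):
    # explicit awakening: a fine-tuned context generator writes a short synthetic,
    # compressed document that serves as symbolic context for the question
    return generate(f"Write a short background passage that helps answer: {question}")

def implicit_awakening(question_feats, doc_feats, hyper_weights):
    # implicit awakening (toy version): a "hypernetwork" maps question + synthetic-document
    # features to adapter parameters that would be inserted into the language model
    feats = question_feats + doc_feats
    return [sum(w * f for w, f in zip(row, feats)) for row in hyper_weights]

# stand-in generator and features, just to make the sketch runnable
fake_generate = lambda prompt: "Sunlight scatters off air molecules, and blue light scatters the most."
question_feats, doc_feats = [0.2, 0.7], [0.5, 0.1]
hyper_weights = [[0.1, 0.0, 0.3, 0.2], [0.0, 0.4, 0.1, 0.1]]

synthetic_doc = explicit_awakening("Why is the sky blue?", fake_generate)
adapter_params = implicit_awakening(question_feats, doc_feats, hyper_weights)
print(synthetic_doc, adapter_params)
```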
arXiv Detail & Related papers (2024-03-22T15:06:45Z)
- UniMC: A Unified Framework for Long-Term Memory Conversation via Relevance Representation Learning [15.313416157905685]
We propose a Unified framework for Long-term Memory Conversations (UniMC).
We decompose the main task into three subtasks based on probability graphs.
Each subtask involves learning a representation for calculating the relevance between the query and memory.
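The shared ingredient of the subtasks is a relevance score between the query and stored memories; a generic cosine-similarity version (an illustration, not the paper's learned representation) looks like this:

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_memories(query_vec, memory_vecs):
    # score every stored memory against the current query and rank by relevance
    scored = sorted(enumerate(memory_vecs), key=lambda im: cosine(query_vec, im[1]), reverse=True)
    return [(i, round(cosine(query_vec, m), 3)) for i, m in scored]

query = [0.9, 0.1, 0.0]
memories = [[0.8, 0.2, 0.0], [0.0, 0.1, 0.9]]
print(rank_memories(query, memories))   # the first memory should rank highest
```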
arXiv Detail & Related papers (2023-06-18T12:30:50Z)
- Lift Yourself Up: Retrieval-augmented Text Generation with Self Memory [72.36736686941671]
We propose a novel framework, selfmem, for improving retrieval-augmented generation models.
Selfmem iteratively employs a retrieval-augmented generator to create an unbounded memory pool and uses a memory selector to choose one output as memory for the subsequent generation round.
We evaluate the effectiveness of selfmem on three distinct text generation tasks.
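The iterative generate-and-select loop can be sketched as follows; the generator and selector below are stand-in lambdas, whereas the real selfmem components are trained models.

```python
def selfmem_loop(source, generate, select, rounds=3):
    memory, pool = None, []
    for _ in range(rounds):
        candidates = generate(source, memory)   # retrieval-augmented generator
        pool.extend(candidates)                 # candidates accumulate into an unbounded pool
        memory = select(source, pool)           # memory selector picks the memory for the next round
    return memory, pool

# stand-in generator/selector so the sketch runs end to end
generate = lambda src, mem: [f"{src} | mem={mem} | cand{k}" for k in range(2)]
select = lambda src, pool: max(pool, key=len)   # toy selector: longest candidate so far
final_memory, pool = selfmem_loop("translate this sentence", generate, select)
print(final_memory, len(pool))
```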
arXiv Detail & Related papers (2023-05-03T21:40:54Z)
- Recitation-Augmented Language Models [85.30591349383849]
We show that recitation-augmented generation (RECITE) is a powerful paradigm for knowledge-intensive NLP tasks.
Specifically, we show that by utilizing recitation as the intermediate step, a recite-and-answer scheme can achieve new state-of-the-art performance.
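A minimal recite-and-answer prompt chain looks like the sketch below; the prompts and the canned model are illustrative, and the paper's exact prompting and sampling strategy differ.

```python
def recite_and_answer(question, generate):
    # step 1: the model recites a relevant passage from its own parametric memory
    recitation = generate(f"Recite a passage you know that is relevant to: {question}")
    # step 2: the model answers the question conditioned on its own recitation
    return generate(f"Passage: {recitation}\nQuestion: {question}\nAnswer:")

# stand-in model so the sketch runs
canned = {"Recite": "Hamlet is a tragedy written by William Shakespeare.",
          "Passage": "William Shakespeare"}
generate = lambda prompt: next(v for k, v in canned.items() if prompt.startswith(k))
print(recite_and_answer("Who wrote Hamlet?", generate))
```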
arXiv Detail & Related papers (2022-10-04T00:49:20Z)
- Generate rather than Retrieve: Large Language Models are Strong Context Generators [74.87021992611672]
We present a novel perspective for solving knowledge-intensive tasks by replacing document retrievers with large language model generators.
We call our method generate-then-read (GenRead), which first prompts a large language model to generate contextual documents based on a given question, and then reads the generated documents to produce the final answer.
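Generate-then-read can be sketched as sampling several model-written contextual documents and then reading them jointly; the function names and the reader below are placeholders, not the paper's implementation.

```python
def generate_then_read(question, generate, read, num_docs=3):
    # step 1: prompt the LLM to write several contextual documents for the question
    docs = [generate(f"Generate a background document to answer: {question}")
            for _ in range(num_docs)]
    # step 2: a reader produces the final answer from the generated documents
    return read(question, docs)

# placeholders so the sketch runs
generate = lambda prompt: "Mars is often called the Red Planet because of its reddish appearance."
read = lambda q, docs: "Mars" if any("Mars" in d for d in docs) else "unknown"
print(generate_then_read("Which planet is known as the Red Planet?", generate, read))
```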
arXiv Detail & Related papers (2022-09-21T01:30:59Z)
- Layout-Aware Information Extraction for Document-Grounded Dialogue: Dataset, Method and Demonstration [75.47708732473586]
We propose a layout-aware document-level Information Extraction dataset, LIE, to facilitate the study of extracting both structural and semantic knowledge from visually rich documents.
LIE contains 62k annotations of three extraction tasks from 4,061 pages in product and official documents.
Empirical results show that layout is critical for extraction from visually rich documents, and the system demonstration also verifies that the extracted knowledge can help locate the answers that users care about.
arXiv Detail & Related papers (2022-07-14T07:59:45Z)
- Open-domain Dialogue Generation Grounded with Dynamic Multi-form Knowledge Fusion [9.45662259790057]
This paper presents a new dialogue generation model, the Dynamic Multi-form Knowledge Fusion based Open-domain Chatting Machine (DMKCM).
DMKCM applies an indexed text collection (a virtual knowledge base) to locate relevant documents as the first hop, and then expands the dialogue content and the first-hop documents with a commonsense knowledge graph to obtain apposite triples as the second hop.
Experimental results indicate the effectiveness of our method in terms of dialogue coherence and informativeness.
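The two-hop expansion can be sketched with stand-in lookup functions; the index and the commonsense graph below are toy dictionaries, not the paper's resources.

```python
def two_hop_knowledge(dialogue, index_lookup, kg_lookup):
    # 1st hop: use the indexed text (virtual knowledge base) to locate relevant documents
    first_hop_docs = index_lookup(dialogue)
    # 2nd hop: expand the dialogue and the 1st-hop content with commonsense triples
    seed_words = set(dialogue.lower().split()) | {w for d in first_hop_docs for w in d.lower().split()}
    second_hop_triples = [t for w in seed_words for t in kg_lookup(w)]
    return first_hop_docs, second_hop_triples

# toy index and commonsense graph so the sketch runs
index = {"coffee": ["Coffee is a brewed drink made from roasted beans."]}
kg = {"coffee": [("coffee", "CapableOf", "keep you awake")],
      "beans": [("beans", "PartOf", "coffee plant")]}
index_lookup = lambda text: [d for k, docs in index.items() if k in text.lower() for d in docs]
kg_lookup = lambda word: kg.get(word.strip("."), [])

docs, triples = two_hop_knowledge("I need some coffee", index_lookup, kg_lookup)
print(docs, triples)
```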
arXiv Detail & Related papers (2022-04-24T10:32:48Z)
- Reasoning in Dialog: Improving Response Generation by Context Reading Comprehension [49.92173751203827]
In multi-turn dialog, utterances do not always take the full form of sentences.
We propose to improve the response generation performance by examining the model's ability to answer a reading comprehension question.
arXiv Detail & Related papers (2020-12-14T10:58:01Z)