A Survey on Retrieval-Augmented Text Generation
- URL: http://arxiv.org/abs/2202.01110v1
- Date: Wed, 2 Feb 2022 16:18:41 GMT
- Title: A Survey on Retrieval-Augmented Text Generation
- Authors: Huayang Li and Yixuan Su and Deng Cai and Yan Wang and Lemao Liu
- Abstract summary: Retrieval-augmented text generation has remarkable advantages and has achieved state-of-the-art performance in many NLP tasks.
It first highlights the generic paradigm of retrieval-augmented generation and then reviews notable approaches according to different tasks.
- Score: 53.04991859796971
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, retrieval-augmented text generation has attracted increasing attention
in the computational linguistics community. Compared with conventional
generation models, retrieval-augmented text generation has remarkable
advantages and, in particular, has achieved state-of-the-art performance on many
NLP tasks. This paper presents a survey of retrieval-augmented text
generation. It first highlights the generic paradigm of retrieval-augmented
generation and then reviews notable approaches for different tasks,
including dialogue response generation, machine translation, and other
generation tasks. Finally, it points out important directions that build on
recent methods to facilitate future research.
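To make the paradigm concrete, the following is a minimal sketch of retrieve-then-generate, assuming a toy character-level embedding, an in-memory corpus, and a placeholder generator. All names and components here are illustrative stand-ins for the real retrievers and generation models the survey covers, not any specific system's API.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy bag-of-characters embedding; real systems use a learned
    dense encoder or a sparse index such as BM25."""
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k corpus passages most similar to the query."""
    q = embed(query)
    return sorted(corpus, key=lambda d: -float(q @ embed(d)))[:k]

def generate(query: str, evidence: list[str]) -> str:
    """Placeholder generator: a real model would condition a
    sequence-to-sequence decoder on the query plus retrieved evidence."""
    return f"answer('{query}') grounded in: " + " | ".join(evidence)

corpus = [
    "Retrieval-augmented generation conditions a generator on retrieved text.",
    "Machine translation maps a source sentence to a target sentence.",
    "Dialogue systems produce responses to user utterances.",
]
query = "What is retrieval-augmented generation?"
print(generate(query, retrieve(query, corpus)))
```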
Related papers
- Analysis of Plan-based Retrieval for Grounded Text Generation [78.89478272104739]
Hallucinations occur when a language model is given a generation task outside its parametric knowledge.
A common strategy for addressing this limitation is to infuse language models with retrieval mechanisms.
We analyze how planning can be used to guide retrieval and further reduce the frequency of hallucinations (a toy sketch of this idea follows the entry).
arXiv Detail & Related papers (2024-08-20T02:19:35Z)
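As a rough illustration of the plan-guided retrieval idea above (a hypothetical sketch, not the paper's actual method), the snippet below decomposes a task into sub-queries, retrieves evidence for each step, and then generates once over all gathered evidence; `plan`, `retrieve`, and `generate` are stub placeholders.

```python
def plan(task: str) -> list[str]:
    # Hypothetical planner: decompose the task into sub-queries,
    # each answered by a separate retrieval call.
    return [f"background on: {task}", f"key facts for: {task}"]

def retrieve(query: str) -> list[str]:
    # Stub retriever; a real system would search a document index.
    return [f"<evidence for '{query}'>"]

def generate(task: str, evidence: list[str]) -> str:
    # Stub generator conditioned on the task and all gathered evidence,
    # so every claim can be grounded in a retrieved passage.
    return f"{task} -> grounded in {len(evidence)} passages"

def plan_guided_generate(task: str) -> str:
    evidence = []
    for step in plan(task):          # the plan decides *what* to retrieve
        evidence.extend(retrieve(step))
    return generate(task, evidence)

print(plan_guided_generate("Write a biography of Ada Lovelace"))
```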
- Enhancing Retrieval-Augmented Large Language Models with Iterative Retrieval-Generation Synergy [164.83371924650294]
We show that strong performance can be achieved by a method we call Iter-RetGen, which synergizes retrieval and generation in an iterative manner.
A model's output shows what might be needed to finish a task, and thus provides informative context for retrieving more relevant knowledge.
Iter-RetGen processes all retrieved knowledge as a whole and largely preserves flexibility in generation without imposing structural constraints (the loop is sketched after this entry).
arXiv Detail & Related papers (2023-05-24T16:17:36Z)
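The iterative synergy described in the Iter-RetGen entry above can be pictured as the loop below. This is a hedged sketch based only on the abstract: each draft output is folded into the next retrieval query, and generation is rerun over the fresh evidence. The stub `retrieve` and `generate` functions are assumptions, not the authors' implementation.

```python
def iter_retgen(query: str, retrieve, generate, iterations: int = 2) -> str:
    """Alternate retrieval and generation: the previous draft is appended
    to the query so the next retrieval can target what is still missing."""
    draft = ""
    for _ in range(iterations):
        evidence = retrieve(query if not draft else f"{query} {draft}")
        draft = generate(query, evidence)  # regenerate over fresh evidence
    return draft

# Stub components, for demonstration only.
def retrieve(q: str) -> list[str]:
    return [f"<passage relevant to '{q[:40]}'>"]

def generate(q: str, evidence: list[str]) -> str:
    return f"draft answer to '{q}' using {len(evidence)} passage(s)"

print(iter_retgen("Who proposed retrieval-augmented generation?",
                  retrieve, generate))
```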
- A survey on text generation using generative adversarial networks [0.0]
This work presents a thorough review of recent studies and advances in text generation using Generative Adversarial Networks.
Adversarial learning for text generation is promising because it offers alternative ways of generating so-called "natural" language.
arXiv Detail & Related papers (2022-12-20T17:54:08Z)
- Survey of Hallucination in Natural Language Generation [69.9926849848132]
Natural Language Generation (NLG) has improved dramatically in recent years thanks to the development of sequence-to-sequence deep learning technologies.
Deep-learning-based generation, however, is prone to hallucinating unintended text, which degrades system performance.
This survey aims to facilitate collaborative efforts among researchers in tackling the challenge of hallucinated text in NLG.
arXiv Detail & Related papers (2022-02-08T03:55:01Z)
- Pretrained Language Models for Text Generation: A Survey [46.03096493973206]
We present an overview of the major advances in pretrained language models (PLMs) for text generation.
We discuss how to adapt existing PLMs to model different input data and satisfy special properties in the generated text.
arXiv Detail & Related papers (2021-05-21T12:27:44Z)
- Positioning yourself in the maze of Neural Text Generation: A Task-Agnostic Survey [54.34370423151014]
This paper surveys the components of modeling approaches, relating task-specific impacts across generation tasks such as storytelling, summarization, and translation.
We present an abstraction of the essential techniques with respect to learning paradigms, pretraining, modeling approaches, and decoding, together with the key challenges outstanding in each.
arXiv Detail & Related papers (2020-10-14T17:54:42Z)
- Unsupervised Text Generation by Learning from Search [86.51619839836331]
TGLS is a novel framework for unsupervised text generation by learning from search.
We demonstrate the effectiveness of TGLS on two real-world natural language generation tasks: paraphrase generation and text formalization.
arXiv Detail & Related papers (2020-07-09T04:34:48Z)
- QURIOUS: Question Generation Pretraining for Text Generation [13.595014409069584]
We propose question generation as a pretraining method that aligns better with text generation objectives.
Text generation models pretrained with this method are better at understanding the essence of the input and are better language models for the target task (a toy sketch of the objective follows the entry).
arXiv Detail & Related papers (2020-04-23T08:41:52Z)
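A toy sketch of such a question-generation pretraining objective follows, assuming (passage, question) pairs as the training signal; `MockSeq2Seq` and its pretend loss are placeholders for a real encoder-decoder model and its negative log-likelihood, and none of the names come from the paper.

```python
class MockSeq2Seq:
    """Stand-in for an encoder-decoder model; only the interface matters."""
    def nll(self, source: str, target: str) -> float:
        # Pretend loss; a real model returns -log p(question | passage).
        return abs(len(source) - len(target)) / 100.0
    def update(self, loss: float) -> None:
        pass  # a real trainer would backpropagate and step an optimizer

# Pretraining signal: (passage, question) pairs, so the model must read
# the input closely enough to ask a sensible question about it.
pairs = [
    ("The survey covers dialogue, translation, and other generation tasks.",
     "Which tasks does the survey cover?"),
    ("Retrieval-augmented generation conditions output on retrieved text.",
     "What does retrieval-augmented generation condition its output on?"),
]

model = MockSeq2Seq()
for step in range(4):  # a real run would take many thousands of steps
    passage, question = pairs[step % len(pairs)]
    model.update(model.nll(passage, question))
```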