A survey on text generation using generative adversarial networks
- URL: http://arxiv.org/abs/2212.11119v1
- Date: Tue, 20 Dec 2022 17:54:08 GMT
- Title: A survey on text generation using generative adversarial networks
- Authors: Gustavo Henrique de Rosa, João Paulo Papa
- Abstract summary: This work presents a thorough review concerning recent studies and text generation advancements using Generative Adversarial Networks.
The usage of adversarial learning for text generation is promising as it provides alternatives to generate the so-called "natural" language.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work presents a thorough review concerning recent studies and text
generation advancements using Generative Adversarial Networks. The usage of
adversarial learning for text generation is promising as it provides
alternatives to generate the so-called "natural" language. Nevertheless,
adversarial text generation is not a simple task, as its foremost architecture,
the Generative Adversarial Network, was designed to cope with continuous
information (images) rather than discrete data (text). Thus, most works are based
on three possible options, i.e., Gumbel-Softmax differentiation, Reinforcement
Learning, and modified training objectives. All alternatives are reviewed in
this survey as they present the most recent approaches for generating text
using adversarial-based techniques. The selected works were taken from renowned
databases, such as Science Direct, IEEE Xplore, Springer, Association for
Computing Machinery, and arXiv, and each selected work has been critically
analyzed and assessed to present its objective, methodology, and experimental
results.
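The Gumbel-Softmax differentiation mentioned above replaces non-differentiable categorical sampling with a temperature-controlled softmax over Gumbel-perturbed logits, which is what lets gradients flow from a discriminator back into a text generator. A minimal plain-Python sketch of the relaxation (not code from the survey; the function name and logits are illustrative):

```python
import math
import random

def gumbel_softmax(logits, temperature=1.0, rng=random):
    """Sample a relaxed one-hot vector from a categorical distribution
    via the Gumbel-Softmax trick: softmax((logits + Gumbel noise) / tau)."""
    # Gumbel(0, 1) noise: g = -log(-log(u)), u ~ Uniform(0, 1)
    gumbels = [-math.log(-math.log(rng.random())) for _ in logits]
    scaled = [(l + g) / temperature for l, g in zip(logits, gumbels)]
    # Numerically stable softmax over the perturbed, scaled logits
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]
```

As the temperature approaches zero the output approaches a one-hot token choice, while at higher temperatures it stays smooth and differentiable, which is the trade-off these GAN-based text generators tune.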
Related papers
- DeTeCtive: Detecting AI-generated Text via Multi-Level Contrastive Learning [24.99797253885887]
We argue that the key to accomplishing this task lies in distinguishing writing styles of different authors.
We propose DeTeCtive, a multi-task auxiliary, multi-level contrastive learning framework.
Our method is compatible with a range of text encoders.
arXiv Detail & Related papers (2024-10-28T12:34:49Z) - CiteBench: A benchmark for Scientific Citation Text Generation [69.37571393032026]
CiteBench is a benchmark for citation text generation.
We make the code for CiteBench publicly available at https://github.com/UKPLab/citebench.
arXiv Detail & Related papers (2022-12-19T16:10:56Z) - On Decoding Strategies for Neural Text Generators [73.48162198041884]
We study the interaction between language generation tasks and decoding strategies.
We measure changes in attributes of generated text as a function of both decoding strategy and task.
Our results reveal both previously-observed and surprising findings.
arXiv Detail & Related papers (2022-03-29T16:25:30Z) - A Survey on Retrieval-Augmented Text Generation [53.04991859796971]
Retrieval-augmented text generation has remarkable advantages and has achieved state-of-the-art performance in many NLP tasks.
It firstly highlights the generic paradigm of retrieval-augmented generation, and then it reviews notable approaches according to different tasks.
arXiv Detail & Related papers (2022-02-02T16:18:41Z) - Positioning yourself in the maze of Neural Text Generation: A Task-Agnostic Survey [54.34370423151014]
This paper surveys the components of modeling approaches and how task requirements shape them across various generation tasks such as storytelling, summarization, and translation.
We present an abstraction of the imperative techniques with respect to learning paradigms, pretraining, modeling approaches, and decoding, along with the key challenges outstanding in each of them.
arXiv Detail & Related papers (2020-10-14T17:54:42Z) - A Survey of Knowledge-Enhanced Text Generation [81.24633231919137]
The goal of text generation is to make machines express in human language.
Various neural encoder-decoder models have been proposed to achieve the goal by learning to map input text to output text.
To address this issue, researchers have considered incorporating various forms of knowledge beyond the input text into the generation models.
arXiv Detail & Related papers (2020-10-09T06:46:46Z) - Unsupervised Text Generation by Learning from Search [86.51619839836331]
TGLS is a novel framework for unsupervised Text Generation by Learning from Search.
We demonstrate the effectiveness of TGLS on two real-world natural language generation tasks, paraphrase generation and text formalization.
arXiv Detail & Related papers (2020-07-09T04:34:48Z) - Efficient text generation of user-defined topic using generative adversarial networks [0.32228025627337864]
We propose a User-Defined GAN (UD-GAN) with two-level discriminators to generate texts on user-defined topics.
The proposed method is capable of generating texts with less time than others.
arXiv Detail & Related papers (2020-06-22T04:49:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.