Improving Adversarial Text Generation by Modeling the Distant Future
- URL: http://arxiv.org/abs/2005.01279v1
- Date: Mon, 4 May 2020 05:45:13 GMT
- Title: Improving Adversarial Text Generation by Modeling the Distant Future
- Authors: Ruiyi Zhang, Changyou Chen, Zhe Gan, Wenlin Wang, Dinghan Shen, Guoyin
Wang, Zheng Wen, Lawrence Carin
- Abstract summary: We consider a text planning scheme and present a model-based imitation-learning approach to alleviate the aforementioned issues.
We propose a novel guider network to focus on the generative process over a longer horizon, which can assist next-word prediction and provide intermediate rewards for generator optimization.
- Score: 155.83051741029732
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Auto-regressive text generation models usually focus on local fluency, and
may cause inconsistent semantic meaning in long text generation. Further,
automatically generating words with similar semantics is challenging, and
hand-crafted linguistic rules are difficult to apply. We consider a text
planning scheme and present a model-based imitation-learning approach to
alleviate the aforementioned issues. Specifically, we propose a novel guider
network to focus on the generative process over a longer horizon, which can
assist next-word prediction and provide intermediate rewards for generator
optimization. Extensive experiments demonstrate that the proposed method leads
to improved performance.
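The abstract describes the guider mechanism only at a high level. As a rough illustration, the sketch below shows one way a guider network could predict a future sequence feature and turn the match between prediction and outcome into an intermediate reward for the generator; the class names, feature encoder, horizon, and cosine-similarity reward are all assumptions made for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GuiderNetwork(nn.Module):
    """Hypothetical guider: from the feature of the partial sequence at step t,
    predict the feature expected several steps ahead."""
    def __init__(self, feature_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.rnn = nn.GRUCell(feature_dim, hidden_dim)
        self.project = nn.Linear(hidden_dim, feature_dim)

    def forward(self, current_feature: torch.Tensor, state: torch.Tensor):
        # current_feature: encoding of the words generated so far (assumed given)
        state = self.rnn(current_feature, state)
        predicted_future = self.project(state)  # guess of the feature c steps ahead
        return predicted_future, state

def intermediate_reward(predicted_future: torch.Tensor,
                        reached_feature: torch.Tensor) -> torch.Tensor:
    # Dense reward: how closely the feature actually reached at step t + c
    # matches what the guider predicted at step t.
    return F.cosine_similarity(predicted_future, reached_feature, dim=-1)
```

Such a dense, step-wise signal could stand in for the single end-of-sequence reward in a policy-gradient update of the generator, which is the role the abstract assigns to the guider.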
Related papers
- Enhancing Text Generation in Joint NLG/NLU Learning Through Curriculum Learning, Semi-Supervised Training, and Advanced Optimization Techniques [0.0]
This paper develops a novel approach to improving text generation in the context of joint Natural Language Generation (NLG) and Natural Language Understanding (NLU) learning.
The data is prepared by gathering and preprocessing annotated datasets, including cleaning, tokenization, stemming, and stop-word removal.
Transformer-based encoders and decoders capture long-range dependencies and improve source-target sequence modelling.
Reinforcement learning with policy gradient techniques, semi-supervised training, improved attention mechanisms, and differentiable approximations are employed to fine-tune the models and handle complex linguistic tasks effectively.
arXiv Detail & Related papers (2024-10-17T12:43:49Z)
- PLANNER: Generating Diversified Paragraph via Latent Language Diffusion Model [37.2192243883707]
We propose PLANNER, a model that combines latent semantic diffusion with autoregressive generation to generate fluent text.
Results on semantic generation, text completion and summarization show its effectiveness in generating high-quality long-form text.
arXiv Detail & Related papers (2023-06-05T01:36:39Z)
- PLANET: Dynamic Content Planning in Autoregressive Transformers for Long-form Text Generation [47.97523895218194]
We propose a novel generation framework that leverages an autoregressive self-attention mechanism to conduct content planning and surface realization dynamically.
Our framework enriches the Transformer decoder with latent representations to maintain sentence-level semantic plans grounded by bag-of-words.
arXiv Detail & Related papers (2022-03-17T05:52:35Z)
- Improving Text Auto-Completion with Next Phrase Prediction [9.385387026783103]
Our strategy includes a novel self-supervised training objective called Next Phrase Prediction (NPP)
Preliminary experiments have shown that our approach is able to outperform the baselines in auto-completion for email and academic writing domains.
arXiv Detail & Related papers (2021-09-15T04:26:15Z)
- Reinforced Generative Adversarial Network for Abstractive Text Summarization [7.507096634112164]
Sequence-to-sequence models provide a viable new approach to generative summarization.
These models have notable drawbacks: their grasp of the details of the original text is often inaccurate, and the text they generate often contains repetitions.
We propose a new architecture that combines reinforcement learning and adversarial generative networks to enhance the sequence-to-sequence attention model.
arXiv Detail & Related papers (2021-05-31T17:34:47Z)
- Long Text Generation by Modeling Sentence-Level and Discourse-Level Coherence [59.51720326054546]
We propose a long text generation model that represents the prefix sentences at both the sentence level and the discourse level during decoding.
Our model can generate more coherent texts than state-of-the-art baselines.
arXiv Detail & Related papers (2021-05-19T07:29:08Z)
- GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer [119.70961704127157]
Non-parallel text style transfer has attracted increasing research interest in recent years.
Current approaches still lack the ability to preserve the content and even logic of original sentences.
We propose Graph-Transformer based Auto-Encoders (GTAE), which model a sentence as a linguistic graph and perform feature extraction and style transfer at the graph level.
arXiv Detail & Related papers (2021-02-01T11:08:45Z)
- Controllable Text Simplification with Explicit Paraphrasing [88.02804405275785]
Text Simplification improves the readability of sentences through several rewriting transformations, such as lexical paraphrasing, deletion, and splitting.
Current simplification systems are predominantly sequence-to-sequence models that are trained end-to-end to perform all these operations simultaneously.
We propose a novel hybrid approach that leverages linguistically-motivated rules for splitting and deletion, and couples them with a neural paraphrasing model to produce varied rewriting styles.
arXiv Detail & Related papers (2020-10-21T13:44:40Z)
- Improving Text Generation with Student-Forcing Optimal Transport [122.11881937642401]
We propose using optimal transport (OT) to match the sequences generated in training and testing modes.
An extension is also proposed to improve the OT learning, based on the structural and contextual information of the text sequences.
The effectiveness of the proposed method is validated on machine translation, text summarization, and text generation tasks.
arXiv Detail & Related papers (2020-10-12T19:42:25Z)
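The Student-Forcing OT summary above is brief, so the following is a minimal, assumed sketch of an entropy-regularized (Sinkhorn) optimal-transport loss that matches the embeddings of a teacher-forced (training-mode) sequence against a free-running (test-mode) sequence; the function name, uniform marginals, and cosine cost are illustrative choices, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def sinkhorn_ot_loss(teacher_emb: torch.Tensor,
                     student_emb: torch.Tensor,
                     epsilon: float = 0.1,
                     n_iters: int = 50) -> torch.Tensor:
    """Entropy-regularized OT distance between two embedded token sequences.

    teacher_emb: (n, d) embeddings of the teacher-forced (training-mode) sequence.
    student_emb: (m, d) embeddings of the free-running (test-mode) sequence.
    """
    # Cost matrix: cosine distance between every pair of token embeddings.
    t = F.normalize(teacher_emb, dim=-1)
    s = F.normalize(student_emb, dim=-1)
    cost = 1.0 - t @ s.t()                      # (n, m)

    n, m = cost.shape
    mu = torch.full((n,), 1.0 / n)              # uniform mass on teacher tokens
    nu = torch.full((m,), 1.0 / m)              # uniform mass on student tokens

    K = torch.exp(-cost / epsilon)              # Gibbs kernel
    u = torch.ones_like(mu)
    for _ in range(n_iters):                    # Sinkhorn fixed-point iterations
        v = nu / (K.t() @ u)
        u = mu / (K @ v)
    plan = torch.diag(u) @ K @ torch.diag(v)    # approximate transport plan

    return (plan * cost).sum()                  # OT loss to combine with the MLE objective
```

In practice such a loss would be added to the usual maximum-likelihood objective so that sequences decoded without teacher forcing stay close to the reference in embedding space.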
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences arising from its use.