Zero-shot Sonnet Generation with Discourse-level Planning and Aesthetics Features
- URL: http://arxiv.org/abs/2205.01821v1
- Date: Tue, 3 May 2022 23:44:28 GMT
- Title: Zero-shot Sonnet Generation with Discourse-level Planning and Aesthetics Features
- Authors: Yufei Tian and Nanyun Peng
- Abstract summary: We present a novel framework to generate sonnets that does not require training on poems.
Specifically, a content planning module is trained on non-poetic texts to obtain discourse-level coherence.
We also design a constrained decoding algorithm to impose the meter-and-rhyme constraint of the generated sonnets.
- Score: 37.45490765899826
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Poetry generation, and creative language generation in general, usually
suffers from the lack of large training data. In this paper, we present a novel
framework to generate sonnets that does not require training on poems. We
design a hierarchical framework which plans the poem sketch before decoding.
Specifically, a content planning module is trained on non-poetic texts to
obtain discourse-level coherence; then a rhyme module generates rhyme words and
a polishing module introduces imagery and similes for aesthetic purposes.
Finally, we design a constrained decoding algorithm to impose the
meter-and-rhyme constraint of the generated sonnets. Automatic and human
evaluation show that our multi-stage approach without training on poem corpora
generates more coherent, poetic, and creative sonnets than several strong
baselines.
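The constrained decoding step described in the abstract can be pictured as a filter over candidate continuations: a partial line is kept only while its stress pattern is a prefix of the target meter, and line-final words are checked against the rhyme scheme. A minimal toy sketch of that idea is below; the stress dictionary, rhyme classes, and function names are illustrative assumptions, not the paper's actual algorithm or lexicon.

```python
# Toy stress dictionary: '0' = unstressed syllable, '1' = stressed.
# Entries are illustrative; a real system would use a phonetic lexicon
# such as the CMU Pronouncing Dictionary.
STRESS = {
    "the": "0", "night": "1", "a": "0", "light": "1",
    "gentle": "10", "shines": "1", "upon": "01",
}

# Toy rhyme classes; real systems compare phoneme sequences from the
# stressed vowel onward.
RHYME_CLASS = {"night": "AYT", "light": "AYT", "shines": "AYNZ"}

IAMBIC_PENTAMETER = "0101010101"  # five iambs

def stress_of(words):
    """Concatenate the stress patterns of a word sequence."""
    return "".join(STRESS[w] for w in words)

def can_extend(words, target=IAMBIC_PENTAMETER):
    """A partial line is still viable if its stress string is a prefix
    of the target meter; otherwise the decoder prunes it."""
    return target.startswith(stress_of(words))

def rhymes(a, b):
    """Two distinct words rhyme if they share a rhyme class."""
    return a != b and RHYME_CLASS.get(a) == RHYME_CLASS.get(b)
```

During beam search, `can_extend` would prune hypotheses that break the meter, and `rhymes` would restrict the final word of a line to the class fixed by the rhyme module.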
Related papers
- Let the Poem Hit the Rhythm: Using a Byte-Based Transformer for Beat-Aligned Poetry Generation [1.03590082373586]
This paper explores whether a byte-based language model can generate words that fit specific beat patterns within the context of poetry.
We develop a method to train a transformer model, ByT5, to align poems with beat patterns.
The results demonstrate a high level of beat alignment while maintaining semantic coherence.
arXiv Detail & Related papers (2024-06-14T16:54:48Z)
- PoetryDiffusion: Towards Joint Semantic and Metrical Manipulation in Poetry Generation [58.36105306993046]
Controllable text generation is a challenging and meaningful field in natural language generation (NLG).
In this paper, we pioneer the use of the Diffusion model for generating sonnets and Chinese SongCi poetry.
Our model outperforms existing models in automatic evaluation of semantic, metrical, and overall performance as well as human evaluation.
arXiv Detail & Related papers (2023-06-14T11:57:31Z)
- Unsupervised Melody-to-Lyric Generation [91.29447272400826]
We propose a method for generating high-quality lyrics without training on any aligned melody-lyric data.
We leverage the segmentation and rhythm alignment between melody and lyrics to compile the given melody into decoding constraints.
Our model can generate high-quality lyrics that are more on-topic, singable, intelligible, and coherent than strong baselines.
arXiv Detail & Related papers (2023-05-30T17:20:25Z)
- Unsupervised Melody-Guided Lyrics Generation [84.22469652275714]
We propose to generate pleasantly listenable lyrics without training on melody-lyric aligned data.
We leverage the crucial alignments between melody and lyrics and compile the given melody into constraints to guide the generation process.
arXiv Detail & Related papers (2023-05-12T20:57:20Z)
- PoeLM: A Meter- and Rhyme-Controllable Language Model for Unsupervised Poetry Generation [42.12348554537587]
Formal verse poetry imposes strict constraints on the meter and rhyme scheme of poems.
Most prior work on generating this type of poetry uses existing poems for supervision.
We propose an unsupervised approach to generate poems following any given meter and rhyme scheme.
arXiv Detail & Related papers (2022-05-24T17:09:55Z)
- CCPM: A Chinese Classical Poetry Matching Dataset [50.90794811956129]
We propose a novel task to assess a model's semantic understanding of poetry by poem matching.
This task requires the model to select one line of Chinese classical poetry among four candidates according to the modern Chinese translation of a line of poetry.
To construct this dataset, we first obtain a set of parallel data of Chinese classical poetry and modern Chinese translation.
arXiv Detail & Related papers (2021-06-03T16:49:03Z)
- Generate and Revise: Reinforcement Learning in Neural Poetry [17.128639251861784]
We propose a framework to generate poems that are repeatedly revisited and corrected, as humans do, in order to improve their overall quality.
Our model generates poems from scratch and it learns to progressively adjust the generated text in order to match a target criterion.
We evaluate this approach on matching a rhyming scheme, without any information about which words are responsible for creating rhymes or how to coherently alter the poem's words.
arXiv Detail & Related papers (2021-02-08T10:35:33Z)
- Acrostic Poem Generation [26.604889384391726]
We propose a new task in the area of computational creativity: acrostic poem generation in English.
Acrostic poems are poems that contain a hidden message; typically, the first letter of each line spells out a word or short phrase.
Our experiments show that the acrostic poems generated by our baseline are well received by humans and do not lose much quality due to the additional constraints.
arXiv Detail & Related papers (2020-10-05T18:00:15Z)
- Introducing Aspects of Creativity in Automatic Poetry Generation [2.792030485253753]
Poetry Generation involves teaching systems to automatically generate text that resembles poetic work.
A deep learning system can learn to generate poetry on its own by training on a corpus of poems and modeling the particular style of language.
We propose taking an approach that fine-tunes GPT-2, a pre-trained language model, to our downstream task of poetry generation.
arXiv Detail & Related papers (2020-02-06T20:44:12Z)
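Several of the papers above generate text under hard surface constraints. As a toy illustration of the simplest one, the acrostic property described in the Acrostic Poem Generation entry (the first letter of each line spelling a hidden message) can be verified in a few lines; this helper is a hypothetical sketch, not taken from any of the listed papers.

```python
def spells_acrostic(lines, message):
    """Check whether the first letters of the non-empty lines,
    read top to bottom, spell the hidden message (case-insensitive)."""
    first_letters = "".join(
        line.lstrip()[0].lower() for line in lines if line.strip()
    )
    return first_letters == message.lower()

poem = [
    "Sunlight falls on quiet stone,",
    "Under skies of fading blue,",
    "Night will claim them as its own.",
]
spells_acrostic(poem, "sun")  # → True
```

A generator would use the same check in reverse: fix the required first letter of each line before decoding it, rather than verifying after the fact.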
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.