PoeticTTS -- Controllable Poetry Reading for Literary Studies
- URL: http://arxiv.org/abs/2207.05549v1
- Date: Mon, 11 Jul 2022 13:15:27 GMT
- Title: PoeticTTS -- Controllable Poetry Reading for Literary Studies
- Authors: Julia Koch, Florian Lux, Nadja Schauffler, Toni Bernhart, Felix
Dieterle, Jonas Kuhn, Sandra Richter, Gabriel Viehhauser, Ngoc Thang Vu
- Abstract summary: We resynthesise poems by cloning prosodic values from a human reference recitation, and afterwards make use of fine-grained prosody control to manipulate the synthetic speech.
We find that finetuning our TTS model on poetry captures poetic intonation patterns to a large extent, which is beneficial for prosody cloning and manipulation.
- Score: 21.29478270833139
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Speech synthesis for poetry is challenging due to specific intonation
patterns inherent to poetic speech. In this work, we propose an approach to
synthesise poems with almost human-like naturalness in order to enable literary
scholars to systematically examine hypotheses on the interplay between text,
spoken realisation, and the listener's perception of poems. To meet these
special requirements for literary studies, we resynthesise poems by cloning
prosodic values from a human reference recitation, and afterwards make use of
fine-grained prosody control to manipulate the synthetic speech in a
human-in-the-loop setting to alter the recitation w.r.t. specific phenomena. We
find that finetuning our TTS model on poetry captures poetic intonation
patterns to a large extent, which is beneficial for prosody cloning and
manipulation, and we verify the success of our approach both in an objective
evaluation and in human studies.
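The clone-then-manipulate workflow described in the abstract can be illustrated with a minimal sketch. This assumes per-phone pitch, energy, and duration values have already been extracted from a reference recitation; the function names and the plain-list representation are hypothetical, not the paper's actual toolkit API:

```python
def clone_prosody(pitch, energy, durations):
    """Copy per-phone prosodic values from a human reference recitation,
    so the synthetic reading reproduces the reference's intonation."""
    return {
        "pitch": list(pitch),        # e.g. F0 in Hz per phone
        "energy": list(energy),      # relative loudness per phone
        "durations": list(durations) # phone lengths in seconds
    }

def manipulate(prosody, phone_indices, pitch_scale=1.0, duration_scale=1.0):
    """Human-in-the-loop edit: rescale pitch and duration for the
    selected phones, leaving the cloned values otherwise intact."""
    edited = {key: list(values) for key, values in prosody.items()}
    for i in phone_indices:
        edited["pitch"][i] *= pitch_scale
        edited["durations"][i] *= duration_scale
    return edited

# Clone prosody for a 5-phone stretch, then raise and lengthen the
# final two phones, e.g. to emphasise a line-final rhyme word.
ref = clone_prosody([120, 130, 125, 140, 110],
                    [0.8, 0.8, 0.8, 0.8, 0.8],
                    [0.09, 0.10, 0.08, 0.12, 0.20])
edited = manipulate(ref, [3, 4], pitch_scale=1.1, duration_scale=1.5)
```

The key design point is that cloning and manipulation are separate steps: the reference recitation fixes a realistic baseline, and the scholar then alters only the phones relevant to the phenomenon under study.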
Related papers
- A Computational Approach to Style in American Poetry [19.41186389974801]
We develop a method to assess the style of American poems and to visualize a collection of poems in relation to one another.
Qualitative poetry criticism helped guide our development of metrics that analyze various orthographic, syntactic, and phonemic features.
Our method has potential applications to academic research of texts, to research of the intuitive personal response to poetry, and to making recommendations to readers based on their favorite poems.
arXiv Detail & Related papers (2023-10-13T18:49:14Z)
- PoetryDiffusion: Towards Joint Semantic and Metrical Manipulation in Poetry Generation [58.36105306993046]
Controllable text generation is a challenging and meaningful field in natural language generation (NLG).
In this paper, we pioneer the use of the Diffusion model for generating sonnets and Chinese SongCi poetry.
Our model outperforms existing models in automatic evaluation of semantic, metrical, and overall performance as well as human evaluation.
arXiv Detail & Related papers (2023-06-14T11:57:31Z) - PoeLM: A Meter- and Rhyme-Controllable Language Model for Unsupervised
Poetry Generation [42.12348554537587]
Formal verse poetry imposes strict constraints on the meter and rhyme scheme of poems.
Most prior work on generating this type of poetry uses existing poems for supervision.
We propose an unsupervised approach to generate poems following any given meter and rhyme scheme.
arXiv Detail & Related papers (2022-05-24T17:09:55Z) - BACON: Deep-Learning Powered AI for Poetry Generation with Author
Linguistic Style Transfer [91.3755431537592]
This paper describes BACON, a prototype of an automatic poetry generator with author linguistic style transfer.
It combines concepts and techniques from finite state machinery, probabilistic models, artificial neural networks, and deep learning to write original poetry with rich aesthetic qualities in the style of any given author.
arXiv Detail & Related papers (2021-12-14T00:08:36Z) - Tortured phrases: A dubious writing style emerging in science. Evidence
of critical issues affecting established journals [69.76097138157816]
Probabilistic text generators have been used to produce fake scientific papers for more than a decade.
Complex AI-powered generation techniques produce texts indistinguishable from those written by humans.
Some websites offer to rewrite texts for free, generating gobbledegook full of tortured phrases.
arXiv Detail & Related papers (2021-07-12T20:47:08Z) - CCPM: A Chinese Classical Poetry Matching Dataset [50.90794811956129]
We propose a novel task to assess a model's semantic understanding of poetry by poem matching.
This task requires the model to select one line of Chinese classical poetry among four candidates according to the modern Chinese translation of a line of poetry.
To construct this dataset, we first obtain a set of parallel data of Chinese classical poetry and modern Chinese translation.
arXiv Detail & Related papers (2021-06-03T16:49:03Z) - Generate and Revise: Reinforcement Learning in Neural Poetry [17.128639251861784]
We propose a framework to generate poems that are repeatedly revisited and corrected, as humans do, in order to improve their overall quality.
Our model generates poems from scratch and it learns to progressively adjust the generated text in order to match a target criterion.
We evaluate this approach in the case of matching a rhyming scheme, without having any information on which words are responsible for creating rhymes or on how to coherently alter the poem's words.
arXiv Detail & Related papers (2021-02-08T10:35:33Z) - Acrostic Poem Generation [26.604889384391726]
We propose a new task in the area of computational creativity: acrostic poem generation in English.
Acrostic poems are poems that contain a hidden message; typically, the first letter of each line spells out a word or short phrase.
Our experiments show that the acrostic poems generated by our baseline are received well by humans and do not lose much quality due to the additional constraints.
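The first-letter constraint that defines an acrostic can be checked with a short sketch. The function name and the example poem are illustrative, not drawn from the paper:

```python
def is_acrostic(poem_lines, hidden_word):
    """Return True if the first letters of the non-empty lines
    spell out the hidden word (case-insensitive)."""
    initials = "".join(line.lstrip()[0] for line in poem_lines if line.strip())
    return initials.lower() == hidden_word.lower()

poem = [
    "Soft winds carry the evening light,",
    "Under branches bending low,",
    "Night gathers every fading glow.",
]
print(is_acrostic(poem, "sun"))  # True
print(is_acrostic(poem, "sky"))  # False
```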
arXiv Detail & Related papers (2020-10-05T18:00:15Z) - MixPoet: Diverse Poetry Generation via Learning Controllable Mixed
Latent Space [79.70053419040902]
We propose MixPoet, a novel model that absorbs multiple factors to create various styles and promote diversity.
Based on a semi-supervised variational autoencoder, our model disentangles the latent space into some subspaces, with each conditioned on one influence factor by adversarial training.
Experiment results on Chinese poetry demonstrate that MixPoet improves both diversity and quality against three state-of-the-art models.
arXiv Detail & Related papers (2020-03-13T03:31:29Z) - Introducing Aspects of Creativity in Automatic Poetry Generation [2.792030485253753]
Poetry Generation involves teaching systems to automatically generate text that resembles poetic work.
A deep learning system can learn to generate poetry on its own by training on a corpus of poems and modeling the particular style of language.
We propose taking an approach that fine-tunes GPT-2, a pre-trained language model, to our downstream task of poetry generation.
arXiv Detail & Related papers (2020-02-06T20:44:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.