Prose2Poem: The Blessing of Transformers in Translating Prose to Persian Poetry
- URL: http://arxiv.org/abs/2109.14934v2
- Date: Fri, 1 Oct 2021 07:04:49 GMT
- Title: Prose2Poem: The Blessing of Transformers in Translating Prose to Persian Poetry
- Authors: Reza Khanmohammadi, Mitra Sadat Mirshafiee, Yazdan Rezaee Jouryabi, Seyed Abolghasem Mirroshandel
- Abstract summary: We introduce a novel Neural Machine Translation (NMT) approach to translate prose to ancient Persian poetry.
We trained a Transformer model from scratch to obtain initial translations and pretrained different variations of BERT to obtain final translations.
- Score: 2.15242029196761
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Persian Poetry has consistently expressed its philosophy, wisdom, speech, and
rationale on the basis of its couplets, making it an enigmatic language on its
own to both native and non-native speakers. Nevertheless, the noticeable gap
between Persian prose and poem has left the two pieces of literature
medium-less. Having curated a parallel corpus of prose and their equivalent
poems, we introduce a novel Neural Machine Translation (NMT) approach to
translate prose to ancient Persian poetry using transformer-based Language
Models in an extremely low-resource setting. More specifically, we trained a
Transformer model from scratch to obtain initial translations and pretrained
different variations of BERT to obtain final translations. To address the
challenge of using masked language modelling under poeticness criteria, we
heuristically joined the two models and generated valid poems in terms of
automatic and human assessments. Final results demonstrate the eligibility and
creativity of our novel heuristically aided approach among Literature
professionals and non-professionals in generating novel Persian poems.
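The two-stage pipeline the abstract describes (a scratch-trained Transformer produces a draft translation, then a pretrained BERT-style masked language model refills tokens that violate the poeticness criteria) can be sketched roughly as follows. This is a minimal illustration only: `draft_translate`, `violates_poeticness`, and `refill_mask` are hypothetical stand-ins, not names from the paper, and the real system uses trained neural models rather than these toy rules.

```python
# Hedged sketch of the paper's heuristic joining of two models:
# stage 1 drafts a translation, stage 2 masks tokens that break the
# poetic constraints and refills them via masked language modelling.

MASK = "[MASK]"

def draft_translate(prose_tokens):
    # Stand-in for the scratch-trained Transformer: a real seq2seq
    # decode would happen here; this toy version echoes the input.
    return list(prose_tokens)

def violates_poeticness(token):
    # Toy criterion standing in for the paper's metre/rhyme checks:
    # tokens longer than 6 characters "break" the verse.
    return len(token) > 6

def refill_mask(tokens, lexicon):
    # Stand-in for the pretrained BERT masked LM: a real model would
    # rank candidate fillers; this toy version takes the first word.
    return [lexicon[0] if t == MASK else t for t in tokens]

def prose_to_poem(prose_tokens, lexicon):
    draft = draft_translate(prose_tokens)
    masked = [MASK if violates_poeticness(t) else t for t in draft]
    return refill_mask(masked, lexicon)

# Romanized placeholder tokens, purely illustrative:
print(prose_to_poem(["del", "asheghane", "mikhanad"], ["yar"]))
# → ['del', 'yar', 'yar']
```

The key design point this mirrors is that the two models never share parameters: the drafting model and the masked LM are joined purely through the mask-and-refill heuristic.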
Related papers
- (Perhaps) Beyond Human Translation: Harnessing Multi-Agent Collaboration for Translating Ultra-Long Literary Texts [52.18246881218829]
We introduce a novel multi-agent framework based on large language models (LLMs) for literary translation, implemented as a company called TransAgents.
To evaluate the effectiveness of our system, we propose two innovative evaluation strategies: Monolingual Human Preference (MHP) and Bilingual LLM Preference (BLP).
arXiv Detail & Related papers (2024-05-20T05:55:08Z)
- Vietnamese Poem Generation & The Prospect Of Cross-Language Poem-To-Poem Translation [0.0]
We propose using Large Language Models to generate Vietnamese poems from natural language prompts.
The GPT-3 Babbage variant achieves a custom evaluation score of 0.8, specifically tailored to the "luc bat" genre of Vietnamese poetry.
arXiv Detail & Related papers (2024-01-02T07:46:34Z)
- PoetryDiffusion: Towards Joint Semantic and Metrical Manipulation in Poetry Generation [58.36105306993046]
Controllable text generation is a challenging and meaningful field in natural language generation (NLG).
In this paper, we pioneer the use of the Diffusion model for generating sonnets and Chinese SongCi poetry.
Our model outperforms existing models in automatic evaluation of semantic, metrical, and overall performance as well as human evaluation.
arXiv Detail & Related papers (2023-06-14T11:57:31Z)
- PoeticTTS -- Controllable Poetry Reading for Literary Studies [21.29478270833139]
We resynthesise poems by cloning prosodic values from a human reference recitation, and afterwards make use of fine-grained prosody control to manipulate the synthetic speech.
We find that finetuning our TTS model on poetry captures poetic intonation patterns to a large extent which is beneficial for prosody cloning and manipulation.
arXiv Detail & Related papers (2022-07-11T13:15:27Z)
- BACON: Deep-Learning Powered AI for Poetry Generation with Author Linguistic Style Transfer [91.3755431537592]
This paper describes BACON, a prototype of an automatic poetry generator with author linguistic style transfer.
It combines concepts and techniques from finite state machinery, probabilistic models, artificial neural networks and deep learning, to write original poetry with rich aesthetic-qualities in the style of any given author.
arXiv Detail & Related papers (2021-12-14T00:08:36Z)
- Don't Go Far Off: An Empirical Study on Neural Poetry Translation [13.194404923699782]
We present an empirical investigation for poetry translation along several dimensions.
We contribute a parallel dataset of poetry translations for several language pairs.
Our results show that multilingual fine-tuning on poetic text significantly outperforms multilingual fine-tuning on non-poetic text that is 35X larger in size.
arXiv Detail & Related papers (2021-09-07T10:00:44Z)
- CCPM: A Chinese Classical Poetry Matching Dataset [50.90794811956129]
We propose a novel task to assess a model's semantic understanding of poetry by poem matching.
This task requires the model to select one line of Chinese classical poetry among four candidates according to the modern Chinese translation of a line of poetry.
To construct this dataset, we first obtain a set of parallel data of Chinese classical poetry and modern Chinese translation.
arXiv Detail & Related papers (2021-06-03T16:49:03Z)
- Acrostic Poem Generation [26.604889384391726]
We propose a new task in the area of computational creativity: acrostic poem generation in English.
Acrostic poems are poems that contain a hidden message; typically, the first letter of each line spells out a word or short phrase.
Our experiments show that the acrostic poems generated by our baseline are received well by humans and do not lose much quality due to the additional constraints.
arXiv Detail & Related papers (2020-10-05T18:00:15Z)
- Generating Major Types of Chinese Classical Poetry in a Uniformed Framework [88.57587722069239]
We propose a GPT-2 based framework for generating major types of Chinese classical poems.
Preliminary results show this enhanced model can generate Chinese classical poems of major types with high quality in both form and content.
arXiv Detail & Related papers (2020-03-13T14:16:25Z)
- MixPoet: Diverse Poetry Generation via Learning Controllable Mixed Latent Space [79.70053419040902]
We propose MixPoet, a novel model that absorbs multiple factors to create various styles and promote diversity.
Based on a semi-supervised variational autoencoder, our model disentangles the latent space into some subspaces, with each conditioned on one influence factor by adversarial training.
Experiment results on Chinese poetry demonstrate that MixPoet improves both diversity and quality against three state-of-the-art models.
arXiv Detail & Related papers (2020-03-13T03:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.