Creating a digital poet
- URL: http://arxiv.org/abs/2602.16578v1
- Date: Wed, 18 Feb 2026 16:25:10 GMT
- Title: Creating a digital poet
- Authors: Vered Tohar, Tsahi Hayat, Amir Leshem
- Abstract summary: We report a seven-month poetry workshop in which a large language model was shaped into a digital poet. The model developed a distinctive style and a coherent corpus, supported by quantitative and qualitative analyses. After the workshop, a commercial publisher released a poetry collection authored by the model.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Can a machine write good poetry? Any positive answer raises fundamental questions about the nature and value of art. We report a seven-month poetry workshop in which a large language model was shaped into a digital poet through iterative in-context expert feedback, without retraining. Across sessions, the model developed a distinctive style and a coherent corpus, supported by quantitative and qualitative analyses, and it produced a pen name and author image. In a blinded authorship test with 50 humanities students and graduates (three AI poems and three poems by well-known poets each), judgments were at chance: human poems were labeled human 54% of the time and AI poems 52%, with 95% confidence intervals including 50%. After the workshop, a commercial publisher released a poetry collection authored by the model. These results show that workshop-style prompting can support long-horizon creative shaping and renew debates on creativity and authorship.
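The abstract's chance-level claim can be sanity-checked with a standard normal-approximation (Wald) confidence interval for a proportion. A minimal sketch, assuming hypothetical counts of 150 judgments per category (50 raters × 3 poems each; the exact counts are not stated here, so these numbers are illustrative only):

```python
from math import sqrt

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = successes / n
    se = sqrt(p * (1 - p) / n)  # standard error of the sample proportion
    return p - z * se, p + z * se

# Illustrative: 81 of 150 human poems labeled "human" gives 54%.
lo, hi = wald_ci(81, 150)
print(f"95% CI: [{lo:.3f}, {hi:.3f}]")  # the interval contains 0.5
```

Because the interval straddles 0.5, the observed 54% is consistent with raters guessing at random, which is the sense in which the paper reports judgments "at chance."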
Related papers
- The author is dead, but what if they never lived? A reception experiment on Czech AI- and human-authored poetry [0.0]
We examine the perception of AI- and human-written Czech poetry. We ask if Czech native speakers are able to identify it and how they judge it.
arXiv Detail & Related papers (2025-11-26T17:53:59Z) - "It was 80% me, 20% AI": Seeking Authenticity in Co-Writing with Large Language Models [97.22914355737676]
We examine whether and how writers want to preserve their authentic voice when co-writing with AI tools.
Our findings illuminate conceptions of authenticity in human-AI co-creation.
Readers' responses showed less concern about human-AI co-writing.
arXiv Detail & Related papers (2024-11-20T04:42:32Z) - Sonnet or Not, Bot? Poetry Evaluation for Large Models and Datasets [3.0040661953201475]
Large language models (LLMs) can now generate and recognize poetry.
We develop a task to evaluate how well LLMs recognize one aspect of English-language poetry.
We show that state-of-the-art LLMs can successfully identify both common and uncommon fixed poetic forms.
arXiv Detail & Related papers (2024-06-27T05:36:53Z) - Identifying the style by a qualified reader on a short fragment of
generated poetry [0.0]
I used three character-based LSTM models to assess style reproduction.
All three models were trained on the corpus of texts by famous Russian-speaking poets.
The accuracy of style identification increases if the assessor can quote the poet by heart.
arXiv Detail & Related papers (2023-06-05T10:55:15Z) - Generation of Chinese classical poetry based on pre-trained model [1.6114012813668934]
This paper uses BART and other pre-trained models to generate metrical poetry. A set of AI-poetry Turing tests was developed and reviewed by a group of poets and poetry-writing researchers. The poetry-generation model studied by the authors produces works that cannot be distinguished from those of advanced scholars.
arXiv Detail & Related papers (2022-11-04T16:05:31Z) - PART: Pre-trained Authorship Representation Transformer [52.623051272843426]
Authors writing documents imprint identifying information within their texts. Previous works use hand-crafted features or classification tasks to train their authorship models. We propose a contrastively trained model fit to learn authorship embeddings instead of semantics.
arXiv Detail & Related papers (2022-09-30T11:08:39Z) - PoeLM: A Meter- and Rhyme-Controllable Language Model for Unsupervised Poetry Generation [42.12348554537587]
Formal verse poetry imposes strict constraints on the meter and rhyme scheme of poems.
Most prior work on generating this type of poetry uses existing poems for supervision.
We propose an unsupervised approach to generate poems following any given meter and rhyme scheme.
arXiv Detail & Related papers (2022-05-24T17:09:55Z) - BACON: Deep-Learning Powered AI for Poetry Generation with Author Linguistic Style Transfer [91.3755431537592]
This paper describes BACON, a prototype of an automatic poetry generator with author linguistic style transfer.
It combines concepts and techniques from finite state machinery, probabilistic models, artificial neural networks, and deep learning to write original poetry with rich aesthetic qualities in the style of any given author.
arXiv Detail & Related papers (2021-12-14T00:08:36Z) - Exploratory Data Analysis of Urdu Poetry [0.0]
This study explores the main features of Urdu ghazal that make it popular and admired more than other forms.
A detailed explanation is provided of the types of words used to express love, nature, birds, flowers, etc.
arXiv Detail & Related papers (2021-12-03T20:06:11Z) - CCPM: A Chinese Classical Poetry Matching Dataset [50.90794811956129]
We propose a novel task to assess a model's semantic understanding of poetry by poem matching.
This task requires the model to select one line of Chinese classical poetry among four candidates according to the modern Chinese translation of a line of poetry.
To construct this dataset, we first obtain a set of parallel data of Chinese classical poetry and modern Chinese translation.
arXiv Detail & Related papers (2021-06-03T16:49:03Z) - MixPoet: Diverse Poetry Generation via Learning Controllable Mixed Latent Space [79.70053419040902]
We propose MixPoet, a novel model that absorbs multiple factors to create various styles and promote diversity.
Based on a semi-supervised variational autoencoder, our model disentangles the latent space into some subspaces, with each conditioned on one influence factor by adversarial training.
Experiment results on Chinese poetry demonstrate that MixPoet improves both diversity and quality against three state-of-the-art models.
arXiv Detail & Related papers (2020-03-13T03:31:29Z) - Introducing Aspects of Creativity in Automatic Poetry Generation [2.792030485253753]
Poetry Generation involves teaching systems to automatically generate text that resembles poetic work.
A deep learning system can learn to generate poetry on its own by training on a corpus of poems and modeling the particular style of language.
We propose taking an approach that fine-tunes GPT-2, a pre-trained language model, to our downstream task of poetry generation.
arXiv Detail & Related papers (2020-02-06T20:44:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.