The Impact of Post-editing and Machine Translation on Creativity and
Reading Experience
- URL: http://arxiv.org/abs/2101.06125v1
- Date: Fri, 15 Jan 2021 14:11:11 GMT
- Title: The Impact of Post-editing and Machine Translation on Creativity and
Reading Experience
- Authors: Ana Guerberof Arenas and Antonio Toral
- Abstract summary: This article presents the results of a study involving the translation of a fictional story from English into Catalan in three modalities.
Each translation was analysed to evaluate its creativity.
A cohort of 88 Catalan participants read the story in a randomly assigned modality and completed a survey.
- Score: 0.9543667840503736
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This article presents the results of a study involving the translation of a
fictional story from English into Catalan in three modalities:
machine-translated (MT), post-edited (MTPE) and translated without aid (HT).
Each translation was analysed to evaluate its creativity. Subsequently, a
cohort of 88 Catalan participants read the story in a randomly assigned
modality and completed a survey. The results show that HT presented a higher
creativity score than MTPE and MT. HT also ranked higher in narrative
engagement and translation reception, while MTPE ranked marginally higher in
enjoyment. HT and MTPE show no statistically significant differences in any
category, whereas MT does in all variables tested. We conclude that creativity
is highest when professional translators intervene in the process, especially
when working without any aid. We hypothesize that creativity in translation
could be the factor that enhances reading engagement and the reception of
translated literary texts.
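To make the kind of group comparison described in the abstract concrete, the sketch below runs an omnibus test and Bonferroni-corrected pairwise follow-ups on invented reader ratings for the three modalities. The ratings, the 1-7 scale, and the choice of Kruskal-Wallis plus Mann-Whitney tests are illustrative assumptions, not the statistical procedure actually reported in the paper.

```python
# Hypothetical sketch: comparing reader ratings across the three translation
# modalities (HT, MTPE, MT). The scores below are invented; the paper's actual
# data, scales and statistical tests may differ.
from itertools import combinations

from scipy.stats import kruskal, mannwhitneyu

# Narrative-engagement ratings (1-7 Likert scale) per modality, invented here
ratings = {
    "HT":   [6, 5, 7, 6, 5, 6, 7, 5],
    "MTPE": [5, 6, 5, 6, 5, 6, 5, 6],
    "MT":   [4, 3, 5, 4, 4, 3, 4, 5],
}

# Omnibus test: is there any difference among the three groups?
h_stat, p_omnibus = kruskal(*ratings.values())
print(f"Kruskal-Wallis H={h_stat:.2f}, p={p_omnibus:.3f}")

# Pairwise follow-up tests (e.g. HT vs MT), Bonferroni-corrected
alpha = 0.05 / 3
for a, b in combinations(ratings, 2):
    u_stat, p = mannwhitneyu(ratings[a], ratings[b], alternative="two-sided")
    verdict = "significant" if p < alpha else "not significant"
    print(f"{a} vs {b}: U={u_stat:.1f}, p={p:.3f} ({verdict})")
```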
Related papers
- A Case Study on Contextual Machine Translation in a Professional Scenario of Subtitling [3.925328332747599]
We report on an industrial case study carried out to investigate the benefit of machine translation (MT) in a professional scenario of translating TV subtitles.
We found that post-editors marked significantly fewer context-related errors when correcting the outputs of MTCue, the context-aware model.
We also present the results of a survey of the employed post-editors, which highlights contextual inadequacy as a significant gap consistently observed in MT.
arXiv Detail & Related papers (2024-06-27T11:20:14Z)
- MT-PATCHER: Selective and Extendable Knowledge Distillation from Large Language Models for Machine Translation [61.65537912700187]
Large language models (LLMs) have demonstrated strong ability in the field of machine translation (MT).
We propose a framework called MT-Patcher, which transfers knowledge from LLMs to existing MT models in a selective, comprehensive and proactive manner.
arXiv Detail & Related papers (2024-03-14T16:07:39Z)
- To be or not to be: a translation reception study of a literary text translated into Dutch and Catalan using machine translation [3.3453601632404073]
This article presents the results of a study involving the reception of a fictional story by Kurt Vonnegut translated from English into Catalan and Dutch.
223 participants were recruited who rated the reading conditions using three scales: Narrative Engagement, Enjoyment and Translation Reception.
arXiv Detail & Related papers (2023-07-05T15:18:52Z)
- Translation-Enhanced Multilingual Text-to-Image Generation [61.41730893884428]
Research on text-to-image generation (TTI) still predominantly focuses on the English language.
In this work, we thus investigate multilingual TTI and the current potential of neural machine translation (NMT) to bootstrap mTTI systems.
We propose Ensemble Adapter (EnsAd), a novel parameter-efficient approach that learns to weigh and consolidate the multilingual text knowledge within the mTTI framework.
arXiv Detail & Related papers (2023-05-30T17:03:52Z)
- Revisiting Machine Translation for Cross-lingual Classification [91.43729067874503]
Most research in the area focuses on multilingual models rather than on the machine translation component.
We show that, by using a stronger MT system and mitigating the mismatch between training on original text and running inference on machine-translated text, translate-test can do substantially better than previously assumed (a minimal translate-test sketch appears after this list).
arXiv Detail & Related papers (2023-05-23T16:56:10Z)
- Discourse Centric Evaluation of Machine Translation with a Densely Annotated Parallel Corpus [82.07304301996562]
This paper presents a new dataset with rich discourse annotations, built upon the large-scale parallel corpus BWB introduced in Jiang et al.
We investigate the similarities and differences between the discourse structures of source and target languages.
We discover that MT outputs differ fundamentally from human translations in terms of their latent discourse structures.
arXiv Detail & Related papers (2023-05-18T17:36:41Z)
- Tackling Ambiguity with Images: Improved Multimodal Machine Translation and Contrastive Evaluation [72.6667341525552]
We present a new MMT approach based on a strong text-only MT model, which uses neural adapters and a novel guided self-attention mechanism.
We also introduce CoMMuTE, a Contrastive Multimodal Translation Evaluation set of ambiguous sentences and their possible translations.
Our approach obtains competitive results compared to strong text-only models on standard English-to-French, English-to-German and English-to-Czech benchmarks.
arXiv Detail & Related papers (2022-12-20T10:18:18Z)
- Exploring Document-Level Literary Machine Translation with Parallel Paragraphs from World Literature [35.1398797683712]
We show that literary translators prefer reference human translations over machine-translated paragraphs at a rate of 84%.
We train a post-editing model whose output is preferred over normal MT output at a rate of 69% by experts.
arXiv Detail & Related papers (2022-10-25T18:03:34Z)
- Creativity in translation: machine translation as a constraint for literary texts [3.3453601632404073]
This article presents the results of a study involving the translation of a short story by Kurt Vonnegut from English to Catalan and Dutch using three modalities: machine translation (MT), post-editing (PE) and translation without aid (HT).
A neural MT system trained on literary data does not currently have the necessary capabilities for a creative translation; it renders literal solutions to translation problems.
More importantly, using MT to post-edit raw output constrains the creativity of translators, resulting in a poorer translation often not fit for publication, according to experts.
arXiv Detail & Related papers (2022-04-12T09:27:00Z)
- Neural Machine Translation Quality and Post-Editing Performance [0.04654201857155095]
We focus on high-quality neural MT (NMT), which has since become the state-of-the-art approach and has been adopted by most translation companies.
Across all models, we found that better MT systems indeed lead to fewer changes in the sentences in this industry setting.
Contrary to the results on phrase-based MT, BLEU is definitely not a stable predictor of post-editing time or final output quality (a small sketch of such a correlation check also follows this list).
arXiv Detail & Related papers (2021-09-10T17:56:02Z)
- mT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs [51.67970832510462]
We improve the multilingual text-to-text transfer Transformer with translation pairs (mT6).
We explore three cross-lingual text-to-text pre-training tasks, namely, machine translation, translation pair span corruption, and translation span corruption.
Experimental results show that the proposed mT6 improves cross-lingual transferability over mT5.
arXiv Detail & Related papers (2021-04-18T03:24:07Z)
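The translate-test setup discussed in the "Revisiting Machine Translation for Cross-lingual Classification" entry above can be sketched minimally as follows: translate the test data into English with an MT model, then run an English-only classifier on the translations. The model names and the sentiment task below are illustrative assumptions, not the configuration used in that paper.

```python
# Minimal translate-test sketch: translate non-English inputs into English,
# then classify them with an English-only model. Model names and the sentiment
# task are illustrative choices, not the paper's actual setup.
from transformers import pipeline

# Catalan -> English MT model (assumed; any stronger MT system can be swapped in)
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-ca-en")

# English classifier trained on original (non-translated) English text
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

catalan_inputs = [
    "La traducció era sorprenentment natural i agradable de llegir.",
    "El text resultava pla i difícil de seguir.",
]

# Translate the test set, then run inference on the English translations
english_inputs = [out["translation_text"] for out in translator(catalan_inputs)]
for src, tgt, pred in zip(catalan_inputs, english_inputs, classifier(english_inputs)):
    print(f"{src!r} -> {tgt!r}: {pred['label']} ({pred['score']:.2f})")
```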
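The BLEU finding in the "Neural Machine Translation Quality and Post-Editing Performance" entry rests on relating MT quality scores to post-editing effort. A minimal sketch of that kind of check, on invented data, might look like the following; sentence-level BLEU and a Spearman rank correlation are assumed here purely for illustration.

```python
# Hypothetical sketch of the correlation check behind the BLEU finding above:
# compute sentence-level BLEU for MT outputs and correlate it with the time
# post-editors spent on each sentence. All data below is invented.
import sacrebleu
from scipy.stats import spearmanr

mt_outputs = [
    "The cat sat in the mat.",
    "She walked quickly to the station.",
    "He did not knew the answer.",
    "The report was finish yesterday.",
]
references = [
    "The cat sat on the mat.",
    "She walked quickly to the station.",
    "He did not know the answer.",
    "The report was finished yesterday.",
]
post_edit_seconds = [42.0, 8.5, 37.0, 51.0]  # invented editing times per sentence

bleu_scores = [
    sacrebleu.sentence_bleu(hyp, [ref]).score
    for hyp, ref in zip(mt_outputs, references)
]

# If BLEU were a stable predictor of effort, this correlation would be
# strongly negative (higher BLEU -> less editing time).
rho, p_value = spearmanr(bleu_scores, post_edit_seconds)
print(f"Spearman rho={rho:.2f}, p={p_value:.3f}")
```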
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided (including all of the information above) and is not responsible for any consequences.