Context-aware Neural Machine Translation for English-Japanese Business Scene Dialogues
- URL: http://arxiv.org/abs/2311.11976v1
- Date: Mon, 20 Nov 2023 18:06:03 GMT
- Title: Context-aware Neural Machine Translation for English-Japanese Business Scene Dialogues
- Authors: Sumire Honda, Patrick Fernandes, Chrysoula Zerva
- Abstract summary: This paper explores how context-awareness can improve the performance of current Neural Machine Translation (NMT) models for English-Japanese business dialogue translation.
We propose novel context tokens encoding extra-sentential information, such as speaker turn and scene type.
We find that models leverage both preceding sentences and extra-sentential context (with CXMI increasing with context size), and we provide a focused analysis of honorifics translation.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the remarkable advancements in machine translation, the current sentence-level paradigm faces challenges when dealing with highly contextual languages like Japanese. In this paper, we explore how context-awareness can improve the performance of current Neural Machine Translation (NMT) models for English-Japanese business dialogue translation, and what kind of context provides meaningful information for improving translation. As business dialogue involves complex discourse phenomena but offers scarce training resources, we adapt a pretrained mBART model, fine-tuning it on multi-sentence dialogue data, which allows us to experiment with different contexts. We investigate the impact of larger context sizes and propose novel context tokens encoding extra-sentential information, such as speaker turn and scene type. We make use of Conditional Cross-Mutual Information (CXMI) to explore how much of the context the model uses, and we generalise CXMI to study the impact of the extra-sentential context. Overall, we find that models leverage both preceding sentences and extra-sentential context (with CXMI increasing with context size), and we provide a focused analysis of honorifics translation. Regarding translation quality, increased source-side context paired with scene and speaker information improves model performance over previous work and over our context-agnostic baselines, as measured by BLEU and COMET.
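
To make the "context tokens" idea concrete, the sketch below shows one way extra-sentential tags for scene type and speaker turn could be prepended to the source sequence, together with preceding sentences, before fine-tuning an mBART-style model. The tag strings, separator, and function name are illustrative assumptions; the abstract does not specify the paper's exact tag vocabulary or formatting.

```python
from typing import List

# Hypothetical mBART-style separator between context sentences and the
# current sentence; the paper's actual formatting may differ.
SEP = " </s> "

def build_source(
    context_sents: List[str],   # preceding source-side sentences
    current_sent: str,          # sentence to translate
    speaker_turn: str,          # e.g. "same" vs. "different" speaker
    scene_type: str,            # e.g. "meeting", "phone_call"
) -> str:
    """Prepend hypothetical scene/speaker tags and the context window
    to the current source sentence."""
    tags = f"<scene:{scene_type}> <speaker:{speaker_turn}>"
    if not context_sents:
        return f"{tags} {current_sent}"
    return f"{tags} " + SEP.join(context_sents) + SEP + current_sent

# Example: a 2-sentence context window from a business meeting scene.
src = build_source(
    ["Good morning, everyone.", "Shall we begin?"],
    "Could you share the sales figures?",
    speaker_turn="different",
    scene_type="meeting",
)
print(src)
```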
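
For reference, CXMI (introduced in the "Measuring and Increasing Context Usage" paper listed below) compares the model's uncertainty about the target with and without access to context C, and is commonly estimated as an average log-probability ratio over the test set. The paper generalises this to extra-sentential context; the formulation below is a sketch of the standard definition, not the paper's exact generalisation.

```latex
\[
\mathrm{CXMI}(C \rightarrow Y \mid X)
  = H(Y \mid X) - H(Y \mid X, C)
  \approx \frac{1}{N} \sum_{i=1}^{N}
    \log \frac{p_\theta\!\left(y^{(i)} \mid x^{(i)}, C^{(i)}\right)}
              {p_\theta\!\left(y^{(i)} \mid x^{(i)}\right)}
\]
```

A positive value indicates that, on average, the context-aware model assigns higher probability to the reference translations than the context-agnostic one, i.e. the model is actually using the context.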
Related papers
- Improving Long Context Document-Level Machine Translation (arXiv, 2023-06-08)
  Document-level context for neural machine translation (NMT) is crucial to improving translation consistency and cohesion. Many works have been published on document-level NMT, but most restrict the system to local context only. The authors propose a constrained attention variant that focuses attention on the most relevant parts of the sequence while reducing memory consumption.
- MTCue: Learning Zero-Shot Control of Extra-Textual Attributes by Leveraging Unstructured Context in Neural Machine Translation (arXiv, 2023-05-25)
  This work introduces MTCue, a novel NMT framework that interprets all context, including discrete variables, as text. MTCue learns an abstract representation of context, enabling transferability across different data settings, and significantly outperforms a "tagging" baseline at translating English text.
- Challenges in Context-Aware Neural Machine Translation (arXiv, 2023-05-23)
  Context-aware NMT leverages information beyond the sentence level to resolve discourse dependencies. Despite well-reasoned intuitions, most context-aware translation models show only modest improvements over sentence-level systems. The authors propose a more realistic setting for document-level translation, called paragraph-to-paragraph (para2para) translation.
- HanoiT: Enhancing Context-aware Translation via Selective Context (arXiv, 2023-01-17)
  Context-aware NMT aims to use document-level context to improve translation quality, but irrelevant or trivial words can introduce noise and distract the model from learning the relationship between the current sentence and the auxiliary context. The authors propose a novel end-to-end encoder-decoder model with a layer-wise selection mechanism to sift and refine long document context.
- Controlling Extra-Textual Attributes about Dialogue Participants: A Case Study of English-to-Polish Neural Machine Translation (arXiv, 2022-05-10)
  Machine translation models must commit to a particular interpretation of textual context when translating from English to Polish. This case study employs a wide range of approaches for controlling attributes in translation; the best model achieves an improvement of +5.81 chrF++ / +6.03 BLEU, with the other models achieving competitive performance.
- SMDT: Selective Memory-Augmented Neural Document Translation (arXiv, 2022-01-05)
  The authors propose a Selective Memory-augmented Neural Document Translation model to handle documents with a large hypothesis space of context. Similar bilingual sentence pairs are retrieved from the training corpus to augment the global context, and the two-stream attention model is extended with a selective mechanism to capture local context and diverse global contexts.
- When Does Translation Require Context? A Data-driven, Multilingual Exploration (arXiv, 2021-09-15)
  Proper handling of discourse significantly contributes to the quality of machine translation (MT), yet recent works in context-aware MT target only a small set of discourse phenomena during evaluation. The authors develop the Multilingual Discourse-Aware benchmark, a series of taggers that identify and evaluate model performance on discourse phenomena.
- Contrastive Learning for Context-aware Neural Machine Translation Using Coreference Information (arXiv, 2021-09-13)
  The authors propose CorefCL, a novel data augmentation and contrastive learning scheme based on coreference between the source and contextual sentences. By corrupting automatically detected coreference mentions in the contextual sentence, CorefCL trains the model to be sensitive to coreference inconsistency. In experiments, the method consistently improved the BLEU scores of the compared models on English-German and English-Korean tasks. (A sketch of this corruption step appears after this list.)
- Modeling Bilingual Conversational Characteristics for Neural Chat Translation (arXiv, 2021-07-23)
  This work aims to improve the translation quality of conversational text by modeling bilingual conversational characteristics. The approach is evaluated on the benchmark dataset BConTrasT (English-German) and a self-collected bilingual dialogue corpus named BMELD (English-Chinese). It notably boosts performance over strong baselines by a large margin and significantly surpasses some state-of-the-art context-aware NMT models in terms of BLEU and TER.
- Measuring and Increasing Context Usage in Context-Aware Machine Translation (arXiv, 2021-05-07)
  The authors introduce a new metric, conditional cross-mutual information (CXMI), to quantify how much context machine translation models use, and a new, simple training method, context-aware word dropout, to increase the usage of context by context-aware models. (A sketch of this dropout idea appears after this list.)
- Contextual Neural Machine Translation Improves Translation of Cataphoric Pronouns (arXiv, 2020-04-21)
  The authors investigate the effect of future sentences as context by comparing a contextual NMT model trained with future context to one trained with past context. Experiments and evaluation using generic and pronoun-focused automatic metrics show that using future context achieves significant improvements over a context-agnostic Transformer.
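
As promised in the CorefCL entry above, here is a rough sketch of corrupting a coreference mention in the contextual sentence to create a negative example for contrastive learning. CorefCL detects mentions with an automatic coreference resolver; the pronoun pool, helper name, and simple string replacement below are illustrative assumptions, not the authors' implementation.

```python
import random

# Illustrative pronoun pool for generating mismatched mentions.
PRONOUNS = ["he", "she", "it", "they", "we", "you"]

def corrupt_mention(context: str, mention: str, rng: random.Random) -> str:
    """Replace one detected coreference mention with a mismatched pronoun,
    producing a coreference-inconsistent (negative) context sentence."""
    candidates = [p for p in PRONOUNS if p != mention.lower()]
    return context.replace(mention, rng.choice(candidates), 1)

rng = random.Random(0)
context = "Ms. Tanaka joined the call. She will present the report."
# Assume a coreference resolver flagged "She" as a mention of Ms. Tanaka.
print(corrupt_mention(context, "She", rng))
```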
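
And here is the sketch referenced in the "Measuring and Increasing Context Usage" entry: context-aware word dropout masks tokens of the current source sentence so the model is pushed to recover the missing information from the context sentences. A minimal sketch, assuming a generic <mask> token and a per-token drop probability (both illustrative):

```python
import random
from typing import List

def context_aware_word_dropout(
    tokens: List[str],          # tokens of the *current* source sentence
    p_drop: float,              # per-token masking probability
    rng: random.Random,
    mask_token: str = "<mask>",
) -> List[str]:
    """Mask current-sentence tokens at random; context sentences are left
    intact, so the model must rely on them to resolve the masked words."""
    return [mask_token if rng.random() < p_drop else t for t in tokens]

rng = random.Random(0)
tokens = "Could you share the sales figures ?".split()
print(context_aware_word_dropout(tokens, p_drop=0.3, rng=rng))
```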