A Comprehensive Survey of Multilingual Neural Machine Translation
- URL: http://arxiv.org/abs/2001.01115v2
- Date: Tue, 7 Jan 2020 16:54:33 GMT
- Title: A Comprehensive Survey of Multilingual Neural Machine Translation
- Authors: Raj Dabre, Chenhui Chu, Anoop Kunchukuttan
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a survey on multilingual neural machine translation (MNMT), which
has gained a lot of traction in recent years. MNMT has been useful in
improving translation quality as a result of translation knowledge transfer
(transfer learning). MNMT is more promising and interesting than its
statistical machine translation counterpart because end-to-end modeling and
distributed representations open new avenues for research on machine
translation. Many approaches have been proposed in order to exploit
multilingual parallel corpora for improving translation quality. However, the
lack of a comprehensive survey makes it difficult to determine which approaches
are promising and hence deserve further exploration. In this paper, we present
an in-depth survey of existing literature on MNMT. We first categorize various
approaches based on their central use-case and then further categorize them
based on resource scenarios, underlying modeling principles, core issues, and
challenges. Wherever possible we address the strengths and weaknesses of
several techniques by comparing them with each other. We also discuss the
future directions that MNMT research might take. This paper is aimed at both
beginners and experts in NMT. We hope it will serve as a starting
point as well as a source of new ideas for researchers and engineers interested
in MNMT.
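A central idea covered in the MNMT literature surveyed here is training a single shared encoder-decoder on the concatenation of all language pairs, with a tag prepended to each source sentence telling the model which target language to produce (the approach popularized by Google's multilingual NMT system). The sketch below illustrates only the data-preparation step under that assumption; the tag format, helper names, and toy corpus are hypothetical and not taken from the paper.

```python
# Minimal sketch: building a joint training corpus for a single multilingual
# NMT model by prepending a target-language tag to each source sentence.
# The "<2xx>" tag format and the toy data are illustrative assumptions,
# not artifacts of any specific surveyed system.
from typing import Iterable, List, Tuple


def tag_source(src: str, tgt_lang: str) -> str:
    """Prepend a target-language token so one model can serve many directions."""
    return f"<2{tgt_lang}> {src}"


def prepare_multilingual_corpus(
    bitexts: Iterable[Tuple[str, str, List[Tuple[str, str]]]]
) -> List[Tuple[str, str]]:
    """Flatten several (src_lang, tgt_lang, sentence_pairs) bitexts into one
    mixed training set of (tagged_source, target) pairs."""
    mixed: List[Tuple[str, str]] = []
    for src_lang, tgt_lang, pairs in bitexts:
        for src, tgt in pairs:
            mixed.append((tag_source(src, tgt_lang), tgt))
    return mixed


if __name__ == "__main__":
    corpora = [
        ("en", "de", [("Hello world.", "Hallo Welt.")]),
        ("en", "fr", [("Hello world.", "Bonjour le monde.")]),
    ]
    for tagged_src, tgt in prepare_multilingual_corpus(corpora):
        print(tagged_src, "->", tgt)
```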
Related papers
- Extending Multilingual Machine Translation through Imitation Learning [60.15671816513614]
Imit-MNMT treats the task as an imitation learning process that mimics the behavior of an expert.
We show that our approach significantly improves the translation performance between the new and the original languages.
We also demonstrate that our approach is capable of solving copy and off-target problems.
arXiv Detail & Related papers (2023-11-14T21:04:03Z) - Towards Effective Disambiguation for Machine Translation with Large Language Models [65.80775710657672]
We study the capabilities of large language models to translate "ambiguous sentences".
Experiments show that our methods can match or outperform state-of-the-art systems such as DeepL and NLLB in four out of five language directions.
arXiv Detail & Related papers (2023-09-20T22:22:52Z) - Revisiting Machine Translation for Cross-lingual Classification [91.43729067874503]
Most research in the area focuses on multilingual models rather than on the machine translation component.
We show that, by using a stronger MT system and mitigating the mismatch between training on original text and running inference on machine translated text, translate-test can do substantially better than previously assumed.
arXiv Detail & Related papers (2023-05-23T16:56:10Z) - Exploring Human-Like Translation Strategy with Large Language Models [93.49333173279508]
Large language models (LLMs) have demonstrated impressive capabilities in general scenarios.
This work proposes the MAPS framework, which stands for Multi-Aspect Prompting and Selection.
We employ a selection mechanism based on quality estimation to filter out noisy and unhelpful knowledge.
arXiv Detail & Related papers (2023-05-06T19:03:12Z) - Neural Machine Translation For Low Resource Languages [0.0]
This paper investigates the realm of low-resource languages and builds a neural machine translation model to achieve state-of-the-art results.
The paper builds upon the mBART language model and explores strategies to augment it with various NLP and deep learning techniques.
arXiv Detail & Related papers (2023-04-16T19:27:48Z) - Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages [12.374365655284342]
Building neural machine translation (NMT) systems has become increasingly important, especially in the low-resource setting.
Recent work tends to study NMT systems for low-resource languages centered on English, while few works focus on low-resource NMT systems centered on other languages such as Chinese.
We present the winning competition system, which leverages monolingual word-embedding data enhancement, bilingual curriculum learning, and contrastive re-ranking.
arXiv Detail & Related papers (2022-04-09T01:05:37Z) - Neural Machine Translation for Low-Resource Languages: A Survey [2.3394850341375615]
This paper presents a detailed survey of research advancements in low-resource language NMT (LRL-NMT).
It provides guidelines for selecting a suitable NMT technique for a given LRL data setting.
It also provides a list of recommendations to further enhance the research efforts on LRL-NMT.
arXiv Detail & Related papers (2021-06-29T06:31:58Z) - Neural Machine Translation: A Review of Methods, Resources, and Tools [47.96141994224423]
Machine translation (MT) is an important sub-field of natural language processing.
End-to-end neural machine translation (NMT) has achieved great success and has become the new mainstream method in practical MT systems.
arXiv Detail & Related papers (2020-12-31T09:35:27Z) - Dual Past and Future for Neural Machine Translation [51.418245676894465]
We present a novel dual framework that leverages both source-to-target and target-to-source NMT models to provide a more direct and accurate supervision signal for the Past and Future modules.
Experimental results demonstrate that our proposed method significantly improves the adequacy of NMT predictions and surpasses previous methods in two well-studied translation tasks.
arXiv Detail & Related papers (2020-07-15T14:52:24Z) - Neural Machine Translation: Challenges, Progress and Future [62.75523637241876]
Machine translation (MT) is a technique that leverages computers to translate human languages automatically.
Neural machine translation (NMT) models the direct mapping between source and target languages with deep neural networks.
This article reviews the NMT framework, discusses the challenges in NMT, and introduces some exciting recent progress.
arXiv Detail & Related papers (2020-04-13T07:53:57Z)