Neural Machine Translation: Challenges, Progress and Future
- URL: http://arxiv.org/abs/2004.05809v1
- Date: Mon, 13 Apr 2020 07:53:57 GMT
- Title: Neural Machine Translation: Challenges, Progress and Future
- Authors: Jiajun Zhang and Chengqing Zong
- Abstract summary: Machine translation (MT) is a technique that leverages computers to translate human languages automatically.
Neural machine translation (NMT) models the direct mapping between source and target languages with deep neural networks.
This article reviews the NMT framework, discusses the challenges in NMT, and introduces some exciting recent progress.
- Score: 62.75523637241876
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine translation (MT) is a technique that leverages computers to translate
human languages automatically. Nowadays, neural machine translation (NMT), which
models the direct mapping between source and target languages with deep neural
networks, has achieved a major breakthrough in translation performance and become
the de facto paradigm of MT. This article reviews the NMT framework, discusses
the challenges in NMT, introduces some exciting recent progress, and finally
looks forward to some potential future research trends. In addition, we maintain
the state-of-the-art methods for various NMT tasks at the website
https://github.com/ZNLP/SOTA-MT.
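As a rough illustration of the "direct mapping" idea described in the abstract (and not of any specific architecture surveyed in the paper), here is a minimal encoder-decoder NMT sketch in PyTorch; the vocabulary sizes, embedding dimension, and GRU layers are illustrative assumptions.

```python
# Minimal encoder-decoder NMT sketch (illustrative only; real systems use
# attention/Transformer architectures, subword vocabularies, and large corpora).
import torch
import torch.nn as nn

class TinyNMT(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, tgt_vocab)

    def forward(self, src_ids, tgt_in_ids):
        # Encode the source sentence into a summary state.
        _, state = self.encoder(self.src_emb(src_ids))
        # Decode target tokens conditioned on that state (teacher forcing).
        dec_out, _ = self.decoder(self.tgt_emb(tgt_in_ids), state)
        return self.out(dec_out)  # logits over the target vocabulary

model = TinyNMT(src_vocab=8000, tgt_vocab=8000)
src = torch.randint(0, 8000, (2, 7))      # batch of 2 source sentences
tgt_in = torch.randint(0, 8000, (2, 9))   # shifted target tokens
tgt_out = torch.randint(0, 8000, (2, 9))  # gold next tokens
logits = model(src, tgt_in)
loss = nn.functional.cross_entropy(logits.reshape(-1, 8000), tgt_out.reshape(-1))
```

Training such a model amounts to minimizing this cross-entropy over a parallel corpus; modern systems replace the recurrent layers with Transformer blocks.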
Related papers
- Extending Multilingual Machine Translation through Imitation Learning [60.15671816513614]
Imit-MNMT treats the task as an imitation learning process, which mimics the behavior of an expert.
We show that our approach significantly improves the translation performance between the new and the original languages.
We also demonstrate that our approach is capable of solving copy and off-target problems.
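The summary above does not spell out the imitation objective, so the following is only a generic sketch of "mimicking an expert" via a distillation-style KL loss, with the expert assumed to be a frozen pretrained model; it is not the actual Imit-MNMT algorithm.

```python
# Generic sketch of imitating an expert model's token distributions with a
# KL-divergence loss; NOT the actual Imit-MNMT algorithm, just the common
# distillation-style formulation of "mimicking an expert".
import torch
import torch.nn.functional as F

def imitation_loss(student_logits, expert_logits):
    # Both tensors: (batch, seq_len, vocab). The expert is kept frozen.
    expert_probs = F.softmax(expert_logits.detach(), dim=-1)
    student_logp = F.log_softmax(student_logits, dim=-1)
    # KL(expert || student), summed over positions/vocab, averaged over the batch.
    return F.kl_div(student_logp, expert_probs, reduction="batchmean")
```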
arXiv Detail & Related papers (2023-11-14T21:04:03Z)
- Code-Switching with Word Senses for Pretraining in Neural Machine Translation [107.23743153715799]
We introduce Word Sense Pretraining for Neural Machine Translation (WSP-NMT).
WSP-NMT is an end-to-end approach for pretraining multilingual NMT models leveraging word sense-specific information from Knowledge Bases.
Our experiments show significant improvements in overall translation quality.
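The paper's exact data-construction procedure is not given in this summary, so the following is a hypothetical sketch of code-switching with word senses: some source words are replaced by sense-specific translations from a small sense dictionary. The dictionary, sense labels, and replacement rate are all illustrative assumptions.

```python
# Hypothetical sketch of building code-switched pretraining data: randomly
# replace source words with sense-specific translations drawn from a small
# sense dictionary. The dictionary and replacement rate are illustrative
# assumptions, not the procedure described in the WSP-NMT paper.
import random

SENSE_DICT = {  # word -> {sense label: translation}
    "bank": {"bank%finance": "banque", "bank%river": "rive"},
    "spring": {"spring%season": "printemps", "spring%device": "ressort"},
}

def code_switch(tokens, sense_tags, p=0.3, seed=0):
    rng = random.Random(seed)
    out = []
    for tok, sense in zip(tokens, sense_tags):
        trans = SENSE_DICT.get(tok, {}).get(sense)
        out.append(trans if trans and rng.random() < p else tok)
    return out

print(code_switch(["the", "bank", "opened"], [None, "bank%finance", None], p=1.0))
# -> ['the', 'banque', 'opened']
```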
arXiv Detail & Related papers (2023-10-21T16:13:01Z)
- Prompting Neural Machine Translation with Translation Memories [32.5633128085849]
We present a simple but effective method to introduce TMs into neural machine translation (NMT) systems.
Specifically, we treat TMs as prompts to the NMT model at test time, but leave the training process unchanged.
The result is a slight update of an existing NMT system, which can be implemented in a few hours by anyone who is familiar with NMT.
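A minimal sketch of the test-time idea, assuming fuzzy matching for TM retrieval and a simple "TM target + separator + source" prompt format; the separator token and the `translate` stand-in are assumptions, not the paper's exact interface.

```python
# Sketch of using a translation memory (TM) as a test-time prompt: retrieve the
# most similar TM entry by fuzzy match and prepend its target side to the input.
# `translate` stands in for any unchanged, already-trained NMT system; the
# separator and prompting format are assumptions for illustration.
from difflib import SequenceMatcher

TM = [  # (source, target) pairs from previously translated material
    ("the contract expires in May", "le contrat expire en mai"),
    ("the invoice is attached", "la facture est jointe"),
]

def retrieve(src):
    return max(TM, key=lambda pair: SequenceMatcher(None, src, pair[0]).ratio())

def translate_with_tm(src, translate):
    tm_src, tm_tgt = retrieve(src)
    prompted = f"{tm_tgt} <sep> {src}"   # TM target used as a prompt
    return translate(prompted)

# Example with a dummy "NMT system" that just echoes its input:
print(translate_with_tm("the contract expires in June", translate=lambda x: x))
```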
arXiv Detail & Related papers (2023-01-13T03:33:26Z)
- Learning Domain Specific Language Models for Automatic Speech Recognition through Machine Translation [0.0]
We use Neural Machine Translation as an intermediate step to first obtain translations of task-specific text data.
We develop a procedure to derive word confusion networks from NMT beam search graphs.
We demonstrate that NMT confusion networks can help to reduce the perplexity of both n-gram and recurrent neural network LMs.
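As a simplified sketch of how beam-search outputs can feed an n-gram language model, the snippet below turns an n-best list into posterior-weighted n-gram counts; a real word confusion network additionally aligns the hypotheses position by position, which is omitted here.

```python
# Simplified sketch: turn an NMT n-best list (tokens, log-prob) into
# posterior-weighted n-gram counts for language-model training. A real word
# confusion network also aligns hypotheses; this weighting scheme is an
# illustrative simplification.
import math
from collections import Counter

def weighted_ngram_counts(nbest, n=2):
    # nbest: list of (tokens, log_prob) from beam search
    z = sum(math.exp(lp) for _, lp in nbest)
    counts = Counter()
    for tokens, lp in nbest:
        w = math.exp(lp) / z  # posterior weight of this hypothesis
        for i in range(len(tokens) - n + 1):
            counts[tuple(tokens[i:i + n])] += w
    return counts

nbest = [(["book", "a", "table"], -0.4), (["reserve", "a", "table"], -1.1)]
print(weighted_ngram_counts(nbest))
```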
arXiv Detail & Related papers (2021-09-21T10:29:20Z)
- Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT [64.1841519527504]
Neural machine translation uses a single neural network to model the entire translation process.
Despite neural machine translation being the de facto standard, it is still not clear how NMT models acquire different competences over the course of training.
arXiv Detail & Related papers (2021-09-03T09:38:50Z)
- Towards Personalised and Document-level Machine Translation of Dialogue [0.0]
This thesis proposal focuses on personalised NMT (PersNMT) and document-level NMT (DocNMT) for the domain of dialogue extracted from TV subtitles in five languages.
Three main challenges are addressed: (1) incorporating extra-textual information directly into NMT systems; (2) improving the machine translation of cohesion devices; and (3) reliable evaluation for PersNMT and DocNMT.
arXiv Detail & Related papers (2021-02-11T09:18:20Z)
- Neural Machine Translation: A Review of Methods, Resources, and Tools [47.96141994224423]
Machine translation (MT) is an important sub-field of natural language processing.
End-to-end neural machine translation (NMT) has achieved great success and has become the new mainstream method in practical MT systems.
arXiv Detail & Related papers (2020-12-31T09:35:27Z)
- Explicit Reordering for Neural Machine Translation [50.70683739103066]
In Transformer-based neural machine translation (NMT), the positional encoding mechanism helps the self-attention networks to learn the source representation with order dependency.
We propose a novel reordering method to explicitly model this reordering information for Transformer-based NMT.
The empirical results on the WMT14 English-to-German, WAT ASPEC Japanese-to-English, and WMT17 Chinese-to-English translation tasks show the effectiveness of the proposed approach.
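For reference, the order-dependency mechanism mentioned above is the standard sinusoidal positional encoding of the Transformer; the sketch below shows that encoding only and does not reproduce the paper's explicit reordering model.

```python
# Standard sinusoidal positional encoding from the original Transformer; this is
# the order-dependency mechanism the summary refers to, not the paper's proposed
# reordering method.
import numpy as np

def positional_encoding(max_len, d_model):
    pos = np.arange(max_len)[:, None]            # (max_len, 1)
    i = np.arange(d_model)[None, :]               # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])  # even dimensions use sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])  # odd dimensions use cosine
    return pe

print(positional_encoding(4, 8).shape)  # (4, 8)
```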
arXiv Detail & Related papers (2020-04-08T05:28:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.