The NiuTrans Machine Translation Systems for WMT21
- URL: http://arxiv.org/abs/2109.10485v1
- Date: Wed, 22 Sep 2021 02:00:24 GMT
- Title: The NiuTrans Machine Translation Systems for WMT21
- Authors: Shuhan Zhou, Tao Zhou, Binghao Wei, Yingfeng Luo, Yongyu Mu, Zefan
Zhou, Chenglong Wang, Xuanjun Zhou, Chuanhao Lv, Yi Jing, Laohu Wang, Jingnan
Zhang, Canan Huang, Zhongxiang Yan, Chi Hu, Bei Li, Tong Xiao and Jingbo Zhu
- Abstract summary: This paper describes the NiuTrans neural machine translation systems for the WMT 2021 news translation tasks.
We made submissions to 9 language directions, including English$\leftrightarrow$$\{$Chinese, Japanese, Russian, Icelandic$\}$ and English$\rightarrow$Hausa tasks.
- Score: 23.121382706331403
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper describes the NiuTrans neural machine translation systems for the WMT
2021 news translation tasks. We made submissions to 9 language directions,
including English$\leftrightarrow$$\{$Chinese, Japanese, Russian, Icelandic$\}$
and English$\rightarrow$Hausa tasks. Our primary systems are built on several
effective variants of the Transformer, e.g., Transformer-DLCL and ODE-Transformer. We
also use back-translation, knowledge distillation, post-ensemble, and
iterative fine-tuning to further improve model performance.
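As a rough illustration of the back-translation technique named in the abstract, here is a minimal sketch; the reverse_translate stub stands in for a trained target-to-source model and is purely hypothetical, not the authors' implementation.

```python
# Back-translation sketch: a reverse (target-to-source) model turns
# monolingual target-side text into synthetic parallel data that is
# then mixed into the training set of the forward model.

def reverse_translate(sentence: str) -> str:
    """Hypothetical stand-in for a trained target-to-source NMT model."""
    return "<synthetic source for: " + sentence + ">"

def build_synthetic_corpus(mono_target):
    """Pair each monolingual target sentence with a back-translated source."""
    return [(reverse_translate(tgt), tgt) for tgt in mono_target]

if __name__ == "__main__":
    mono = ["The talks resumed on Monday.", "Markets closed higher today."]
    for src, tgt in build_synthetic_corpus(mono):
        print(src, "=>", tgt)
```

Sequence-level knowledge distillation works analogously but in the forward direction: a strong teacher translates the source side and the student trains on the teacher's outputs.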
Related papers
- HW-TSC's Submission to the CCMT 2024 Machine Translation Tasks [12.841065384808733]
We participate in the bilingual machine translation task and the multi-domain machine translation task.
For these two translation tasks, we use training strategies such as regularized dropout, bidirectional training, data diversification, forward translation, back translation, alternated training, curriculum learning, and transductive ensemble learning.
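The regularized dropout strategy listed above is commonly implemented as R-Drop; here is a minimal PyTorch sketch, assuming R-Drop is what is meant (the paper's exact recipe may differ).

```python
import torch.nn.functional as F

# R-Drop-style regularized dropout: run the same batch through the model
# twice (two different dropout masks), then add a symmetric KL term that
# pulls the two predicted distributions together, on top of cross-entropy.

def r_drop_loss(model, x, labels, alpha=5.0):
    logits1 = model(x)  # first pass, dropout mask A
    logits2 = model(x)  # second pass, dropout mask B
    ce = 0.5 * (F.cross_entropy(logits1, labels)
                + F.cross_entropy(logits2, labels))
    logp1 = F.log_softmax(logits1, dim=-1)
    logp2 = F.log_softmax(logits2, dim=-1)
    kl = 0.5 * (F.kl_div(logp1, logp2, log_target=True, reduction="batchmean")
                + F.kl_div(logp2, logp1, log_target=True, reduction="batchmean"))
    return ce + alpha * kl
```

The model must be in training mode so dropout is active; alpha trades off the consistency term against the standard loss.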
arXiv Detail & Related papers (2024-09-23T09:20:19Z)
- Summer: WeChat Neural Machine Translation Systems for the WMT22
Biomedical Translation Task [54.63368889359441]
This paper introduces WeChat's participation in WMT 2022 shared biomedical translation task on Chinese to English.
Our systems are based on the Transformer, and use several different Transformer structures to improve the quality of translation.
Our Chinese$\to$English system, named Summer, achieves the highest BLEU score among all submissions.
arXiv Detail & Related papers (2022-11-28T03:10:50Z)
- The YiTrans End-to-End Speech Translation System for IWSLT 2022 Offline
Shared Task [92.5087402621697]
This paper describes the submission of our end-to-end YiTrans speech translation system for the IWSLT 2022 offline task.
The YiTrans system is built on large-scale pre-trained encoder-decoder models.
Our final submissions rank first on English-German and English-Chinese end-to-end systems in terms of the automatic evaluation metric.
arXiv Detail & Related papers (2022-06-12T16:13:01Z)
- The NiuTrans System for the WMT21 Efficiency Task [26.065244284992147]
This paper describes the NiuTrans system for the WMT21 translation efficiency task.
Our system can translate 247,000 words per second on an NVIDIA A100, 3$\times$ faster than last year's system.
arXiv Detail & Related papers (2021-09-16T14:21:52Z)
- The Volctrans Machine Translation System for WMT20 [44.99252423430649]
This paper describes our VolcTrans system for the WMT20 shared news translation task.
Our basic systems are based on the Transformer, with several variants (wider or deeper Transformers, dynamic convolutions).
The final system includes text pre-processing, data selection, synthetic data generation, advanced model ensembling, and multilingual pre-training.
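As a toy illustration of the data selection stage (the actual VolcTrans criteria are not specified here), a simple length-ratio filter over sentence pairs might look like this:

```python
# Toy data-selection filter: keep sentence pairs whose token lengths and
# source/target length ratio fall within a plausible range. The thresholds
# below are illustrative assumptions, not values from the paper.

def keep_pair(src: str, tgt: str,
              max_len: int = 200, max_ratio: float = 2.5) -> bool:
    s, t = src.split(), tgt.split()
    if not s or not t or len(s) > max_len or len(t) > max_len:
        return False
    ratio = max(len(s), len(t)) / min(len(s), len(t))
    return ratio <= max_ratio

pairs = [("a very short line", "eine sehr kurze Zeile"),
         ("one", "a very long mismatched translation of one word")]
print([keep_pair(s, t) for s, t in pairs])  # [True, False]
```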
arXiv Detail & Related papers (2020-10-28T08:08:12Z)
- DiDi's Machine Translation System for WMT2020 [51.296629834996246]
We participate in the Chinese$\to$English translation direction and use the Transformer as our baseline model.
Our submission achieves a BLEU score of $36.6$ on Chinese$\to$English.
arXiv Detail & Related papers (2020-10-16T06:25:48Z)
- WeChat Neural Machine Translation Systems for WMT20 [61.03013964996131]
Our system is based on the Transformer with effective variants and the DTMT architecture.
In our experiments, we employ data selection, several synthetic data generation approaches, advanced fine-tuning approaches, and self-BLEU based model ensembling (a sketch follows below).
Our constrained Chinese to English system achieves 36.9 case-sensitive BLEU score, which is the highest among all submissions.
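Here is a sketch of the self-BLEU based ensemble idea, under the assumption that it selects a consensus translation among per-model candidates; a crude n-gram overlap stands in for sentence-level BLEU, and the WeChat recipe may differ.

```python
from collections import Counter

# Consensus selection: among candidate translations from different models,
# pick the one most similar on average to all the others, scored by a
# simple averaged n-gram precision (a stand-in for sentence BLEU).

def ngram_overlap(hyp: str, ref: str, n: int = 4) -> float:
    h, r = hyp.split(), ref.split()
    score, used = 0.0, 0
    for k in range(1, n + 1):
        hg = Counter(tuple(h[i:i + k]) for i in range(len(h) - k + 1))
        rg = Counter(tuple(r[i:i + k]) for i in range(len(r) - k + 1))
        if hg:
            score += sum((hg & rg).values()) / sum(hg.values())
            used += 1
    return score / used if used else 0.0

def consensus_pick(candidates):
    best, best_score = None, -1.0
    for i, c in enumerate(candidates):
        others = [o for j, o in enumerate(candidates) if j != i]
        s = sum(ngram_overlap(c, o) for o in others) / len(others)
        if s > best_score:
            best, best_score = c, s
    return best

cands = ["the cat sat on the mat",
         "the cat sat on a mat",
         "a dog stood near the mat"]
print(consensus_pick(cands))
```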
arXiv Detail & Related papers (2020-10-01T08:15:09Z)
- Explicit Reordering for Neural Machine Translation [50.70683739103066]
In Transformer-based neural machine translation (NMT), the positional encoding mechanism helps the self-attention networks learn order-aware source representations.
We propose a novel reordering method to explicitly model this reordering information for the Transformer-based NMT.
The empirical results on the WMT14 English-to-German, WAT ASPEC Japanese-to-English, and WMT17 Chinese-to-English translation tasks show the effectiveness of the proposed approach.
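For reference, the standard sinusoidal positional encoding that the summary refers to is shown below (the paper's proposed reordering model itself is not reproduced here).

```python
import numpy as np

# Sinusoidal positional encoding from the original Transformer:
# PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos(same angle).
# Added to token embeddings, it gives self-attention access to word order.

def sinusoidal_positions(max_len: int, d_model: int) -> np.ndarray:
    pos = np.arange(max_len)[:, None]             # (max_len, 1)
    i = np.arange(d_model // 2)[None, :]          # (1, d_model // 2)
    angles = pos / np.power(10000.0, 2.0 * i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                  # even dimensions
    pe[:, 1::2] = np.cos(angles)                  # odd dimensions
    return pe

pe = sinusoidal_positions(max_len=50, d_model=512)
print(pe.shape)  # (50, 512)
```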
arXiv Detail & Related papers (2020-04-08T05:28:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.