DiDi's Machine Translation System for WMT2020
- URL: http://arxiv.org/abs/2010.08185v1
- Date: Fri, 16 Oct 2020 06:25:48 GMT
- Title: DiDi's Machine Translation System for WMT2020
- Authors: Tanfang Chen, Weiwei Wang, Wenyang Wei, Xing Shi, Xiangang Li, Jieping Ye, Kevin Knight
- Abstract summary: We participate in the translation direction of Chinese->English.
In this direction, we use the Transformer as our baseline model.
As a result, our submission achieves a BLEU score of 36.6 in Chinese->English.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper describes DiDi AI Labs' submission to the WMT2020 news translation
shared task. We participate in the translation direction of Chinese->English.
In this direction, we use the Transformer as our baseline model, and integrate
several techniques for model enhancement, including data filtering, data
selection, back-translation, fine-tuning, model ensembling, and re-ranking. As
a result, our submission achieves a BLEU score of 36.6 in Chinese->English.
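The abstract's pipeline ends with model ensembling and re-ranking. As a rough, hypothetical illustration of the re-ranking step (not DiDi's actual configuration), the sketch below re-scores an n-best list with a linear combination of invented feature scores and keeps the argmax:

```python
# A minimal n-best re-ranking sketch. The feature names (forward model,
# right-to-left model, language model) and the weights are hypothetical
# stand-ins; real systems tune them on a development set.

def rerank(nbest, weights):
    """Return the hypothesis with the highest weighted feature sum."""
    def combined(hyp):
        return sum(weights[name] * score for name, score in hyp["features"].items())
    return max(nbest, key=combined)

# Invented n-best list for one source sentence (log-prob-like scores).
nbest = [
    {"text": "The cat sat on the mat.",
     "features": {"fwd": -1.2, "r2l": -1.5, "lm": -0.9}},
    {"text": "A cat sat on the mat.",
     "features": {"fwd": -1.1, "r2l": -1.9, "lm": -1.0}},
]
weights = {"fwd": 1.0, "r2l": 0.5, "lm": 0.3}
print(rerank(nbest, weights)["text"])  # -> "The cat sat on the mat."
```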
Related papers
- HW-TSC's Submission to the CCMT 2024 Machine Translation Tasks
We participate in the bilingual machine translation task and multi-domain machine translation task.
For these two translation tasks, we use training strategies such as regularized dropout, bidirectional training, data diversification, forward translation, back translation, alternated training, curriculum learning, and transductive ensemble learning.
arXiv Detail & Related papers (2024-09-23T09:20:19Z)
- Tackling Low-Resourced Sign Language Translation: UPC at WMT-SLT 22
This paper describes the system developed at the Universitat Politecnica de Catalunya for the Workshop on Machine Translation 2022 Sign Language Translation Task.
We use a Transformer model implemented with the Fairseq modeling toolkit.
We have experimented with the vocabulary size, data augmentation techniques, and pretraining the model with the PHOENIX-14T dataset.
arXiv Detail & Related papers (2022-12-02T12:42:24Z)
- Summer: WeChat Neural Machine Translation Systems for the WMT22 Biomedical Translation Task
This paper introduces WeChat's participation in the WMT 2022 shared biomedical translation task on Chinese to English.
Our systems are based on the Transformer, and use several different Transformer structures to improve the quality of translation.
Our Chinese->English system, named Summer, achieves the highest BLEU score among all submissions.
arXiv Detail & Related papers (2022-11-28T03:10:50Z)
- Tencent AI Lab - Shanghai Jiao Tong University Low-Resource Translation System for the WMT22 Translation Task
This paper describes Tencent AI Lab - Shanghai Jiao Tong University (TAL-SJTU) Low-Resource Translation systems for the WMT22 shared task.
We participate in the general translation task on English<->Livonian.
Our system is based on M2M100 with novel techniques that adapt it to the target language pair.
arXiv Detail & Related papers (2022-10-17T04:34:09Z)
- The YiTrans End-to-End Speech Translation System for IWSLT 2022 Offline Shared Task
This paper describes the submission of our end-to-end YiTrans speech translation system for the IWSLT 2022 offline task.
The YiTrans system is built on large-scale pre-trained encoder-decoder models.
Our final submissions rank first among end-to-end systems on English-German and English-Chinese in terms of the automatic evaluation metric.
arXiv Detail & Related papers (2022-06-12T16:13:01Z)
- Tilde at WMT 2020: News Task Systems
This paper describes Tilde's submission to the WMT 2020 shared task on news translation for both directions of the English-Polish language pair.
Our baseline systems are morphologically motivated, sub-word-unit-based Transformer base models.
Our final models are ensembles of Transformer base and Transformer big models that feature right-to-left re-ranking.
arXiv Detail & Related papers (2020-10-29T08:59:37Z)
- SJTU-NICT's Supervised and Unsupervised Neural Machine Translation Systems for the WMT20 News Translation Task
We participated in four translation directions of three language pairs: English-Chinese, English-Polish, and German-Upper Sorbian.
Based on different conditions of language pairs, we have experimented with diverse neural machine translation (NMT) techniques.
In our submissions, the primary systems won first place in the English-to-Chinese, Polish-to-English, and German-to-Upper-Sorbian translation directions.
arXiv Detail & Related papers (2020-10-11T00:40:05Z)
- WeChat Neural Machine Translation Systems for WMT20
Our system is based on the Transformer with effective variants and the DTMT architecture.
In our experiments, we employ data selection, several synthetic data generation approaches, advanced fine-tuning approaches, and self-BLEU-based model ensembling (see the sketch after this list).
Our constrained Chinese-to-English system achieves a case-sensitive BLEU score of 36.9, the highest among all submissions.
arXiv Detail & Related papers (2020-10-01T08:15:09Z)
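The WeChat WMT20 entry above mentions self-BLEU-based model ensembling. A minimal sketch of the underlying idea, assuming sacrebleu is installed and using invented system outputs: treat the BLEU of one system's outputs scored against another's as a similarity measure, and prefer diverse (low self-BLEU) members when picking ensemble candidates. This illustrates the general technique, not WeChat's implementation:

```python
# Self-BLEU diversity sketch for ensemble member selection.
# Assumptions: `pip install sacrebleu`; system names and outputs are invented.
from itertools import combinations

import sacrebleu

systems = {
    "sys_a": ["the cat sat on the mat .", "it rains today ."],
    "sys_b": ["a cat sat on a mat .", "it is raining today ."],
    "sys_c": ["the cat sat on the mat .", "it rains today ."],  # clone of sys_a
}

def self_bleu(hyps, refs):
    # BLEU of one system's outputs scored against another system's outputs.
    return sacrebleu.corpus_bleu(hyps, [refs]).score

# Score every pair of systems; the lowest-scoring pair is the most diverse.
pairs = {(a, b): self_bleu(systems[a], systems[b]) for a, b in combinations(systems, 2)}
most_diverse = min(pairs, key=pairs.get)
print(pairs)
print("most diverse pair:", most_diverse)  # sys_a/sys_c score ~100 (identical outputs)
```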