An In-depth Walkthrough on Evolution of Neural Machine Translation
- URL: http://arxiv.org/abs/2004.04902v1
- Date: Fri, 10 Apr 2020 04:21:05 GMT
- Title: An In-depth Walkthrough on Evolution of Neural Machine Translation
- Authors: Rohan Jagtap, Dr. Sudhir N. Dhage
- Abstract summary: This paper aims to study the major trends in Neural Machine Translation, the state-of-the-art models in the domain, and a high-level comparison between them.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Machine Translation (NMT) methodologies have burgeoned from
simple feed-forward architectures to the state of the art, viz. the BERT model.
The use cases of NMT models have broadened from language translation alone to
conversational agents (chatbots), abstractive text summarization, image
captioning, etc., and have proved highly effective in their respective
applications. This paper aims to study the major trends in Neural Machine
Translation, the state-of-the-art models in the domain, and a high-level
comparison between them.
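Since the abstract traces NMT's evolution toward Transformer/BERT-style architectures, a minimal NumPy sketch of the scaled dot-product attention at their core may be a useful anchor. The function names and array sizes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V -- the core operation of Transformer-style NMT."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row is a probability distribution
    return weights @ V, weights

# Toy self-attention: 3 source tokens, model dimension 4 (illustrative sizes).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out, attn = scaled_dot_product_attention(X, X, X)
print(out.shape, attn.shape)  # (3, 4) (3, 3)
```

Each output row is a weighted mixture of the value vectors, with weights learned from query-key similarity; stacking this operation with learned projections is what distinguishes Transformer-era NMT from the earlier recurrent encoder-decoder models.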
Related papers
- Low-resource neural machine translation with morphological modeling [3.3721926640077804]
Morphological modeling in neural machine translation (NMT) is a promising approach to achieving open-vocabulary machine translation.
We propose a framework-solution for modeling complex morphology in low-resource settings.
We evaluate our proposed solution on Kinyarwanda - English translation using public-domain parallel text.
arXiv Detail & Related papers (2024-04-03T01:31:41Z)
- Anatomy of Neural Language Models [0.0]
Transformer-based Language Models (LMs) have led to new state-of-the-art results in a wide spectrum of applications.
Transformers pretrained on language-modeling-like tasks have been widely adopted in computer vision and time series applications.
arXiv Detail & Related papers (2024-01-08T10:27:25Z)
- Neural Machine Translation For Low Resource Languages [0.0]
This paper investigates the realm of low resource languages and builds a Neural Machine Translation model to achieve state-of-the-art results.
The paper looks to build upon the mBART language model and explore strategies to augment it with various NLP and Deep Learning techniques.
arXiv Detail & Related papers (2023-04-16T19:27:48Z)
- Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Translation [50.54059385277964]
We present a novel data augmentation paradigm termed Continuous Semantic Augmentation (CsaNMT).
CsaNMT augments each training instance with an adjacency region that covers adequate variants of the literal expression with the same meaning.
arXiv Detail & Related papers (2022-04-14T08:16:28Z)
- SMDT: Selective Memory-Augmented Neural Document Translation [53.4627288890316]
We propose a Selective Memory-augmented Neural Document Translation model to deal with documents containing a large hypothesis space of context.
We retrieve similar bilingual sentence pairs from the training corpus to augment the global context.
We extend the two-stream attention model with a selective mechanism to capture the local context and diverse global contexts.
arXiv Detail & Related papers (2022-01-05T14:23:30Z)
- Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT [64.1841519527504]
Neural machine translation uses a single neural network to model the entire translation process.
Despite neural machine translation being the de facto standard, it is still not clear how NMT models acquire different competences over the course of training.
arXiv Detail & Related papers (2021-09-03T09:38:50Z)
- Synthetic Source Language Augmentation for Colloquial Neural Machine Translation [3.303435360096988]
We develop a novel colloquial Indonesian-English test set collected from YouTube transcripts and Twitter.
We perform synthetic style augmentation on the formal Indonesian source text and show that it improves the baseline Id-En models.
arXiv Detail & Related papers (2020-12-30T14:52:15Z)
- Neural Machine Translation: Challenges, Progress and Future [62.75523637241876]
Machine translation (MT) is a technique that leverages computers to translate human languages automatically.
Neural machine translation (NMT) models the direct mapping between source and target languages with deep neural networks.
This article reviews the NMT framework, discusses the challenges in NMT and introduces some exciting recent progress.
arXiv Detail & Related papers (2020-04-13T07:53:57Z)
- Learning Contextualized Sentence Representations for Document-Level Neural Machine Translation [59.191079800436114]
Document-level machine translation incorporates inter-sentential dependencies into the translation of a source sentence.
We propose a new framework to model cross-sentence dependencies by training a neural machine translation (NMT) model to predict both the target translation and the surrounding sentences of a source sentence.
arXiv Detail & Related papers (2020-03-30T03:38:01Z)
- Towards Making the Most of Context in Neural Machine Translation [112.9845226123306]
We argue that previous research did not make a clear use of the global context.
We propose a new document-level NMT framework that deliberately models the local context of each sentence.
arXiv Detail & Related papers (2020-02-19T03:30:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.