Principle Interference in Technical and Scientific Translation
- URL: http://arxiv.org/abs/2401.00177v1
- Date: Sat, 30 Dec 2023 09:04:30 GMT
- Title: Principle Interference in Technical and Scientific Translation
- Authors: Mohammad Ibrahim Qani
- Abstract summary: I give a brief overview of the history of interference in technical and scientific translation.
My aim is to explain this phenomenon and its causes with all its paradoxes, instead of simply condemning it as an example of supposedly bad translation.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this article, I will explore the nature of interference in translation,
especially in technical and scientific texts, using a descriptivist approach. I
will give a brief overview of the history of interference in technical and
scientific translation. My aim is to explain this phenomenon and
its causes with all its paradoxes, instead of simply condemning it as an
example of supposedly bad translation. Thus, I will focus on its status in the
bibliography of translation, on the motives for and consequences of
interference in specialized translation, as well as on the nature of the
arguments given for and against this phenomenon. The relationship between
different societies has always been made possible by the act of translation.
When civilizations are examined throughout history, it can be seen that the
dissemination of knowledge among different societies has been achieved through
translation; these societies have often become aware of advancements in
technology and science by means of translation. Translation therefore plays a
very significant role in technical contact between societies. Since the
translation of technical texts is the main focus of this thesis, it will be
useful to take a brief look at the history of technical translation in the
world.
Related papers
- Understanding and Addressing the Under-Translation Problem from the Perspective of Decoding Objective [72.83966378613238]
Under-translation and over-translation remain two challenging problems in state-of-the-art Neural Machine Translation (NMT) systems.
We conduct an in-depth analysis on the underlying cause of under-translation in NMT, providing an explanation from the perspective of decoding objective.
We propose employing the confidence of predicting End Of Sentence (EOS) as a detector for under-translation, and strengthening the confidence-based penalty to penalize candidates with a high risk of under-translation (a minimal sketch of this idea appears after the list of related papers).
arXiv Detail & Related papers (2024-05-29T09:25:49Z) - When Abel Kills Cain: What Machine Translation Cannot Capture [0.0]
The article aims at identifying what, from a structural point of view, AI-based automatic translators cannot fully capture.
It focuses on the machine's mistakes in order to try to explain their causes.
The biblical story of Cain and Abel has been chosen because of its rich and critical interpretive tradition.
arXiv Detail & Related papers (2024-04-02T12:46:00Z) - Crossing the Threshold: Idiomatic Machine Translation through Retrieval
Augmentation and Loss Weighting [66.02718577386426]
We provide a simple characterization of idiomatic translation and related issues.
We conduct a synthetic experiment revealing a tipping point at which transformer-based machine translation models correctly default to idiomatic translations.
To improve translation of natural idioms, we introduce two straightforward yet effective techniques.
arXiv Detail & Related papers (2023-10-10T23:47:25Z) - User Study for Improving Tools for Bible Translation [2.7514191327409714]
Technology has increasingly become an integral part of the Bible translation process.
Recent advances in AI could potentially play a pivotal role in reducing translation time and improving overall quality.
arXiv Detail & Related papers (2023-02-01T22:22:03Z) - Towards Debiasing Translation Artifacts [15.991970288297443]
We propose a novel approach to reducing translationese by extending an established bias-removal technique.
We use the Iterative Null-space Projection (INLP) algorithm and show, by measuring classification accuracy before and after debiasing, that translationese is reduced at both the sentence and word level (see the INLP sketch after this list).
To the best of our knowledge, this is the first study to debias translationese as represented in latent embedding space.
arXiv Detail & Related papers (2022-05-16T21:46:51Z) - Time-Aware Ancient Chinese Text Translation and Inference [6.787414471399024]
We aim to address the challenges surrounding the translation of ancient Chinese text.
The linguistic gap due to the difference in eras results in translations that are poor in quality.
Most translations are missing the contextual information that is often very crucial to understanding the text.
arXiv Detail & Related papers (2021-07-07T12:23:52Z) - Measuring and Increasing Context Usage in Context-Aware Machine
Translation [64.5726087590283]
We introduce a new metric, conditional cross-mutual information, to quantify the usage of context by machine translation models (a sketch of the estimator appears after this list).
We then introduce a new, simple training method, context-aware word dropout, to increase the usage of context by context-aware models.
arXiv Detail & Related papers (2021-05-07T19:55:35Z) - Contextual Neural Machine Translation Improves Translation of Cataphoric
Pronouns [50.245845110446496]
We investigate the effect of future sentences as context by comparing the performance of a contextual NMT model trained with the future context to the one trained with the past context.
Our experiments and evaluation, using generic and pronoun-focused automatic metrics, show that the use of future context achieves significant improvements over the context-agnostic Transformer.
arXiv Detail & Related papers (2020-04-21T10:45:48Z) - Translation Artifacts in Cross-lingual Transfer Learning [51.66536640084888]
We show that machine translation can introduce subtle artifacts that have a notable impact in existing cross-lingual models.
In natural language inference, translating the premise and the hypothesis independently can reduce the lexical overlap between them.
We also improve the state-of-the-art in XNLI for the translate-test and zero-shot approaches by 4.3 and 2.8 points, respectively.
arXiv Detail & Related papers (2020-04-09T17:54:30Z) - Sign Language Transformers: Joint End-to-end Sign Language Recognition
and Translation [59.38247587308604]
We introduce a novel transformer based architecture that jointly learns Continuous Sign Language Recognition and Translation.
We evaluate the recognition and translation performances of our approaches on the challenging RWTH-PHOENIX-Weather-2014T dataset.
Our translation networks outperform both sign video to spoken language and gloss to spoken language translation models.
arXiv Detail & Related papers (2020-03-30T21:35:09Z)
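To make the EOS-confidence idea from "Understanding and Addressing the Under-Translation Problem" more concrete, below is a minimal, hypothetical Python sketch. It assumes each decoded candidate already carries the model's probability of emitting EOS at its final step; the field names, the threshold, and the penalty weight are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: use end-of-sentence (EOS) confidence as an
# under-translation detector and apply a confidence-based penalty when
# re-scoring candidate translations. All constants are illustrative.

EOS_THRESHOLD = 0.5   # assumed cut-off below which a hypothesis looks truncated
PENALTY_WEIGHT = 2.0  # assumed strength of the confidence-based penalty

def rerank(candidates):
    """candidates: list of dicts with 'score' (log-prob) and 'eos_prob'
    (the model's probability of EOS at the hypothesis's final step)."""
    rescored = []
    for cand in candidates:
        penalty = 0.0
        if cand["eos_prob"] < EOS_THRESHOLD:
            # Low EOS confidence suggests the model stopped too early,
            # so penalize in proportion to the missing confidence.
            penalty = PENALTY_WEIGHT * (EOS_THRESHOLD - cand["eos_prob"])
        rescored.append({**cand, "adjusted_score": cand["score"] - penalty})
    return sorted(rescored, key=lambda c: c["adjusted_score"], reverse=True)

# Toy usage: the hypothesis with low EOS confidence is demoted.
hyps = [{"text": "short output", "score": -1.2, "eos_prob": 0.1},
        {"text": "full translation", "score": -1.5, "eos_prob": 0.9}]
print(rerank(hyps)[0]["text"])  # -> "full translation"
```

A post-hoc re-scoring view is used here purely for readability; how the penalty actually enters the decoding objective is described in the paper itself.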
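The Iterative Null-space Projection (INLP) step mentioned in "Towards Debiasing Translation Artifacts" can likewise be illustrated with a short sketch. This is a simplified, assumed version using scikit-learn's logistic regression on random data standing in for sentence embeddings labelled as translated vs. original; it is not the paper's exact setup.

```python
# Hypothetical, simplified INLP sketch: repeatedly train a linear classifier to
# predict the translationese label and project its direction out of the
# embeddings, so that the label becomes linearly unrecoverable.
import numpy as np
from sklearn.linear_model import LogisticRegression

def inlp_debias(X, y, n_iters=10):
    """Return debiased embeddings and the accumulated projection matrix."""
    d = X.shape[1]
    P = np.eye(d)          # accumulated null-space projection
    X_proj = X.copy()
    for _ in range(n_iters):
        clf = LogisticRegression(max_iter=1000).fit(X_proj, y)
        w = clf.coef_ / np.linalg.norm(clf.coef_)   # unit direction used by the classifier
        P_i = np.eye(d) - w.T @ w                   # rank-1 projection removing that direction
        P = P_i @ P
        X_proj = X_proj @ P_i                       # P_i is symmetric
    return X_proj, P

# Toy usage with random vectors in place of real sentence embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))
y = rng.integers(0, 2, size=200)
X_debiased, _ = inlp_debias(X, y, n_iters=5)
```

Per the summary above, the check is to re-train the classifier on the projected embeddings: accuracy falling toward chance indicates that the translationese signal has been removed.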
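Finally, the conditional cross-mutual information metric named in "Measuring and Increasing Context Usage in Context-Aware Machine Translation" can be estimated from per-token log-probabilities of the reference under a context-agnostic and a context-aware model. The function below is an assumed, minimal estimator in that spirit, not the paper's exact formulation.

```python
# Hypothetical CXMI estimator: the average gain in log-probability of the
# reference tokens when the model is additionally given the context C.
import numpy as np

def cxmi(logp_without_context, logp_with_context):
    """Both arguments are per-token log-probabilities of the same reference
    translation; a positive value means the context-aware model uses C."""
    gain = np.asarray(logp_with_context) - np.asarray(logp_without_context)
    return float(gain.mean())

# Toy usage: the context-aware model is slightly more confident on each token.
print(cxmi([-2.0, -1.5, -3.0], [-1.8, -1.4, -2.5]))  # -> 0.266...
```

The paper's context-aware word dropout is then a training-time intervention aimed at increasing exactly this kind of context usage.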
This list is automatically generated from the titles and abstracts of the papers on this site.