Amharic Abstractive Text Summarization
- URL: http://arxiv.org/abs/2003.13721v1
- Date: Mon, 30 Mar 2020 18:15:32 GMT
- Title: Amharic Abstractive Text Summarization
- Authors: Amr M. Zaki, Mahmoud I. Khalil, Hazem M. Abbas
- Abstract summary: Text Summarization is the task of condensing long text into just a handful of sentences.
In this work we discuss one of these novel approaches, which combines curriculum learning with Deep Learning; the model is called Scheduled Sampling.
We apply this work to Amharic, one of the most widely spoken African languages, in an effort to enrich the African NLP community with top-notch Deep Learning architectures.
- Score: 0.6703429330486277
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Text Summarization is the task of condensing long text into just a handful of sentences. Many approaches have been proposed for this task; some of the earliest built statistical models (Extractive Methods) capable of selecting important words and copying them to the output. However, these models lack the ability to paraphrase sentences, as they simply select important words without understanding their context or meaning. This is where Deep Learning based architectures (Abstractive Methods) come in: they try to understand the meaning of sentences in order to build meaningful summaries. In this work we discuss one of these novel approaches, which combines curriculum learning with Deep Learning; the model is called Scheduled Sampling. We apply this work to Amharic, one of the most widely spoken African languages, in an effort to enrich the African NLP community with top-notch Deep Learning architectures.
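For readers unfamiliar with the technique, Scheduled Sampling (Bengio et al., 2015) mitigates exposure bias by feeding the decoder the gold previous token with a probability that decays over training, and the model's own prediction otherwise. The sketch below is a minimal, hypothetical PyTorch illustration of this idea; the module sizes and the linear decay schedule are assumptions, not the paper's exact configuration.

```python
# Minimal sketch of scheduled sampling in a seq2seq decoder (PyTorch).
# Architecture and schedule are illustrative assumptions, not the paper's setup.
import random
import torch
import torch.nn as nn

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, targets, hidden, teacher_forcing_ratio):
        # targets: (batch, seq_len) gold summary token ids
        # hidden:  (1, batch, hid_dim) initial state, e.g. from the encoder
        batch, seq_len = targets.shape
        token = targets[:, 0]                      # <sos> token
        logits_per_step = []
        for t in range(1, seq_len):
            emb = self.embed(token).unsqueeze(1)   # (batch, 1, emb_dim)
            rnn_out, hidden = self.rnn(emb, hidden)
            logits = self.out(rnn_out.squeeze(1))  # (batch, vocab_size)
            logits_per_step.append(logits)
            # Scheduled sampling: feed the gold token with probability
            # `teacher_forcing_ratio`, otherwise feed the model's own guess.
            if random.random() < teacher_forcing_ratio:
                token = targets[:, t]
            else:
                token = logits.argmax(dim=-1).detach()
        return torch.stack(logits_per_step, dim=1), hidden

def teacher_forcing_ratio(step, total_steps):
    # Linear decay from full teacher forcing toward model sampling.
    return max(0.0, 1.0 - step / total_steps)
```

Bengio et al. propose linear, exponential, and inverse-sigmoid decay schedules; the linear variant is shown here for brevity.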
Related papers
- Surveying the Landscape of Text Summarization with Deep Learning: A Comprehensive Review [2.4185510826808487]
Deep learning has revolutionized natural language processing (NLP) by enabling the development of models that can learn complex representations of language data.
Deep learning models for NLP typically use large amounts of data to train deep neural networks, allowing them to learn the patterns and relationships in language data.
Applying deep learning to text summarization refers to the use of deep neural networks to perform text summarization tasks.
arXiv Detail & Related papers (2023-10-13T21:24:37Z)
- Human Inspired Progressive Alignment and Comparative Learning for Grounded Word Acquisition [6.47452771256903]
We take inspiration from how human babies acquire their first language and develop a computational process for word acquisition through comparative learning.
Motivated by cognitive findings, we generated a small dataset that enables computational models to compare the similarities and differences of various attributes.
We frame word acquisition not only as an information filtering process, but also as representation-symbol mapping.
arXiv Detail & Related papers (2023-07-05T19:38:04Z)
- Uzbek text summarization based on TF-IDF [0.0]
This article presents an experiment on the summarization task for the Uzbek language.
The methodology is based on abstracting text using the TF-IDF algorithm.
We summarize the given text by applying the n-gram method to important parts of the whole text.
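As a rough illustration of the approach described above, the sketch below scores sentences by the mean TF-IDF weight of their n-grams and keeps the top-scoring ones; it is an assumed reconstruction using scikit-learn, not the authors' implementation.

```python
# Minimal sketch of TF-IDF-based extractive summarization; the
# sentence-scoring heuristic is illustrative, not the paper's algorithm.
import re
from sklearn.feature_extraction.text import TfidfVectorizer

def summarize(text, num_sentences=3, ngram_range=(1, 2)):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    if len(sentences) <= num_sentences:
        return text
    # Score each sentence by the mean TF-IDF weight of its n-grams.
    vectorizer = TfidfVectorizer(ngram_range=ngram_range)
    tfidf = vectorizer.fit_transform(sentences)
    scores = tfidf.mean(axis=1).A.ravel()
    # Keep the top-scoring sentences, restored to their original order.
    ranked = sorted(range(len(sentences)), key=lambda i: -scores[i])
    top = sorted(ranked[:num_sentences])
    return " ".join(sentences[i] for i in top)
```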
arXiv Detail & Related papers (2023-03-01T12:39:46Z)
- Ensemble Transfer Learning for Multilingual Coreference Resolution [60.409789753164944]
A problem that frequently occurs when working with a non-English language is the scarcity of annotated training data.
We design a simple but effective ensemble-based framework that combines various transfer learning techniques.
We also propose a low-cost TL method that bootstraps coreference resolution models by utilizing Wikipedia anchor texts.
arXiv Detail & Related papers (2023-01-22T18:22:55Z)
- Beyond Contrastive Learning: A Variational Generative Model for Multilingual Retrieval [109.62363167257664]
We propose a generative model for learning multilingual text embeddings.
Our model operates on parallel data in $N$ languages.
We evaluate this method on a suite of tasks including semantic similarity, bitext mining, and cross-lingual question retrieval.
arXiv Detail & Related papers (2022-12-21T02:41:40Z)
- Improving Keyphrase Extraction with Data Augmentation and Information Filtering [67.43025048639333]
Keyphrase extraction is one of the essential tasks for document understanding in NLP.
We present a novel corpus and method for keyphrase extraction from the videos streamed on the Behance platform.
arXiv Detail & Related papers (2022-09-11T22:38:02Z)
- Neural Abstractive Text Summarizer for Telugu Language [0.0]
The proposed architecture is based on encoder-decoder sequence-to-sequence models with an attention mechanism.
We applied this model to a manually created dataset to generate a one-sentence summary of the source text.
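As context for the attention mechanism mentioned above, the sketch below implements additive (Bahdanau-style) attention, a common choice in encoder-decoder summarizers; the dimensions are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal sketch of additive (Bahdanau-style) attention in PyTorch.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, enc_dim, dec_dim, attn_dim=128):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, enc_states, dec_state):
        # enc_states: (batch, src_len, enc_dim); dec_state: (batch, dec_dim)
        energy = torch.tanh(self.W_enc(enc_states)
                            + self.W_dec(dec_state).unsqueeze(1))
        weights = torch.softmax(self.v(energy).squeeze(-1), dim=-1)
        # Context vector: attention-weighted sum of encoder states.
        context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)
        return context, weights
```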
arXiv Detail & Related papers (2021-01-18T15:22:50Z)
- SLM: Learning a Discourse Language Representation with Sentence Unshuffling [53.42814722621715]
We introduce Sentence-level Language Modeling, a new pre-training objective for learning a discourse language representation.
We show that this feature of our model improves the performance of the original BERT by large margins.
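As a rough illustration of the sentence-unshuffling objective, the sketch below builds a training example by shuffling a document's sentences and keeping the permutation that restores the original order as the prediction target; this data-preparation step is an assumption about the setup, not the authors' code.

```python
# Minimal sketch of constructing a sentence-unshuffling training example.
import random

def make_unshuffling_example(sentences, rng=random):
    order = list(range(len(sentences)))
    rng.shuffle(order)
    shuffled = [sentences[i] for i in order]
    # target[j] = original position of shuffled sentence j
    return shuffled, order

doc = ["First sentence.", "Second sentence.", "Third sentence."]
shuffled, target = make_unshuffling_example(doc)
# The model reads `shuffled` and learns to predict `target`.
```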
arXiv Detail & Related papers (2020-10-30T13:33:41Z)
- Abstractive Summarization of Spoken and Written Instructions with BERT [66.14755043607776]
We present the first application of the BERTSum model to conversational language.
We generate abstractive summaries of narrated instructional videos across a wide variety of topics.
We envision this being integrated as a feature in intelligent virtual assistants, enabling them to summarize both written and spoken instructional content upon request.
arXiv Detail & Related papers (2020-08-21T20:59:34Z)
- Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer [64.22926988297685]
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP).
In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format.
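To make the text-to-text format concrete, the snippet below shows task-prefixed input/output pairs in the style of the T5 paper's Figure 1; the exact strings are illustrative.

```python
# Every task becomes plain text in, plain text out, distinguished by a prefix.
examples = [
    ("translate English to German: That is good.", "Das ist gut."),
    ("cola sentence: The course is jumping well.", "not acceptable"),
    ("summarize: state authorities dispatched emergency crews tuesday ...",
     "state authorities dispatched emergency crews."),
]
for source, target in examples:
    print(f"{source!r} -> {target!r}")
```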
arXiv Detail & Related papers (2019-10-23T17:37:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences arising from its use.