VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?
- URL: http://arxiv.org/abs/2110.04257v1
- Date: Fri, 8 Oct 2021 17:10:31 GMT
- Title: VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?
- Authors: Hieu Nguyen, Long Phan, James Anibal, Alec Peltekian, Hieu Tran
- Abstract summary: We investigate the robustness of transformer-based encoder-decoder architectures for Vietnamese abstractive summarization.
We evaluate the performance of these methods on two Vietnamese datasets.
- Score: 1.1379578593538398
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Text summarization is a challenging natural language processing
task that involves generating a concise text from a lengthy input sequence.
While this task has been widely studied for English, research on summarization
of Vietnamese text remains very limited. In this paper, we investigate the
robustness of transformer-based encoder-decoder architectures for Vietnamese
abstractive summarization. Leveraging transfer learning and self-supervised
learning, we evaluate the performance of these methods on two Vietnamese
datasets.
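
The setup the abstract describes, a pretrained transformer encoder-decoder fine-tuned via transfer learning for abstractive summarization, can be illustrated with a short sketch. The checkpoint (google/mt5-small), the toy Vietnamese document, and the hyperparameters below are assumptions chosen for illustration; the listing does not specify the paper's exact models or datasets.

```python
# Minimal sketch: fine-tune a pretrained multilingual encoder-decoder on a
# (document, summary) pair, then generate a summary with beam search.
# The checkpoint and the toy data are assumptions, not the paper's setup.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "google/mt5-small"  # assumed multilingual seq2seq checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Hypothetical Vietnamese (document, summary) pair standing in for a dataset.
document = "Hà Nội là thủ đô của Việt Nam, nằm ở miền Bắc đất nước."
summary = "Hà Nội là thủ đô Việt Nam."

inputs = tokenizer(document, truncation=True, max_length=512,
                   return_tensors="pt")
labels = tokenizer(text_target=summary, truncation=True, max_length=64,
                   return_tensors="pt").input_ids

# One transfer-learning step: teacher-forced cross-entropy over the
# reference summary tokens.
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()

# Inference: abstractive summary via beam search over the decoder.
model.eval()
with torch.no_grad():
    ids = model.generate(**inputs, num_beams=4, max_length=64)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```

In practice the same loop would run over full training sets (with padded labels masked to -100), which is the standard way such encoder-decoder models are adapted to a new language and task.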