A Condense-then-Select Strategy for Text Summarization
- URL: http://arxiv.org/abs/2106.10468v1
- Date: Sat, 19 Jun 2021 10:33:10 GMT
- Title: A Condense-then-Select Strategy for Text Summarization
- Authors: Hou Pong Chan and Irwin King
- Abstract summary: We propose a novel condense-then-select framework for text summarization.
Our framework helps to avoid the loss of salient information, while preserving the high efficiency of sentence-level compression.
- Score: 53.10242552203694
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Select-then-compress is a popular hybrid framework for text summarization
due to its high efficiency. This framework first selects salient sentences and
then independently condenses each of the selected sentences into a concise
version. However, compressing sentences separately ignores the context
information of the document, and is therefore prone to delete salient
information. To address this limitation, we propose a novel
condense-then-select framework for text summarization. Our framework first
concurrently condenses each document sentence. Original document sentences and
their compressed versions then become the candidates for extraction. Finally,
an extractor utilizes the context information of the document to select
candidates and assembles them into a summary. If salient information is deleted
during condensing, the extractor can select an original sentence to retain the
information. Thus, our framework helps to avoid the loss of salient
information, while preserving the high efficiency of sentence-level
compression. Experiment results on the CNN/DailyMail, DUC-2002, and Pubmed
datasets demonstrate that our framework outperforms the select-then-compress
framework and other strong baselines.
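The pipeline described in the abstract can be sketched in a few lines. Note that `compress` and `score` below are illustrative stand-ins (a heuristic parenthetical-dropper and a word-overlap salience score), not the paper's learned sentence compressor or contextual extractor; the sketch only shows how originals and compressed versions form one candidate pool from which the extractor picks at most one version per source sentence.

```python
# Minimal sketch of the condense-then-select pipeline.
# `compress` and `score` are toy stand-ins for the learned models
# described in the paper, not the authors' actual components.

def compress(sentence):
    """Toy compressor: drop parenthetical asides."""
    out, depth = [], 0
    for ch in sentence:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
        elif depth == 0:
            out.append(ch)
    return " ".join("".join(out).split())

def score(candidate, document):
    """Toy salience score: word overlap with the document."""
    doc_words = set(" ".join(document).lower().split())
    cand_words = set(candidate.lower().split())
    return len(cand_words & doc_words) / max(len(cand_words), 1)

def condense_then_select(document, k=2):
    # Step 1: condense every sentence; step 2: pool originals
    # and compressed versions as extraction candidates.
    candidates = []  # (origin sentence index, candidate text)
    for i, sent in enumerate(document):
        candidates.append((i, sent))
        short = compress(sent)
        if short != sent:
            candidates.append((i, short))
    # Step 3: the extractor ranks candidates and keeps at most one
    # version per source sentence; if compression deleted salient
    # content, the original can still be chosen.
    ranked = sorted(candidates, key=lambda c: score(c[1], document),
                    reverse=True)
    summary, used = [], set()
    for i, text in ranked:
        if i not in used:
            used.add(i)
            summary.append((i, text))
        if len(summary) == k:
            break
    summary.sort()  # restore document order
    return [text for _, text in summary]
```

The key design point the sketch preserves is that selection happens after compression, so the extractor sees both versions of every sentence and can fall back to the original when the compressed version loses information.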
Related papers
- Incremental Extractive Opinion Summarization Using Cover Trees [81.59625423421355]
In online marketplaces, user reviews accumulate over time, and opinion summaries need to be updated periodically.
In this work, we study the task of extractive opinion summarization in an incremental setting.
We present an efficient algorithm for accurately computing the CentroidRank summaries in an incremental setting.
arXiv Detail & Related papers (2024-01-16T02:00:17Z) - A General Contextualized Rewriting Framework for Text Summarization [15.311467109946571]
Existing rewriting systems take each extractive sentence as the only input, which is relatively focused but can lose necessary background knowledge and discourse context.
We formalize contextualized rewriting as a seq2seq with group-tag alignments, identifying extractive sentences through content-based addressing.
Results show that our approach significantly outperforms non-contextualized rewriting systems without requiring reinforcement learning.
arXiv Detail & Related papers (2022-07-13T03:55:57Z) - A Survey on Neural Abstractive Summarization Methods and Factual Consistency of Summarization [18.763290930749235]
Summarization is the process of computationally shortening a set of textual data to create a subset (a summary).
Existing summarization methods can be roughly divided into two types: extractive and abstractive.
An extractive summarizer explicitly selects text snippets from the source document, while an abstractive summarizer generates novel text snippets to convey the most salient concepts prevalent in the source.
arXiv Detail & Related papers (2022-04-20T14:56:36Z) - Topic Modeling Based Extractive Text Summarization [0.0]
We propose a novel method to summarize a text document by clustering its contents based on latent topics.
We utilize the lesser used and challenging WikiHow dataset in our approach to text summarization.
arXiv Detail & Related papers (2021-06-29T12:28:19Z) - Extractive Summarization of Call Transcripts [77.96603959765577]
This paper presents an indigenously developed method that combines topic modeling and sentence selection with punctuation restoration in ill-punctuated or un-punctuated call transcripts.
Extensive testing, evaluation and comparisons have demonstrated the efficacy of this summarizer for call transcript summarization.
arXiv Detail & Related papers (2021-03-19T02:40:59Z) - Unsupervised Summarization for Chat Logs with Topic-Oriented Ranking and Context-Aware Auto-Encoders [59.038157066874255]
We propose a novel framework called RankAE to perform chat summarization without employing manually labeled data.
RankAE consists of a topic-oriented ranking strategy that selects topic utterances according to centrality and diversity simultaneously.
A denoising auto-encoder is designed to generate succinct but context-informative summaries based on the selected utterances.
arXiv Detail & Related papers (2020-12-14T07:31:17Z) - Extractive Summarization as Text Matching [123.09816729675838]
This paper creates a paradigm shift with regard to the way we build neural extractive summarization systems.
We formulate the extractive summarization task as a semantic text matching problem.
We have driven the state-of-the-art extractive result on CNN/DailyMail to a new level (44.41 in ROUGE-1).
arXiv Detail & Related papers (2020-04-19T08:27:57Z) - Selective Attention Encoders by Syntactic Graph Convolutional Networks for Document Summarization [21.351111598564987]
We propose a graph to connect the parsing trees from the sentences in a document and utilize the stacked graph convolutional networks (GCNs) to learn the syntactic representation for a document.
The proposed GCNs based selective attention approach outperforms the baselines and achieves the state-of-the-art performance on the dataset.
arXiv Detail & Related papers (2020-03-18T01:30:02Z) - Learning to Select Bi-Aspect Information for Document-Scale Text Content Manipulation [50.01708049531156]
We focus on a new practical task, document-scale text content manipulation, which is the opposite of text style transfer.
In detail, the input is a set of structured records and a reference text for describing another recordset.
The output is a summary that accurately describes the partial content in the source recordset in the same writing style as the reference.
arXiv Detail & Related papers (2020-02-24T12:52:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.