Unsupervised Extractive Summarization with Heterogeneous Graph
Embeddings for Chinese Document
- URL: http://arxiv.org/abs/2211.04698v1
- Date: Wed, 9 Nov 2022 06:07:31 GMT
- Title: Unsupervised Extractive Summarization with Heterogeneous Graph
Embeddings for Chinese Document
- Authors: Chen Lin, Ye Liu, Siyu An, Di Yin
- Abstract summary: We propose an unsupervised extractive summarization method with heterogeneous graph embeddings (HGEs) for Chinese documents.
Experimental results demonstrate that our method consistently outperforms the strong baseline on three summarization datasets.
- Score: 5.9630342951482085
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the scenario of unsupervised extractive summarization, learning
high-quality sentence representations is essential to select salient sentences
from the input document. Previous studies focus more on employing statistical
approaches or pre-trained language models (PLMs) to extract sentence
embeddings, while ignoring the rich information inherent in the heterogeneous
types of interaction between words and sentences. In this paper, we are the
first to propose an unsupervised extractive summarization method with
heterogeneous graph embeddings (HGEs) for Chinese documents. A heterogeneous
text graph is constructed to capture different granularities of interactions by
incorporating graph structural information. Moreover, our proposed graph is
general and flexible, so additional nodes such as keywords can be easily
integrated. Experimental results demonstrate that our method consistently
outperforms the strong baseline on three summarization datasets.
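To make the general idea concrete, here is a minimal sketch, not the authors' code or exact architecture: a bipartite word-sentence graph with TF-IDF edge weights stands in for the heterogeneous text graph, a few rounds of neighbor aggregation produce graph-aware sentence embeddings, and sentences are ranked by centrality against the document embedding. All function names are illustrative assumptions, and a proper Chinese tokenizer would replace the whitespace split used below.

```python
# Minimal sketch (not the authors' implementation): build a bipartite
# word-sentence graph with TF-IDF edge weights, propagate features over the
# graph to get graph-aware sentence embeddings, and rank sentences by
# centrality with respect to the document.
import math
from collections import Counter

import numpy as np


def build_word_sentence_graph(sentences):
    """TF-IDF weighted edges between sentence nodes and word nodes."""
    tokenized = [s.split() for s in sentences]   # stand-in for a Chinese tokenizer
    df = Counter(w for toks in tokenized for w in set(toks))
    vocab = {w: j for j, w in enumerate(df)}
    n_sent = len(sentences)
    weights = np.zeros((n_sent, len(vocab)))     # rows: sentences, cols: words
    for i, toks in enumerate(tokenized):
        for w, tf in Counter(toks).items():
            weights[i, vocab[w]] = tf * math.log(n_sent / df[w] + 1.0)
    return weights


def sentence_embeddings(weights, dim=64, hops=2, seed=0):
    """Aggregate random word features over the graph for a few hops."""
    rng = np.random.default_rng(seed)
    word_feat = rng.normal(size=(weights.shape[1], dim))
    sent_feat = weights @ word_feat              # sentence <- its words
    for _ in range(hops - 1):
        word_feat = weights.T @ sent_feat        # word <- sentences containing it
        sent_feat = weights @ word_feat          # sentence <- its words
    return sent_feat / (np.linalg.norm(sent_feat, axis=1, keepdims=True) + 1e-8)


def top_k_salient(sentences, k=3):
    """Score sentences by similarity to the mean document embedding."""
    emb = sentence_embeddings(build_word_sentence_graph(sentences))
    scores = emb @ emb.mean(axis=0)
    return [sentences[i] for i in np.argsort(-scores)[:k]]
```

In the paper's actual setting the graph is heterogeneous (word, sentence, and optionally keyword nodes) and the aggregation is learned; the sketch only illustrates why graph structure can enrich unsupervised sentence representations.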
Related papers
- Subgraph Retrieval Enhanced by Graph-Text Alignment for Commonsense Question Answering [6.9841561321072465]
Commonsense question answering is a crucial task that requires machines to reason with commonsense knowledge.
Previous studies predominantly employ an extracting-and-modeling paradigm to harness the information in the knowledge graph (KG).
We propose a novel framework: Subgraph Retrieval Enhanced by Graph-Text Alignment, named SEPTA.
arXiv Detail & Related papers (2024-11-11T10:57:31Z) - Scientific Paper Extractive Summarization Enhanced by Citation Graphs [50.19266650000948]
We focus on leveraging citation graphs to improve scientific paper extractive summarization under different settings.
Preliminary results demonstrate that the citation graph is helpful even in a simple unsupervised framework.
Motivated by this, we propose a Graph-based Supervised Summarization model (GSS) to achieve more accurate results on the task when large-scale labeled data are available.
arXiv Detail & Related papers (2022-12-08T11:53:12Z) - Hierarchical Heterogeneous Graph Representation Learning for Short Text
Classification [60.233529926965836]
We propose a new method called SHINE, based on graph neural networks (GNNs), for short text classification.
First, we model the short text dataset as a hierarchical heterogeneous graph consisting of word-level component graphs.
Then, we dynamically learn a short document graph that facilitates effective label propagation among similar short texts.
arXiv Detail & Related papers (2021-10-30T05:33:05Z) - SgSum: Transforming Multi-document Summarization into Sub-graph
Selection [27.40759123902261]
Most existing extractive multi-document summarization (MDS) methods score each sentence individually and extract salient sentences one by one to compose a summary.
We propose a novel MDS framework (SgSum) to formulate the MDS task as a sub-graph selection problem.
Our model can produce significantly more coherent and informative summaries compared with traditional MDS methods.
arXiv Detail & Related papers (2021-10-25T05:12:10Z) - HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text
Extractive Summarization [57.798070356553936]
HETFORMER is a Transformer-based pre-trained model with multi-granularity sparse attentions for extractive summarization.
Experiments on both single- and multi-document summarization tasks show that HETFORMER achieves state-of-the-art performance in Rouge F1.
arXiv Detail & Related papers (2021-10-12T22:42:31Z) - Multiplex Graph Neural Network for Extractive Text Summarization [34.185093491514394]
Extractive text summarization aims at extracting the most representative sentences from a given document as its summary.
We propose a novel Multiplex Graph Convolutional Network (Multi-GCN) to jointly model different types of relationships among sentences and words.
Based on Multi-GCN, we propose a Multiplex Graph Summarization (Multi-GraS) model for extractive text summarization.
arXiv Detail & Related papers (2021-08-29T16:11:01Z) - Relation Clustering in Narrative Knowledge Graphs [71.98234178455398]
Relational sentences in the original text are embedded (with SBERT) and clustered in order to merge semantically similar relations (see the sketch after this list).
Preliminary tests show that such clustering might successfully detect similar relations and provide a valuable preprocessing for semi-supervised approaches.
arXiv Detail & Related papers (2020-11-27T10:43:04Z) - Leveraging Graph to Improve Abstractive Multi-Document Summarization [50.62418656177642]
We develop a neural abstractive multi-document summarization (MDS) model which can leverage well-known graph representations of documents.
Our model utilizes graphs to encode documents in order to capture cross-document relations, which is crucial to summarizing long documents.
Our model can also take advantage of graphs to guide the summary generation process, which is beneficial for generating coherent and concise summaries.
arXiv Detail & Related papers (2020-05-20T13:39:47Z) - Heterogeneous Graph Neural Networks for Extractive Document
Summarization [101.17980994606836]
Modeling cross-sentence relations is a crucial step in extractive document summarization.
We present a graph-based neural network for extractive summarization (HeterSumGraph).
We introduce different types of nodes into graph-based neural networks for extractive document summarization.
arXiv Detail & Related papers (2020-04-26T14:38:11Z) - Selective Attention Encoders by Syntactic Graph Convolutional Networks
for Document Summarization [21.351111598564987]
We propose a graph to connect the parsing trees from the sentences in a document and utilize the stacked graph convolutional networks (GCNs) to learn the syntactic representation for a document.
The proposed GCN-based selective attention approach outperforms the baselines and achieves state-of-the-art performance on the dataset.
arXiv Detail & Related papers (2020-03-18T01:30:02Z)
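As referenced in the relation-clustering entry above, the following is a minimal sketch of the SBERT-plus-clustering idea, assuming the sentence-transformers and scikit-learn packages are available; the checkpoint name, example sentences, and distance threshold are illustrative and not taken from the paper.

```python
# Minimal sketch: embed relational sentences with an SBERT checkpoint and
# group semantically similar ones with agglomerative clustering.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import AgglomerativeClustering

relational_sentences = [
    "Alice founded the company in 1998.",
    "The company was established by Alice.",
    "Bob joined the board in 2005.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(relational_sentences, normalize_embeddings=True)

# Merge sentences whose cosine distance falls below a chosen threshold.
# (Older scikit-learn versions take `affinity=` instead of `metric=`.)
clustering = AgglomerativeClustering(
    n_clusters=None, distance_threshold=0.4, metric="cosine", linkage="average"
).fit(embeddings)

for label, sentence in zip(clustering.labels_, relational_sentences):
    print(label, sentence)
```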