An entity-guided text summarization framework with relational
heterogeneous graph neural network
- URL: http://arxiv.org/abs/2302.03205v1
- Date: Tue, 7 Feb 2023 02:27:21 GMT
- Title: An entity-guided text summarization framework with relational
heterogeneous graph neural network
- Authors: Jingqiang Chen
- Abstract summary: Two crucial issues for text summarization are making use of knowledge beyond the text and exploiting cross-sentence relations within the text.
This paper addresses both issues by leveraging entities mentioned in the text to connect a GNN and a KG for summarization.
- Score: 0.76146285961466
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Two crucial issues for text summarization to generate faithful summaries are
making use of knowledge beyond the text and exploiting cross-sentence relations
within the text. Intuitive tools for these two issues are Knowledge Graphs (KGs)
and Graph Neural Networks (GNNs), respectively. Entities are semantic units shared
by text and KGs. This paper addresses both issues by leveraging entities mentioned
in the text to connect the GNN and the KG for summarization. Firstly, entities are
used to construct a sentence-entity graph with weighted multi-type edges that
models sentence relations, and a relational heterogeneous GNN is proposed to
compute node encodings for summarization. Secondly, entities are used to link the
graph to the KG and collect knowledge. Thirdly, entities guide a two-step
summarization framework that uses a multi-task selector to select salient
sentences and entities and an entity-focused abstractor to compress the selected
sentences. The GNN is connected with the KG by building the graph's entity-entity
edges from the KG, initializing entity embeddings from the KG, and training the
entity embeddings through the entity-entity edges. The relational heterogeneous
GNN exploits both edge weights and edge types to encode graphs with weighted
multi-type edges. Experiments show that the proposed method outperforms extractive
baselines, including the HGNN-based HGNNSum, and abstractive baselines, including
the entity-driven SENECA, on CNN/DM, and outperforms most baselines on NYT50.
Experiments on sub-datasets show that the density of sentence-entity edges
strongly influences performance: the greater the density, the better the
performance. Ablation studies confirm the effectiveness of the method.
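To make the graph encoding concrete, below is a minimal PyTorch sketch of one relational heterogeneous GNN layer over a sentence-entity graph with weighted, multi-type edges. The specific edge types, the relation-specific projections, and the weighted-mean aggregation are illustrative assumptions based on the abstract, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

EDGE_TYPES = ("sent-sent", "sent-ent", "ent-ent")  # assumed relation types

class RelHeteroGNNLayer(nn.Module):
    """One message-passing layer aware of both edge type and edge weight (sketch)."""

    def __init__(self, dim: int):
        super().__init__()
        # One projection per edge type, so messages are relation-specific.
        self.rel_proj = nn.ModuleDict({t: nn.Linear(dim, dim) for t in EDGE_TYPES})
        self.self_proj = nn.Linear(dim, dim)
        self.act = nn.ReLU()

    def forward(self, h, edges):
        # h: (num_nodes, dim) encodings; sentence and entity nodes share one table.
        # edges: list of (src, dst, edge_type, weight); a weight could reflect,
        # e.g., how often an entity is mentioned in a sentence.
        agg = torch.zeros_like(h)
        norm = h.new_zeros((h.size(0), 1))
        for src, dst, etype, w in edges:
            agg[dst] = agg[dst] + self.rel_proj[etype](h[src]) * w
            norm[dst] = norm[dst] + w
        agg = agg / norm.clamp(min=1e-6)          # weighted mean over typed neighbors
        return self.act(self.self_proj(h) + agg)  # self term plus aggregated messages

# Toy usage: 4 sentence nodes (0-3) and 2 entity nodes (4-5). Sentence encodings
# would come from a text encoder; entity encodings would be initialized from KG
# embeddings, with entity-entity edges taken from the KG.
h = torch.randn(6, 128)
edges = [(0, 4, "sent-ent", 2.0), (4, 0, "sent-ent", 2.0),
         (4, 5, "ent-ent", 1.0), (1, 0, "sent-sent", 1.0)]
layer = RelHeteroGNNLayer(128)
h_new = layer(h, edges)  # refined sentence and entity encodings
```

In the described framework, the refined sentence and entity encodings would then feed the multi-task selector and the entity-focused abstractor; those stages are not sketched here.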
Related papers
- EntailE: Introducing Textual Entailment in Commonsense Knowledge Graph
Completion [54.12709176438264]
Commonsense knowledge graphs (CSKGs) utilize free-form text to represent named entities, short phrases, and events as their nodes.
Current methods leverage semantic similarities to increase graph density, but the semantic plausibility of the nodes and their relations is under-explored.
We propose to adopt textual entailment to find implicit entailment relations between CSKG nodes, to effectively densify the subgraph connecting nodes within the same conceptual class.
arXiv Detail & Related papers (2024-02-15T02:27:23Z)
- Article Classification with Graph Neural Networks and Multigraphs [0.12499537119440243]
We propose a method to enhance the performance of article classification by enriching simple Graph Neural Network (GNN) pipelines with multi-graph representations.
Fully supervised transductive node classification experiments are conducted on the Open Graph Benchmark OGBN-arXiv dataset and the PubMed diabetes dataset.
Results demonstrate that multi-graphs consistently improve the performance of a variety of GNN models compared to the default graphs.
arXiv Detail & Related papers (2023-09-20T14:18:04Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph
Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It integrates high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- SubGraph Networks based Entity Alignment for Cross-lingual Knowledge
Graph [7.892065498202909]
We introduce the subgraph network (SGN) method into the GCN-based cross-lingual KG entity alignment method.
Experiments show that the proposed method outperforms the state-of-the-art GCN-based method.
arXiv Detail & Related papers (2022-05-07T05:13:15Z)
- Training Free Graph Neural Networks for Graph Matching [103.45755859119035]
TFGM is a framework for boosting the performance of Graph Neural Network (GNN)-based graph matching without training.
Applying TFGM on various GNNs shows promising improvements over baselines.
arXiv Detail & Related papers (2022-01-14T09:04:46Z)
- CoRGi: Content-Rich Graph Neural Networks with Attention [12.339385456449659]
We present CoRGi, a graph neural network that considers the rich data within nodes in the context of their neighbors.
This is achieved by endowing CoRGi's message passing with a personalized attention mechanism over the content of each node.
We show that CoRGi makes better edge-value predictions than existing methods, especially in sparse regions of the graph.
arXiv Detail & Related papers (2021-10-10T17:54:30Z)
- Learning Intents behind Interactions with Knowledge Graph for
Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, the Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- Factorizable Graph Convolutional Networks [90.59836684458905]
We introduce a novel graph convolutional network (GCN) that explicitly disentangles intertwined relations encoded in a graph.
FactorGCN takes a simple graph as input, and disentangles it into several factorized graphs.
We evaluate the proposed FactorGCN both qualitatively and quantitatively on the synthetic and real-world datasets.
arXiv Detail & Related papers (2020-10-12T03:01:40Z)
- Improving Coreference Resolution by Leveraging Entity-Centric Features
with Graph Neural Networks and Second-order Inference [12.115691569576345]
Coreferent mentions are usually spread far apart across a text, making it difficult to incorporate entity-level features.
We propose a graph neural network-based coreference resolution method that can capture the entity-centric information.
A global inference algorithm up to second-order features is also presented to optimally cluster mentions into consistent groups.
arXiv Detail & Related papers (2020-09-10T02:22:21Z)
- Iterative Context-Aware Graph Inference for Visual Dialog [126.016187323249]
We propose a novel Context-Aware Graph (CAG) neural network.
Each node in the graph corresponds to a joint semantic feature, including both object-based (visual) and history-related (textual) context representations.
arXiv Detail & Related papers (2020-04-05T13:09:37Z)