Text Information Aggregation with Centrality Attention
- URL: http://arxiv.org/abs/2011.07916v1
- Date: Mon, 16 Nov 2020 13:08:48 GMT
- Title: Text Information Aggregation with Centrality Attention
- Authors: Jingjing Gong, Hang Yan, Yining Zheng, Xipeng Qiu and Xuanjing Huang
- Abstract summary: We propose a new way of obtaining aggregation weights, called eigen-centrality self-attention.
We build a fully-connected graph for all the words in a sentence, then compute the eigen-centrality as the attention score of each word.
- Score: 86.91922440508576
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A lot of natural language processing problems need to encode a text
sequence as a fixed-length vector, which usually involves an aggregation process
that combines the representations of all the words, such as pooling or
self-attention. However, these widely used aggregation approaches do not take
higher-order relationships among the words into consideration. Hence, we propose
a new way of obtaining aggregation weights, called eigen-centrality
self-attention. More specifically, we build a fully-connected graph for all the
words in a sentence, then compute the eigen-centrality as the attention score
of each word.
Explicitly modeling the relationships as a graph makes it possible to capture
higher-order dependencies among words, which helps us achieve better results than
baseline models such as pooling, self-attention, and dynamic routing on five text
classification tasks and the SNLI task. Besides, in order to compute the dominant
eigenvector of the graph, we adopt the power method algorithm to obtain the
eigen-centrality measure. Moreover, we also derive an iterative approach to
computing the gradient of the power method, which reduces both memory consumption
and computation requirements.
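To make the aggregation concrete, here is a minimal forward-pass sketch of eigen-centrality attention, not the authors' implementation: it assumes the pairwise affinities come from scaled dot products of the word representations, and the function and parameter names (`eigen_centrality_attention`, `n_iter`) are illustrative.

```python
import numpy as np

def eigen_centrality_attention(H, n_iter=50):
    """H: (seq_len, d) word representations -> (d,) aggregated sentence vector."""
    L, d = H.shape
    # Fully-connected graph over the words: positive pairwise affinities.
    # Subtracting the scalar max only rescales the matrix by a constant
    # factor, so the eigenvectors are unchanged while exp() stays bounded.
    S = H @ H.T / np.sqrt(d)
    A = np.exp(S - S.max())
    # Power method: repeated multiplication converges to the dominant
    # eigenvector of A, whose entries are the eigen-centrality scores
    # (Perron-Frobenius guarantees a positive one for a positive matrix).
    v = np.full(L, 1.0 / L)
    for _ in range(n_iter):
        v = A @ v
        v /= np.linalg.norm(v)   # renormalize to prevent overflow
    alpha = v / v.sum()          # centralities used as attention weights
    return alpha @ H             # weighted aggregation of word vectors

rng = np.random.default_rng(0)
H = rng.normal(size=(7, 16))     # a toy "sentence": 7 words, 16-dim vectors
print(eigen_centrality_attention(H).shape)  # -> (16,)
```

Note that autodiff would unroll this loop and store every intermediate `v` during training; the iterative gradient rule the abstract mentions is the paper's way of avoiding that cost, and it is omitted from this sketch.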
Related papers
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z)
- Conversational Semantic Parsing using Dynamic Context Graphs [68.72121830563906]
We consider the task of conversational semantic parsing over general-purpose knowledge graphs (KGs) with millions of entities and thousands of relation types.
We focus on models which are capable of interactively mapping user utterances into executable logical forms.
arXiv Detail & Related papers (2023-05-04T16:04:41Z)
- ClusterFuG: Clustering Fully connected Graphs by Multicut [20.254912065749956]
In dense multicut, the clustering objective is given in a factorized form as inner products of node feature vectors.
We show how to rewrite classical greedy algorithms for multicut in our dense setting and how to modify them for greater efficiency and solution quality.
arXiv Detail & Related papers (2023-01-28T11:10:50Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Towards Efficient Scene Understanding via Squeeze Reasoning [71.1139549949694]
We propose a novel framework called Squeeze Reasoning.
Instead of propagating information on the spatial map, we first learn to squeeze the input feature into a channel-wise global vector.
We show that our approach can be modularized as an end-to-end trained block and can be easily plugged into existing networks.
arXiv Detail & Related papers (2020-11-06T12:17:01Z)
- Improving Coreference Resolution by Leveraging Entity-Centric Features with Graph Neural Networks and Second-order Inference [12.115691569576345]
Coreferent mentions are usually spread far apart across a text, making it difficult to incorporate entity-level features.
We propose a graph neural network-based coreference resolution method that can capture the entity-centric information.
A global inference algorithm up to second-order features is also presented to optimally cluster mentions into consistent groups.
arXiv Detail & Related papers (2020-09-10T02:22:21Z)
- Gossip and Attend: Context-Sensitive Graph Representation Learning [0.5493410630077189]
Graph representation learning (GRL) is a powerful technique for learning low-dimensional vector representation of high-dimensional and often sparse graphs.
We propose GOAT, a context-sensitive algorithm inspired by gossip communication, with a mutual attention mechanism applied purely over the structure of the graph.
arXiv Detail & Related papers (2020-03-30T18:23:26Z)
- GATCluster: Self-Supervised Gaussian-Attention Network for Image Clustering [9.722607434532883]
We propose a self-supervised clustering network for image clustering (GATCluster).
Rather than extracting intermediate features first and then performing traditional clustering, GATCluster directly outputs semantic cluster labels without further post-processing.
We develop a two-step learning algorithm that is memory-efficient for clustering large-size images.
arXiv Detail & Related papers (2020-02-27T00:57:18Z)
- Quantized Decentralized Stochastic Learning over Directed Graphs [52.94011236627326]
We consider a decentralized learning problem where data points are distributed among computing nodes communicating over a directed graph.
As the model size grows, decentralized learning faces a major bottleneck: the communication load of each node transmitting messages (model updates) to its neighbors.
We propose a quantized decentralized learning algorithm over directed graphs, based on the push-sum algorithm from decentralized consensus optimization.
arXiv Detail & Related papers (2020-02-23T18:25:39Z)
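For context on the push-sum primitive this last entry builds on, here is a minimal sketch of plain (unquantized) push-sum averaging on a directed graph; it is a textbook version for illustration, not that paper's quantized algorithm, and `push_sum_average` / `out_neighbors` are illustrative names.

```python
import numpy as np

def push_sum_average(values, out_neighbors, n_iter=100):
    """values: initial scalar at each node; out_neighbors[i]: nodes that
    node i sends to (directed edges). Returns each node's estimate of the
    global average (requires a strongly connected graph)."""
    n = len(values)
    x = np.array(values, dtype=float)
    w = np.ones(n)
    for _ in range(n_iter):
        nx, nw = np.zeros(n), np.zeros(n)
        for i in range(n):
            # Split mass equally over out-neighbors plus self: the implied
            # mixing matrix is column-stochastic, so the totals of x and w
            # are conserved even though the graph is directed.
            targets = list(out_neighbors[i]) + [i]
            share = 1.0 / len(targets)
            for j in targets:
                nx[j] += share * x[i]
                nw[j] += share * w[i]
        x, w = nx, nw
    return x / w  # de-biased ratios converge to the true average

# Directed ring with one chord: every node converges to mean([1,2,3,4]) = 2.5
print(push_sum_average([1, 2, 3, 4], {0: [1], 1: [2], 2: [3, 0], 3: [0]}))
```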
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.