MrGCN: Mirror Graph Convolution Network for Relation Extraction with Long-Term Dependencies
- URL: http://arxiv.org/abs/2101.00124v1
- Date: Fri, 1 Jan 2021 00:52:53 GMT
- Title: MrGCN: Mirror Graph Convolution Network for Relation Extraction with Long-Term Dependencies
- Authors: Xiao Guo, I-Hung Hsu, Wael AbdAlmageed, Premkumar Natarajan, Nanyun Peng
- Abstract summary: In relation extraction, dependency trees that contain rich syntactic clues have been widely used to help capture long-term dependencies in text.
We propose the Mirror Graph Convolution Network (MrGCN), a GNN model with pooling-unpooling structures tailored to relation extraction.
Experiments on two datasets demonstrate the effectiveness of our method, showing significant improvements over previous results.
- Score: 32.27755470353054
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The ability to capture complex linguistic structures and long-term
dependencies among words in the passage is essential for many natural language
understanding tasks. In relation extraction, dependency trees that contain rich
syntactic clues have been widely used to help capture long-term dependencies in
text. Graph neural networks (GNNs), one of the means to encode dependency
graphs, have been shown effective in several prior works. However, relatively
little attention has been paid to the receptive fields of GNNs, which can be
crucial in tasks with extremely long text that go beyond single sentences and
require discourse analysis. In this work, we leverage the idea of graph pooling
and propose the Mirror Graph Convolution Network (MrGCN), a GNN model with
pooling-unpooling structures tailored to relation extraction. The pooling
branch reduces the graph size and enables the GCN to obtain larger receptive
fields with fewer layers; the unpooling branch restores the pooled graph to
its original resolution such that token-level relation extraction can be
performed. Experiments on two datasets demonstrate the effectiveness of our
method, showing significant improvements over previous results.
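To make the pooling-unpooling idea concrete, below is a minimal NumPy sketch of a U-shaped GCN. It is an illustration under stated assumptions, not the authors' exact architecture: the dense adjacency matrices, the mean-pooling rule, and the fixed pair-merging clustering are all illustrative choices.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric GCN normalization: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(A_norm, H, W):
    """One graph convolution: aggregate neighbors, project, ReLU."""
    return np.maximum(A_norm @ H @ W, 0.0)

def pool(A, H, assign):
    """Coarsen the graph. `assign` maps each node to a cluster id;
    cluster features are means of member nodes, and two clusters are
    connected if any of their members were connected."""
    n, k = A.shape[0], assign.max() + 1
    S = np.zeros((n, k))
    S[np.arange(n), assign] = 1.0
    H_coarse = (S / S.sum(axis=0)).T @ H
    A_coarse = ((S.T @ A @ S) > 0).astype(float)
    np.fill_diagonal(A_coarse, 0.0)
    return A_coarse, H_coarse, S

def unpool(H_coarse, S):
    """Mirror of `pool`: copy each cluster's feature back to the
    original nodes it absorbed, restoring token-level resolution."""
    return S @ H_coarse

rng = np.random.default_rng(0)
n, d = 6, 8
# toy graph: a path over 6 tokens standing in for a dependency graph
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
H = rng.normal(size=(n, d))
W1, W2, W3 = (rng.normal(size=(d, d)) for _ in range(3))

# encoder: convolve at full resolution, then merge adjacent token pairs
A_norm = normalize_adj(A)
H1 = gcn_layer(A_norm, H, W1)
A_c, H_c, S = pool(A, H1, np.arange(n) // 2)   # illustrative clustering

# bottleneck: one layer on the coarse graph now spans two hops of the
# original graph, which is the receptive-field gain the abstract describes
H_c = gcn_layer(normalize_adj(A_c), H_c, W2)

# decoder: unpool back to tokens, add a residual from the encoder side
# (the "mirror" connection), and convolve once more
H_out = gcn_layer(A_norm, unpool(H_c, S) + H1, W3)
print(H_out.shape)   # (6, 8): token-level features for relation extraction
```

Repeating the pooling step shrinks the graph geometrically, so a fixed number of convolutions can cover much longer dependency paths than the same number of layers at full resolution.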
Related papers
- Relating-Up: Advancing Graph Neural Networks through Inter-Graph Relationships [17.978546172777342]
Graph Neural Networks (GNNs) have excelled in learning from graph-structured data.
Despite these successes, GNNs are limited because they neglect relationships that hold across graphs.
We introduce Relating-Up, a plug-and-play module that enhances GNNs by exploiting inter-graph relationships.
arXiv Detail & Related papers (2024-05-07T02:16:54Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator (a minimal sketch of the Gumbel-Softmax estimator appears after this list).
Experiments demonstrate the promising efficacy of the method in various tasks, including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Search to Capture Long-range Dependency with Stacking GNNs for Graph Classification [41.84399177525008]
Shallow GNNs are more common due to the well-known over-smoothing problem that deeper GNNs face.
We propose a novel approach with the help of neural architecture search (NAS), dubbed LRGNN (Long-Range Graph Neural Networks).
arXiv Detail & Related papers (2023-02-17T03:40:17Z)
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We introduce and justify two weaknesses of implicit GNNs: the constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their lack of ability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
- Affinity-Aware Graph Networks [9.888383815189176]
Graph Neural Networks (GNNs) have emerged as a powerful technique for learning on relational data.
We explore the use of affinity measures, such as effective resistance and hitting times, as features in graph neural networks (a minimal effective-resistance sketch appears at the end of this list).
We propose message passing networks based on these features and evaluate their performance on a variety of node and graph property prediction tasks.
arXiv Detail & Related papers (2022-06-23T18:51:35Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
The searched operations can extract hierarchical node/relation information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Hierarchical graph neural nets can capture long-range interactions [8.067880298298185]
We study hierarchical message passing models that leverage a multi-resolution representation of a given graph.
This facilitates learning of features that span large receptive fields without loss of local information.
We introduce Hierarchical Graph Net (HGNet), which guarantees, for any two connected nodes, the existence of message-passing paths of at most logarithmic length.
arXiv Detail & Related papers (2021-07-15T16:24:22Z)
- Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks [68.9026534589483]
RioGNN is a novel Reinforced, recursive, and flexible neighborhood-selection-guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embeddings with enhanced explainability, owing to its recognition of the individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z)
- Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results [58.277290855841976]
We study tradeoffs between the computational cost and the expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, thereby overcoming a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z)
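Several of the mechanisms named above are compact enough to sketch. For instance, the NodeFormer entry attributes its efficiency to a kernelized Gumbel-Softmax operator; the sketch below shows only the standard Gumbel-Softmax estimator that underlies it (the kernelized attention approximation is not reproduced, and all variable names are illustrative).

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Differentiable, approximately one-hot sample from a categorical
    distribution: add Gumbel(0, 1) noise to the logits, then apply a
    temperature-scaled softmax. As tau -> 0 the samples approach hard
    one-hot vectors; larger tau gives smoother, lower-variance samples."""
    rng = rng or np.random.default_rng()
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)

# toy use: every node softly "samples" which of the other nodes to attend
# to, keeping all-pair message passing differentiable end to end
rng = np.random.default_rng(0)
n, d = 5, 4
H = rng.normal(size=(n, d))                           # node features
weights = gumbel_softmax(H @ H.T, tau=0.5, rng=rng)   # all-pair scores
messages = weights @ H                                # aggregated features
print(messages.shape)                                 # (5, 4)
```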
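The Affinity-Aware entry is similarly concrete: effective resistance, one of the affinity measures it mentions, can be computed from the pseudoinverse of the graph Laplacian. A minimal sketch follows; the paper's exact feature set may differ.

```python
import numpy as np

def effective_resistance(A):
    """Pairwise effective resistance from the graph Laplacian L:
    R[u, v] = L+[u, u] + L+[v, v] - 2 * L+[u, v], with L+ the
    Moore-Penrose pseudoinverse. Well-connected node pairs get small
    values, making R a natural affinity feature for a GNN."""
    L = np.diag(A.sum(axis=1)) - A
    L_pinv = np.linalg.pinv(L)
    diag = np.diag(L_pinv)
    return diag[:, None] + diag[None, :] - 2.0 * L_pinv

# toy graph: a triangle (nodes 0-2) with a pendant node 3 attached to 2
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
R = effective_resistance(A)
print(round(R[0, 1], 3))   # 0.667: two parallel paths inside the triangle
print(round(R[2, 3], 3))   # 1.0: the pendant edge is the only route
```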