Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text
Generation
- URL: http://arxiv.org/abs/2010.04383v1
- Date: Fri, 9 Oct 2020 06:03:46 GMT
- Title: Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text
Generation
- Authors: Yan Zhang, Zhijiang Guo, Zhiyang Teng, Wei Lu, Shay B. Cohen, Zuozhu
Liu, Lidong Bing
- Abstract summary: Lightweight Dynamic Graph Convolutional Networks (LDGCNs) are proposed.
LDGCNs capture richer non-local interactions by synthesizing higher-order information from the input graphs.
We develop two novel parameter-saving strategies, based on group graph convolutions and weight-tied convolutions, to reduce memory usage and model complexity.
- Score: 56.73834525802723
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: AMR-to-text generation transduces Abstract Meaning Representation
(AMR) structures into text. A key challenge in this task is to efficiently
learn effective graph representations. Previously, Graph Convolution Networks
(GCNs) were used to encode input AMRs; however, vanilla GCNs cannot capture
non-local information because they follow a local (first-order) information
aggregation scheme. To compensate, larger and deeper GCN models are required
to capture more complex interactions.
In this paper, we introduce a dynamic fusion mechanism, proposing Lightweight
Dynamic Graph Convolutional Networks (LDGCNs) that capture richer non-local
interactions by synthesizing higher-order information from the input graphs. We
further develop two novel parameter-saving strategies, based on group graph
convolutions and weight-tied convolutions, to reduce memory usage and model
complexity. With the help of these strategies, we are able to train a model
with fewer parameters while maintaining the model capacity. Experiments
demonstrate that LDGCNs outperform state-of-the-art models on two benchmark
datasets for AMR-to-text generation with significantly fewer parameters.
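For intuition, here is a minimal, hedged sketch of how the abstract's two ideas
could be realized in PyTorch: an input-dependent gate fuses several adjacency
powers (dynamic fusion over higher-order neighborhoods), a grouped linear
transform cuts the projection's parameter count (group graph convolutions), and
one layer's weights are reused across propagation steps (weight-tied
convolutions). All names and hyperparameters are illustrative assumptions, not
the authors' released implementation.

```python
import torch
import torch.nn as nn


class DynamicFusionGCNLayer(nn.Module):
    """Sketch of a graph convolution that dynamically fuses k-hop
    neighborhood information; an assumption about how LDGCN-style
    fusion could look, not the paper's exact architecture."""

    def __init__(self, dim: int, max_order: int = 3, groups: int = 4):
        super().__init__()
        assert dim % groups == 0
        self.max_order = max_order
        self.groups = groups
        # Grouped linear transform: splitting the channels into `groups`
        # independent blocks divides the projection's parameter count.
        self.group_proj = nn.ModuleList(
            nn.Linear(dim // groups, dim // groups) for _ in range(groups)
        )
        # One scalar gate per order, conditioned on each node's features.
        self.gate = nn.Linear(dim, max_order)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim); adj: (num_nodes, num_nodes), row-normalized.
        hops, h = [], x
        for _ in range(self.max_order):
            h = adj @ h          # one more hop of propagation
            hops.append(h)       # collect the 1st..k-th order views
        # Input-dependent mixture over the k-hop views ("dynamic fusion").
        weights = torch.softmax(self.gate(x), dim=-1)   # (N, max_order)
        fused = sum(w.unsqueeze(-1) * hop
                    for w, hop in zip(weights.unbind(-1), hops))
        # Grouped feature transform (parameter-saving strategy 1).
        chunks = fused.chunk(self.groups, dim=-1)
        out = torch.cat(
            [proj(c) for proj, c in zip(self.group_proj, chunks)], dim=-1
        )
        return torch.relu(out + x)   # residual keeps deep stacks trainable


# Weight tying (parameter-saving strategy 2): reuse one layer's parameters
# across several propagation steps instead of stacking independent layers.
layer = DynamicFusionGCNLayer(dim=128, max_order=3, groups=4)
x = torch.randn(10, 128)
adj = torch.softmax(torch.randn(10, 10), dim=-1)  # stand-in normalized adjacency
for _ in range(3):                                # three tied applications
    x = layer(x, adj)
print(x.shape)  # torch.Size([10, 128])
```

Reusing one layer three times keeps a single layer's parameter count while the
receptive field still grows to nine hops (3 tied applications x 3 orders).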
Related papers
- Language Models are Graph Learners [70.14063765424012]
Language Models (LMs) are challenging the dominance of domain-specific models, including Graph Neural Networks (GNNs) and Graph Transformers (GTs).
We propose a novel approach that empowers off-the-shelf LMs to achieve performance comparable to state-of-the-art GNNs on node classification tasks.
arXiv Detail & Related papers (2024-10-03T08:27:54Z)
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks (a toy sketch of this sampling step follows this entry).
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
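As referenced in the GSPT entry above, here is a toy sketch of sampling node
contexts via unbiased random walks; the function name, graph encoding, and
parameters are illustrative assumptions, not the paper's API.

```python
import random


def random_walk_contexts(adj_list, start, walk_len=8, num_walks=4):
    """Sample node contexts via unbiased random walks (a sketch of the
    context-sampling step GSPT describes; names and parameters are
    illustrative, not the paper's API)."""
    contexts = []
    for _ in range(num_walks):
        walk, node = [start], start
        for _ in range(walk_len - 1):
            neighbors = adj_list.get(node, [])
            if not neighbors:          # dead end: stop this walk early
                break
            node = random.choice(neighbors)
            walk.append(node)
        contexts.append(walk)
    return contexts


# Toy graph: a 0-1-2 chain with a branch from 1 to 3.
adj_list = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1]}
print(random_walk_contexts(adj_list, start=0, walk_len=5, num_walks=2))
```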
- Efficient Topology-aware Data Augmentation for High-Degree Graph Neural Networks [2.7523980737007414]
We propose TADA, an efficient and effective front-mounted data augmentation framework for graph neural networks (GNNs) on high-degree graphs (HDGs).
Under the hood, TADA includes two key modules: (i) feature expansion with structure embeddings, and (ii) topology- and attribute-aware graph sparsification.
TADA considerably improves the predictive performance of mainstream GNN models on 8 real homophilic/heterophilic HDGs in terms of node classification.
arXiv Detail & Related papers (2024-06-08T14:14:19Z)
- Graph Mamba: Towards Learning on Graphs with State Space Models [2.039632659682125]
We present a new class of Graph Neural Networks (GNNs) based on selective State Space Models (SSMs).
The resulting Graph Mamba Networks (GMNs) attain outstanding performance on long-range, small-scale, and large-scale benchmarks.
arXiv Detail & Related papers (2024-02-13T18:58:17Z)
- HINormer: Representation Learning On Heterogeneous Information Networks with Graph Transformer [29.217820912610602]
Graph Transformers (GTs) have been proposed, allowing message passing over a larger range, even across the whole graph.
However, the investigation of GTs on heterogeneous information networks (HINs) remains under-explored.
We propose a novel model named HINormer, which capitalizes on a larger-range aggregation mechanism for node representation learning.
arXiv Detail & Related papers (2023-02-22T12:25:07Z)
- Neighborhood Convolutional Network: A New Paradigm of Graph Neural Networks for Node Classification [12.062421384484812]
The decoupled Graph Convolutional Network (GCN) separates neighborhood aggregation from feature transformation in each convolutional layer.
In this paper, we propose a new paradigm of GCN, termed Neighborhood Convolutional Network (NCN).
In this way, the model inherits the merit of the decoupled GCN for aggregating neighborhood information while developing much more powerful feature learning modules (a minimal sketch of the decoupled design follows this entry).
arXiv Detail & Related papers (2022-11-15T02:02:51Z)
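As referenced in the NCN entry above, here is a minimal sketch of the decoupled
design it builds on: parameter-free neighborhood propagation followed by a
single feature-transforming MLP. The helper name, hop count, and MLP shape are
assumptions for illustration, not the paper's exact model.

```python
import torch
import torch.nn as nn


def decoupled_gcn(x, adj, mlp, hops=3):
    """Decoupled GCN sketch: neighborhood aggregation (parameter-free
    propagation) is separated from feature transformation (an MLP applied
    once afterwards). `hops` and the MLP are illustrative choices."""
    h = x
    for _ in range(hops):
        h = adj @ h        # aggregation only: no weights, no nonlinearity
    return mlp(h)          # transformation only: applied at the end


mlp = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
x = torch.randn(10, 16)                           # 10 nodes, 16 features
adj = torch.softmax(torch.randn(10, 10), dim=-1)  # stand-in normalized adjacency
print(decoupled_gcn(x, adj, mlp).shape)           # torch.Size([10, 4])
```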
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that models the sampling procedure and message passing of GNNs into a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.