LAGr: Label Aligned Graphs for Better Systematic Generalization in
Semantic Parsing
- URL: http://arxiv.org/abs/2205.09607v1
- Date: Thu, 19 May 2022 15:01:37 GMT
- Title: LAGr: Label Aligned Graphs for Better Systematic Generalization in
Semantic Parsing
- Authors: Dora Jambor and Dzmitry Bahdanau
- Abstract summary: We show that better systematic generalization can be achieved by producing the meaning representation directly as a graph and not as a sequence.
We propose LAGr, a general framework to produce semantic parses by independently predicting node and edge labels for a complete multi-layer input-aligned graph.
Experiments demonstrate that LAGr achieves significant improvements in systematic generalization upon the baseline seq2seq parsers in both strongly- and weakly-supervised settings.
- Score: 7.2484012208081205
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Semantic parsing is the task of producing structured meaning representations
for natural language sentences. Recent research has pointed out that the
commonly-used sequence-to-sequence (seq2seq) semantic parsers struggle to
generalize systematically, i.e. to handle examples that require recombining
known knowledge in novel settings. In this work, we show that better systematic
generalization can be achieved by producing the meaning representation directly
as a graph and not as a sequence. To this end we propose LAGr (Label Aligned
Graphs), a general framework to produce semantic parses by independently
predicting node and edge labels for a complete multi-layer input-aligned graph.
The strongly-supervised LAGr algorithm requires aligned graphs as inputs,
whereas weakly-supervised LAGr infers alignments for originally unaligned
target graphs using approximate maximum-a-posteriori inference. Experiments
demonstrate that LAGr achieves significant improvements in systematic
generalization upon the baseline seq2seq parsers in both strongly- and
weakly-supervised settings.
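To make the setup concrete, below is a minimal PyTorch sketch of predicting node and edge labels for a complete multi-layer input-aligned graph. Every name and size here (`LAGrSketch`, `num_graph_layers`, the bilinear edge scorer) is an illustrative assumption inferred from the abstract, not the authors' implementation.

```python
import torch
import torch.nn as nn

class LAGrSketch(nn.Module):
    """Sketch of label-aligned graph prediction (illustrative, not the paper's code).

    For T input tokens we predict a complete graph over K * T nodes
    (K graph "layers" aligned with the input): a label for every node
    (including a null label for absent nodes) and a label for every
    ordered node pair (including a no-edge label), all independently.
    """

    def __init__(self, hidden=256, num_graph_layers=2,
                 num_node_labels=100, num_edge_labels=20, vocab_size=1000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        enc_layer = nn.TransformerEncoderLayer(hidden, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # a learned offset per graph layer so aligned nodes get distinct states
        self.layer_offsets = nn.Parameter(torch.randn(num_graph_layers, hidden))
        self.node_head = nn.Linear(hidden, num_node_labels)
        self.src_proj = nn.Linear(hidden, hidden)
        self.tgt_proj = nn.Linear(hidden, hidden)
        self.edge_head = nn.Bilinear(hidden, hidden, num_edge_labels)

    def forward(self, tokens):                        # tokens: (B, T)
        h = self.encoder(self.embed(tokens))          # (B, T, H)
        B, T, H = h.shape
        # replicate encoder states across the K graph layers: (B, K*T, H)
        nodes = (h.unsqueeze(1) + self.layer_offsets[None, :, None, :]).reshape(B, -1, H)
        N = nodes.size(1)
        node_logits = self.node_head(nodes)           # (B, N, num_node_labels)
        # score every ordered pair of nodes independently
        src = self.src_proj(nodes).unsqueeze(2).expand(B, N, N, H).contiguous()
        tgt = self.tgt_proj(nodes).unsqueeze(1).expand(B, N, N, H).contiguous()
        edge_logits = self.edge_head(src, tgt)        # (B, N, N, num_edge_labels)
        return node_logits, edge_logits
```

In the strongly-supervised setting, the node and edge logits would be trained with independent cross-entropy losses against the aligned target graph. For the weakly-supervised setting, one way to approximate the abstract's maximum-a-posteriori alignment of unaligned target nodes, considering node labels only, is a minimum-cost bipartite matching; the following is a sketch of that general idea, not the paper's actual inference procedure:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def align_targets(node_log_probs, target_labels):
    """Map each unaligned target node to a graph position (illustrative only).

    node_log_probs: (num_positions, num_node_labels) predicted log-probabilities,
                    with num_positions >= len(target_labels).
    target_labels:  label id of each target-graph node.
    """
    cost = -node_log_probs[:, target_labels]          # (positions, targets)
    pos, tgt = linear_sum_assignment(cost)            # Hungarian min-cost matching
    return {int(t): int(p) for p, t in zip(pos, tgt)}
```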
Related papers
- Subgraph Aggregation for Out-of-Distribution Generalization on Graphs [29.884717215947745]
Out-of-distribution (OOD) generalization in Graph Neural Networks (GNNs) has gained significant attention.
We propose a novel framework, SubGraph Aggregation (SuGAr), designed to learn a diverse set of subgraphs.
Experiments on both synthetic and real-world datasets demonstrate that SuGAr outperforms state-of-the-art methods.
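A rough sketch of what learning and aggregating a diverse set of subgraphs could look like; the edge-dropout sampler and the toy GNN below are assumptions inferred from this summary, not SuGAr's algorithm:

```python
import torch
import torch.nn as nn

class TinyGNN(nn.Module):
    """One-layer message passing over a dense float adjacency (toy stand-in GNN)."""
    def __init__(self, in_dim=16, hidden=32, num_classes=4):
        super().__init__()
        self.lin = nn.Linear(in_dim, hidden)
        self.out = nn.Linear(hidden, num_classes)

    def forward(self, x, adj):                 # x: (N, F), adj: (N, N)
        h = torch.relu(self.lin(adj @ x))      # aggregate neighbor features
        return self.out(h.mean(dim=0))         # graph-level logits

def subgraph_aggregate_predict(gnn, x, adj, num_subgraphs=8, keep_edges=0.7):
    """Hypothetical subgraph-aggregation scheme: sample diverse subgraphs by
    random edge dropout, score each with a shared GNN, average predictions."""
    preds = []
    for _ in range(num_subgraphs):
        mask = (torch.rand_like(adj) < keep_edges).float()
        preds.append(gnn(x, adj * mask))
    return torch.stack(preds).mean(dim=0)      # aggregated graph prediction
```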
arXiv Detail & Related papers (2024-10-29T16:54:37Z)
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
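A minimal sketch of sampling node contexts through random walks, as this summary describes; the parameter names (`walk_len`, `walks_per_node`) are assumptions, not GSPT's API:

```python
import random

def random_walk_contexts(adj, walk_len=8, walks_per_node=4):
    """Sample node contexts via random walks (illustrative sketch).

    `adj` maps a node id to its neighbor list; each walk becomes a
    token sequence that a Transformer could consume for pretraining."""
    contexts = []
    for start in adj:
        for _ in range(walks_per_node):
            walk, node = [start], start
            for _ in range(walk_len - 1):
                if not adj[node]:
                    break                      # dead end: stop the walk early
                node = random.choice(adj[node])
                walk.append(node)
            contexts.append(walk)
    return contexts

# e.g. random_walk_contexts({0: [1], 1: [0, 2], 2: [1]})
```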
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- Two Heads Are Better Than One: Boosting Graph Sparse Training via Semantic and Topological Awareness [80.87683145376305]
Graph Neural Networks (GNNs) excel in various graph learning tasks but face computational challenges when applied to large-scale graphs.
We propose Graph Sparse Training (GST), which dynamically manipulates sparsity at the data level.
GST produces a sparse graph with maximum topological integrity and no performance degradation.
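One plausible reading of dynamically manipulating sparsity at the data level is a prune-and-regrow step over the graph's edges; the sketch below illustrates that generic idea and is not GST's actual procedure:

```python
import torch

def prune_and_regrow(edge_scores, mask, frac=0.1):
    """One dynamic sparse-training step on a graph's edge set (illustrative).

    edge_scores: (E,) importance score per edge; mask: (E,) bool of kept edges.
    Drops the lowest-scoring fraction of kept edges, then regrows the same
    number of currently dropped edges with the highest scores."""
    kept = mask.nonzero(as_tuple=True)[0]
    dropped = (~mask).nonzero(as_tuple=True)[0]
    k = max(1, int(frac * kept.numel()))
    # prune: lowest-scoring kept edges
    prune_idx = kept[edge_scores[kept].topk(k, largest=False).indices]
    mask[prune_idx] = False
    # regrow: highest-scoring dropped edges
    if dropped.numel() > 0:
        grow_idx = dropped[edge_scores[dropped].topk(min(k, dropped.numel())).indices]
        mask[grow_idx] = True
    return mask
```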
arXiv Detail & Related papers (2024-02-02T09:10:35Z)
- Stochastic Subgraph Neighborhood Pooling for Subgraph Classification [2.1270496914042996]
Subgraph Neighborhood Pooling (SSNP) jointly aggregates the subgraph and its neighborhood information without any computationally expensive operations such as labeling tricks.
Our experiments demonstrate that our models outperform current state-of-the-art methods (with a margin of up to 2%) while being up to 3X faster in training.
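A minimal sketch of jointly pooling a subgraph with its neighborhood without labeling tricks, as this summary suggests; the concatenated mean-pooling is an assumption, not SSNP's exact operator:

```python
import torch

def pool_subgraph_with_neighborhood(node_emb, sub_nodes, neigh_nodes):
    """Pool a subgraph jointly with its neighborhood (illustrative sketch).

    node_emb: (N, D) precomputed node embeddings; sub_nodes / neigh_nodes:
    index lists for the subgraph and its surrounding neighborhood."""
    sub = node_emb[sub_nodes].mean(dim=0)
    neigh = (node_emb[neigh_nodes].mean(dim=0)
             if neigh_nodes else torch.zeros_like(sub))
    return torch.cat([sub, neigh])             # subgraph representation, (2D,)
```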
arXiv Detail & Related papers (2023-04-17T18:49:18Z)
- Mind the Label Shift of Augmentation-based Graph OOD Generalization [88.32356432272356]
LiSA exploits Label-invariant Subgraphs of the training graphs to construct Augmented environments.
LiSA generates diverse augmented environments with a consistent predictive relationship.
Experiments on node-level and graph-level OOD benchmarks show that LiSA achieves impressive generalization performance with different GNN backbones.
arXiv Detail & Related papers (2023-03-27T00:08:45Z)
- Finding Diverse and Predictable Subgraphs for Graph Domain Generalization [88.32356432272356]
This paper focuses on out-of-distribution generalization on graphs where performance drops due to the unseen distribution shift.
We propose a new graph domain generalization framework, dubbed as DPS, by constructing multiple populations from the source domains.
Experiments on both node-level and graph-level benchmarks show that the proposed DPS achieves impressive performance for various graph domain generalization tasks.
arXiv Detail & Related papers (2022-06-19T07:57:56Z)
- Omni-Granular Ego-Semantic Propagation for Self-Supervised Graph Representation Learning [6.128446481571702]
Unsupervised/self-supervised graph representation learning is critical for downstream node- and graph-level classification tasks.
We introduce instance-adaptive global-aware ego-semantic descriptors.
The descriptors can be explicitly integrated into local graph convolution as new neighbor nodes.
arXiv Detail & Related papers (2022-05-31T12:31:33Z)
- LAGr: Labeling Aligned Graphs for Improving Systematic Generalization in Semantic Parsing [6.846638912020957]
We show that better systematic generalization can be achieved by producing the meaning representation directly as a graph and not as a sequence.
We propose LAGr, the Labeling Aligned Graphs algorithm that produces semantic parses by predicting node and edge labels for a complete multi-layer input-aligned graph.
arXiv Detail & Related papers (2021-10-14T17:37:04Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
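A minimal sketch of one online EM step for prototype learning, assumed from this summary; GraphLoG's hierarchical prototypes are more elaborate than this flat version:

```python
import torch

def online_em_step(prototypes, batch_emb, lr=0.1):
    """One online EM step over semantic prototypes (illustrative sketch).

    prototypes: (K, D) current prototype vectors; batch_emb: (B, D) graph embeddings.
    E-step: assign each embedding to its nearest prototype.
    M-step: move each assigned prototype toward the mean of its members."""
    dists = torch.cdist(batch_emb, prototypes)      # (B, K) pairwise distances
    assign = dists.argmin(dim=1)                    # E-step: hard assignments
    for k in assign.unique():
        members = batch_emb[assign == k]
        prototypes[k] += lr * (members.mean(dim=0) - prototypes[k])  # M-step
    return prototypes, assign
```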
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.