Latent Tree Decomposition Parsers for AMR-to-Text Generation
- URL: http://arxiv.org/abs/2108.12304v1
- Date: Fri, 27 Aug 2021 14:30:35 GMT
- Title: Latent Tree Decomposition Parsers for AMR-to-Text Generation
- Authors: Lisa Jin, Daniel Gildea
- Abstract summary: By clustering edges into a hierarchy, a tree decomposition summarizes graph structure.
Our model encodes a forest of tree decompositions and extracts an expected tree.
It surpasses a convolutional baseline for molecular property prediction by 1.92% ROC-AUC.
- Score: 12.342043849587613
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph encoders in AMR-to-text generation models often rely on neighborhood
convolutions or global vertex attention. While these approaches apply to
general graphs, AMRs may be amenable to encoders that target their tree-like
structure. By clustering edges into a hierarchy, a tree decomposition
summarizes graph structure. Our model encodes a derivation forest of tree
decompositions and extracts an expected tree. From tree node embeddings, it
builds graph edge features used in vertex attention of the graph encoder.
Encoding TD forests instead of shortest pairwise paths in a self-attentive
baseline raises BLEU by 0.7 and chrF++ by 0.3. The forest encoder also
surpasses a convolutional baseline for molecular property prediction by 1.92%
ROC-AUC.
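The pipeline in the abstract (tree decomposition → tree node embeddings → graph edge features → vertex attention) can be made concrete with a small sketch. The code below assumes networkx and numpy, and uses a single heuristic tree decomposition rather than the paper's derivation forest and expected tree; the `bag_distance` helper and the additive attention-bias form are illustrative assumptions, not the authors' exact formulation.

```python
# A minimal sketch (not the paper's model): compute one heuristic tree
# decomposition with networkx and turn bag-to-bag tree distances into
# pairwise features that could bias vertex self-attention in a graph encoder.
import networkx as nx
import numpy as np
from networkx.algorithms import approximation as approx

# Toy AMR-like graph over 5 concept vertices.
G = nx.Graph([(0, 1), (1, 2), (1, 3), (3, 4), (2, 3)])

width, td = approx.treewidth_min_degree(G)   # td: tree whose nodes are frozenset "bags"
bags = list(td.nodes)

def bag_distance(u, v):
    """Shortest tree distance between any bag containing u and any bag containing v."""
    best = np.inf
    for bu in bags:
        if u not in bu:
            continue
        for bv in bags:
            if v in bv:
                best = min(best, nx.shortest_path_length(td, bu, bv))
    return best

n = G.number_of_nodes()
bias = np.array([[bag_distance(u, v) for v in range(n)] for u in range(n)], dtype=float)

# A self-attention layer could add a learned function of `bias` to its logits,
# e.g. logits = Q @ K.T / sqrt(d) - w * bias (illustrative, not the paper's form).
print("treewidth:", width)
print(bias)
```

In the paper itself, the analogous edge features come from node embeddings of an encoded forest of tree decompositions rather than raw tree distances.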
Related papers
- From GNNs to Trees: Multi-Granular Interpretability for Graph Neural Networks [29.032055397116217]
Interpretable Graph Neural Networks (GNNs) aim to reveal the underlying reasoning behind model predictions.
Existing subgraph-based interpretable methods suffer from an overemphasis on local structure.
We introduce a novel Tree-like Interpretable Framework (TIF) for graph classification.
arXiv Detail & Related papers (2025-05-01T07:22:51Z)
- Heterogeneous Graph Neural Network on Semantic Tree [11.810900066591861]
HetTree is a novel HGNN that models both the graph structure and heterogeneous aspects in a scalable and effective manner.
To effectively encode the semantic tree, HetTree uses a novel subtree attention mechanism to emphasize metapaths that are more helpful in encoding parent-child relationships.
Our evaluation of HetTree on a variety of real-world datasets demonstrates that it outperforms all existing baselines on open benchmarks.
arXiv Detail & Related papers (2024-02-21T03:14:45Z)
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z)
- Generative and Contrastive Paradigms Are Complementary for Graph Self-Supervised Learning [56.45977379288308]
Masked autoencoder (MAE) learns to reconstruct masked graph edges or node features.
Contrastive Learning (CL) maximizes the similarity between augmented views of the same graph.
We propose graph contrastive masked autoencoder (GCMAE) framework to unify MAE and CL.
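As a rough illustration of how the two paradigms above can be unified under one objective (a generic sketch, not the paper's exact GCMAE architecture or losses), the PyTorch code below adds an InfoNCE contrastive loss between two augmented views to an inner-product reconstruction loss on masked edges; all shapes, helper names, and toy tensors are invented for the example.

```python
# Generic sketch: node embeddings trained with a contrastive loss across views
# plus a reconstruction loss on masked edges (hypothetical setup).
import torch
import torch.nn.functional as F

def infonce(z1, z2, tau=0.2):
    """Contrastive loss: matching rows of z1 and z2 are the positive pairs."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau
    return F.cross_entropy(logits, torch.arange(z1.size(0)))

def edge_recon_loss(z, masked_edges, neg_edges):
    """Score masked (positive) edges and sampled non-edges by inner product."""
    def score(pairs):
        return (z[pairs[:, 0]] * z[pairs[:, 1]]).sum(-1)
    pos = F.binary_cross_entropy_with_logits(score(masked_edges), torch.ones(len(masked_edges)))
    neg = F.binary_cross_entropy_with_logits(score(neg_edges), torch.zeros(len(neg_edges)))
    return pos + neg

# Toy usage: embeddings of 6 nodes from two augmented views of the same graph.
z_view1 = torch.randn(6, 8, requires_grad=True)
z_view2 = z_view1 + 0.1 * torch.randn(6, 8)     # stand-in for a second encoded view
masked  = torch.tensor([[0, 1], [2, 3]])        # edges hidden from the encoder
negs    = torch.tensor([[0, 4], [1, 5]])        # sampled non-edges
loss = infonce(z_view1, z_view2) + edge_recon_loss(z_view1, masked, negs)
loss.backward()
```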
arXiv Detail & Related papers (2023-10-24T05:06:06Z)
- TreeFormer: a Semi-Supervised Transformer-based Framework for Tree Counting from a Single High Resolution Image [6.789370732159176]
Tree density estimation and counting using single aerial and satellite images is a challenging task in photogrammetry and remote sensing.
We propose the first semi-supervised transformer-based framework for tree counting, which reduces the need for expensive tree annotations for remote sensing images.
Our model was evaluated on two benchmark tree counting datasets, Jiangsu and Yosemite, as well as a new dataset, KCL-London, that we created.
arXiv Detail & Related papers (2023-07-12T12:19:36Z)
- Graph Generation with $K^2$-trees [13.281380233427287]
We introduce a novel graph generation method leveraging $K^2$-tree representation.
We also present a sequential $K^2$-tree representation that incorporates pruning, flattening, and tokenization processes.
We extensively evaluate our algorithm on four general and two molecular graph datasets to confirm its superiority for graph generation.
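For readers unfamiliar with the underlying data structure, the sketch below builds a plain $K^2$-tree bit representation (k = 2) of a toy adjacency matrix: each level stores one bit per quadrant and all-zero quadrants are pruned. It is a generic illustration of the representation only, not the paper's pruning/flattening/tokenization pipeline.

```python
# Minimal K^2-tree sketch (k = 2): quadrant-split the adjacency matrix,
# emit one bit per quadrant per level, and recurse only into non-empty quadrants.
import numpy as np

def k2_bits(adj, k=2):
    n = adj.shape[0]
    assert n == 1 or n % k == 0, "pad the matrix to a power of k in practice"
    levels, frontier = [], [adj]
    while frontier and frontier[0].shape[0] > 1:
        bits, children = [], []
        for block in frontier:
            m = block.shape[0] // k
            for i in range(k):
                for j in range(k):
                    sub = block[i * m:(i + 1) * m, j * m:(j + 1) * m]
                    nonzero = bool(sub.any())
                    bits.append(int(nonzero))
                    if nonzero and m > 1:
                        children.append(sub)   # prune all-zero quadrants
        levels.append(bits)
        frontier = children
    return levels

# Undirected path 0-1-2-3 as a 4x4 adjacency matrix.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(k2_bits(A))   # [[1, 1, 1, 1], [0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0]]
```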
arXiv Detail & Related papers (2023-05-30T15:36:37Z)
- Hierarchical clustering with dot products recovers hidden tree structure [53.68551192799585]
In this paper we offer a new perspective on the well established agglomerative clustering algorithm, focusing on recovery of hierarchical structure.
We recommend a simple variant of the standard algorithm, in which clusters are merged by maximum average dot product and not, for example, by minimum distance or within-cluster variance.
We demonstrate that the tree output by this algorithm provides a bona fide estimate of generative hierarchical structure in data, under a generic probabilistic graphical model.
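The recommended variant is simple to write down. The sketch below is a naive O(n^3) illustration (not the authors' implementation): at every step it merges the pair of clusters with the largest average pairwise dot product between their members, instead of using minimum distance or within-cluster variance.

```python
# Agglomerative clustering with maximum-average-dot-product linkage (naive sketch).
import numpy as np

def dot_product_agglomeration(X):
    """Return the merge history; each entry is (cluster_a, cluster_b, avg_dot)."""
    S = X @ X.T                                    # all pairwise dot products
    clusters = [[i] for i in range(len(X))]
    merges = []
    while len(clusters) > 1:
        best, pair = -np.inf, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                link = S[np.ix_(clusters[a], clusters[b])].mean()
                if link > best:
                    best, pair = link, (a, b)
        a, b = pair
        merges.append((clusters[a], clusters[b], best))
        clusters[a] = clusters[a] + clusters[b]    # merge b into a
        del clusters[b]
    return merges

# Toy data: two tight groups; the earliest merges stay within a group.
X = np.array([[1.0, 0.1], [0.9, 0.0], [0.0, 1.0], [0.1, 1.1]])
for ca, cb, s in dot_product_agglomeration(X):
    print(ca, "+", cb, f"(avg dot = {s:.2f})")
```

The sequence of merges defines the estimated tree (dendrogram) over the data points.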
arXiv Detail & Related papers (2023-05-24T11:05:12Z)
- MGAE: Masked Autoencoders for Self-Supervised Learning on Graphs [55.66953093401889]
We propose a masked graph autoencoder (MGAE) framework to perform effective learning on graph-structured data.
Taking insights from self-supervised learning, we randomly mask a large proportion of edges and try to reconstruct these missing edges during training.
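A minimal sketch of that masking step (with an invented `mask_ratio` and toy edge list, not the exact MGAE recipe): a large fraction of edges is hidden from the encoder and kept as reconstruction targets; the decoder and negative sampling are only indicated in comments.

```python
# Randomly split edges into a visible set (encoder input) and a masked set
# (reconstruction targets), as in masked graph autoencoding.
import numpy as np

rng = np.random.default_rng(0)

def mask_edges(edges, mask_ratio=0.7):
    """Split an edge list into (visible, masked) subsets."""
    edges = np.asarray(edges)
    perm = rng.permutation(len(edges))
    n_mask = int(mask_ratio * len(edges))
    return edges[perm[n_mask:]], edges[perm[:n_mask]]

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]
visible, masked = mask_edges(edges)
# The encoder sees only `visible`; training asks it to score `masked` pairs
# higher than sampled non-edges (e.g. with an inner-product decoder).
print("visible:", visible.tolist())
print("masked :", masked.tolist())
```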
arXiv Detail & Related papers (2022-01-07T16:48:07Z)
- Tree Decomposition Attention for AMR-to-Text Generation [12.342043849587613]
We use a graph's tree decomposition to constrain self-attention over the graph.
We apply dynamic programming to derive a forest of tree decompositions, choosing the most structurally similar tree to the AMR.
Our system outperforms a self-attentive baseline by 1.6 BLEU and 1.8 chrF++.
arXiv Detail & Related papers (2021-08-27T14:24:25Z)
- TD-GEN: Graph Generation With Tree Decomposition [31.751200416677225]
TD-GEN is a graph generation framework based on tree decomposition.
Tree nodes are supernodes, each representing a cluster of nodes in the graph.
arXiv Detail & Related papers (2021-06-20T08:57:43Z)
- Neural Trees for Learning on Graphs [19.05038106825347]
Graph Neural Networks (GNNs) have emerged as a flexible and powerful approach for learning over graphs.
We propose a new GNN architecture -- the Neural Tree.
We show that the neural tree architecture can approximate any smooth probability distribution function over an undirected graph.
arXiv Detail & Related papers (2021-05-15T17:08:20Z)
- Visualizing hierarchies in scRNA-seq data using a density tree-biased autoencoder [50.591267188664666]
We propose an approach for identifying a meaningful tree structure from high-dimensional scRNA-seq data.
We then introduce DTAE, a tree-biased autoencoder that emphasizes the tree structure of the data in low dimensional space.
arXiv Detail & Related papers (2021-02-11T08:48:48Z)
- SGA: A Robust Algorithm for Partial Recovery of Tree-Structured Graphical Models with Noisy Samples [75.32013242448151]
We consider learning Ising tree models when the observations from the nodes are corrupted by independent but non-identically distributed noise.
Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a partial tree structure.
We propose Symmetrized Geometric Averaging (SGA), a more statistically robust algorithm for partial tree recovery.
arXiv Detail & Related papers (2021-01-22T01:57:35Z)
- Uncovering the Folding Landscape of RNA Secondary Structure with Deep Graph Embeddings [71.20283285671461]
We propose a geometric scattering autoencoder (GSAE) network for learning such graph embeddings.
Our embedding network first extracts rich graph features using the recently proposed geometric scattering transform.
We show that GSAE organizes RNA graphs both by structure and energy, accurately reflecting bistable RNA structures.
arXiv Detail & Related papers (2020-06-12T00:17:59Z)