Headed Span-Based Projective Dependency Parsing
- URL: http://arxiv.org/abs/2108.04750v1
- Date: Tue, 10 Aug 2021 15:27:47 GMT
- Title: Headed Span-Based Projective Dependency Parsing
- Authors: Songlin Yang, Kewei Tu
- Abstract summary: We propose a headed span-based method for projective dependency parsing.
We use neural networks to score headed spans and design a novel $O(n^3)$ dynamic programming algorithm to enable global training and exact inference.
- Score: 24.337440797369702
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a headed span-based method for projective dependency parsing. In a
projective tree, the subtree rooted at each word occurs in a contiguous
sequence (i.e., span) in the surface order; we call this span-headword pair a
\textit{headed span}. In this view, a projective tree can be regarded as a
collection of headed spans. This is similar to constituency parsing, where a
constituency tree can be regarded as a collection of constituent spans.
Span-based methods decompose the score of a constituency tree solely into the
scores of constituent spans and use the CYK algorithm for global training and
exact inference, obtaining state-of-the-art results in constituency parsing.
Inspired by them, we decompose the score of a dependency tree into the scores of
headed spans. We use neural networks to score headed spans and design a novel
$O(n^3)$ dynamic programming algorithm to enable global training and exact
inference. We evaluate our method on PTB, CTB, and UD, achieving
state-of-the-art or comparable results.
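To make the headed-span view concrete, here is a minimal Python sketch of a dynamic program over headed spans with the same $O(n^3)$ flavor. The score tensor `s[i, j, h]` stands in for the neural scorer, and the chart items and recurrences below are our illustration of the idea, not necessarily the paper's exact algorithm; following the span view, the root's headed span is assumed to cover the whole sentence.

```python
import numpy as np

def headed_span_parse(s):
    """Exact inference for a headed-span model via dynamic programming.

    s[i, j, h] is the (given) score of headed span (i, j, h): the subtree
    of headword h covers words i..j, 0-indexed and inclusive, i <= h <= j.

    Chart items (both filled in O(n^3) total):
      C[i][j] = best single headed span covering i..j
      D[i][j] = best tiling of i..j by adjacent headed spans (0 if i > j)
    Recurrences:
      C[i][j] = max_h  s[i, j, h] + D[i][h-1] + D[h+1][j]
      D[i][j] = max(C[i][j], max_k D[i][k] + C[k+1][j])
    """
    n = s.shape[0]
    C = np.full((n, n), -np.inf)
    D = np.full((n, n), -np.inf)

    def d(i, j):  # tiling score; an empty range scores 0
        return 0.0 if i > j else D[i][j]

    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            for h in range(i, j + 1):          # choose the headword
                C[i][j] = max(C[i][j], s[i, j, h] + d(i, h - 1) + d(h + 1, j))
            D[i][j] = C[i][j]                  # tiling with a single span
            for k in range(i, j):              # or split off the last span
                D[i][j] = max(D[i][j], D[i][k] + C[k + 1][j])

    spans = []  # recover the argmax tree by retracing the recurrences

    def rec_span(i, j):
        for h in range(i, j + 1):
            if s[i, j, h] + d(i, h - 1) + d(h + 1, j) == C[i][j]:
                spans.append((i, j, h))
                rec_tiling(i, h - 1)
                rec_tiling(h + 1, j)
                return

    def rec_tiling(i, j):
        if i > j:
            return
        if D[i][j] == C[i][j]:
            rec_span(i, j)
            return
        for k in range(i, j):
            if D[i][k] + C[k + 1][j] == D[i][j]:
                rec_tiling(i, k)
                rec_span(k + 1, j)
                return

    rec_span(0, n - 1)   # the root's span must cover the whole sentence
    return C[0][n - 1], spans


# Toy usage: random headed-span scores for a 5-word sentence.
rng = np.random.default_rng(0)
n = 5
score, spans = headed_span_parse(rng.normal(size=(n, n, n)))
print(score, spans)  # best tree score and its (i, j, h) headed spans
```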
Related papers
- Order-sensitive Neural Constituency Parsing [9.858565876426411]
We propose a novel algorithm that improves on the previous neural span-based CKY decoder for constituency parsing.
In contrast to the traditional span-based decoding, we introduce an order-sensitive strategy, where the span combination scores are more carefully derived from an order-sensitive basis.
Our decoder can be regarded as a generalization of existing span-based decoders, determining a finer-grained scoring scheme for the combination of lower-level spans into higher-level spans.
arXiv Detail & Related papers (2022-11-01T12:31:30Z)
- Biaffine Discourse Dependency Parsing [0.0]
We use the biaffine model for neural discourse dependency parsing and achieve a significant performance improvement over the baselines.
We compare the Eisner algorithm and the Chu-Liu-Edmonds algorithm on this task and find that the latter generates deeper trees.
arXiv Detail & Related papers (2022-01-12T12:56:13Z)
- Auto-Parsing Network for Image Captioning and Visual Question Answering [101.77688388554097]
We propose an Auto-Parsing Network (APN) to discover and exploit the input data's hidden tree structures.
Specifically, we impose a Probabilistic Graphical Model (PGM), parameterized by the attention operations on each self-attention layer, to incorporate a sparsity assumption.
arXiv Detail & Related papers (2021-08-24T08:14:35Z)
- Combining (second-order) graph-based and headed span-based projective dependency parsing [24.337440797369702]
Yang and Tu (2021) propose a headed span-based method. Both graph-based and headed span-based methods score all possible trees and globally find the highest-scoring tree.
In this paper, we combine these two kinds of methods, designing several dynamic programming algorithms for joint inference.
arXiv Detail & Related papers (2021-08-12T16:42:00Z)
- Dependency Parsing as MRC-based Span-Span Prediction [29.956515394820673]
Higher-order methods for dependency parsing can partially but not fully address the issue that edges in a dependency tree should be constructed at the text span/subtree level rather than at the word level.
We propose a new method for dependency parsing to address this issue.
arXiv Detail & Related papers (2021-05-17T08:03:48Z)
- Text Information Aggregation with Centrality Attention [86.91922440508576]
We propose a new way of obtaining aggregation weights, called eigen-centrality self-attention.
We build a fully-connected graph for all the words in a sentence, then compute the eigen-centrality as the attention score of each word.
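A minimal sketch of this aggregation scheme, under our own assumptions (affinities from exponentiated scaled dot products, the principal eigenvector found by power iteration); it illustrates eigen-centrality attention, not the paper's implementation.

```python
import numpy as np

def eigen_centrality_attention(H, iters=50):
    """Aggregate word vectors H (n x d) into one sentence vector,
    weighting each word by its eigen-centrality in a fully connected
    word graph. The affinity choice (exp of scaled dot products) is
    our assumption; any non-negative similarity would do."""
    n, d = H.shape
    A = np.exp(H @ H.T / np.sqrt(d))   # non-negative word-word affinities
    v = np.full(n, 1.0 / n)            # start from a uniform vector
    for _ in range(iters):             # power iteration converges to the
        v = A @ v                      # principal (Perron) eigenvector,
        v /= np.linalg.norm(v)         # which is entrywise positive
    w = v / v.sum()                    # centralities -> attention weights
    return w @ H                       # attention-weighted sum

# Toy usage with random "word embeddings".
H = np.random.default_rng(1).normal(size=(6, 32))
print(eigen_centrality_attention(H).shape)  # (32,)
```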
arXiv Detail & Related papers (2020-11-16T13:08:48Z)
- Strongly Incremental Constituency Parsing with Graph Neural Networks [70.16880251349093]
Parsing sentences into syntax trees can benefit downstream applications in NLP.
Transition-based parsers build trees by executing actions in a state transition system.
Existing transition-based parsers are predominantly based on the shift-reduce transition system.
arXiv Detail & Related papers (2020-10-27T19:19:38Z)
- Span-based Semantic Parsing for Compositional Generalization [53.24255235340056]
SpanBasedSP predicts a span tree over an input utterance, explicitly encoding how partial programs compose over spans in the input.
On GeoQuery, SCAN and CLOSURE, SpanBasedSP performs similarly to strong seq2seq baselines on random splits, but dramatically improves performance compared to baselines on splits that require compositional generalization.
arXiv Detail & Related papers (2020-09-13T16:42:18Z)
- A Simple Global Neural Discourse Parser [61.728994693410954]
We propose a simple chart-based neural discourse parser that does not require any manually-crafted features and is based on learned span representations only.
We empirically demonstrate that our model achieves the best performance among global parsers, and comparable performance to state-of-the-art greedy parsers.
arXiv Detail & Related papers (2020-09-02T19:28:40Z)
- Linguistically Driven Graph Capsule Network for Visual Question Reasoning [153.76012414126643]
We propose a hierarchical compositional reasoning model called the "Linguistically driven Graph Capsule Network".
The compositional process is guided by the linguistic parse tree. Specifically, we bind each capsule in the lowest layer to bridge the linguistic embedding of a single word in the original question with visual evidence.
Experiments on the CLEVR dataset, CLEVR compositional generation test, and FigureQA dataset demonstrate the effectiveness and composition generalization ability of our end-to-end model.
arXiv Detail & Related papers (2020-03-23T03:34:25Z)