A Unifying Theory of Transition-based and Sequence Labeling Parsing
- URL: http://arxiv.org/abs/2011.00584v1
- Date: Sun, 1 Nov 2020 18:25:15 GMT
- Title: A Unifying Theory of Transition-based and Sequence Labeling Parsing
- Authors: Carlos Gómez-Rodríguez, Michalina Strzyz, David Vilares
- Abstract summary: We map transition-based parsing algorithms that read sentences from left to right to sequence labeling encodings of syntactic trees.
This establishes a theoretical relation between transition-based parsing and sequence-labeling parsing.
We implement sequence labeling versions of four algorithms, showing that they are learnable and obtain comparable performance to existing encodings.
- Score: 14.653008985229617
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We define a mapping from transition-based parsing algorithms that read
sentences from left to right to sequence labeling encodings of syntactic trees.
This not only establishes a theoretical relation between transition-based
parsing and sequence-labeling parsing, but also provides a method to obtain new
encodings for fast and simple sequence labeling parsing from the many existing
transition-based parsers for different formalisms. Applying it to dependency
parsing, we implement sequence labeling versions of four algorithms, showing
that they are learnable and obtain comparable performance to existing
encodings.
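The mapping described above can be sketched concretely: a left-to-right transition sequence is split into one label per word, so that each label groups the transitions performed around reading that word. This is a minimal illustrative sketch, not the paper's exact encoding; the function name, the `"SH"`/`"LA"`/`"RA"`/`"RE"` action names, and the toy transition sequence are all hypothetical.

```python
def transitions_to_labels(transitions, read_action="SH"):
    """Cut a transition sequence into per-word labels: each label
    contains exactly one 'read' action (e.g. SHIFT) plus the arc and
    reduce actions that follow it before the next read."""
    labels, current = [], []
    for t in transitions:
        if t == read_action and current:
            labels.append("+".join(current))
            current = []
        current.append(t)
    if current:
        labels.append("+".join(current))
    return labels

# Toy arc-eager-style run over a 3-word sentence (illustrative only):
seq = ["SH", "LA", "SH", "RA", "SH", "RE"]
print(transitions_to_labels(seq))  # ['SH+LA', 'SH+RA', 'SH+RE']
```

Because every transition lands in exactly one label, the original transition sequence (and hence the tree) can be recovered by concatenating the predicted labels, which is what makes the encoding usable for sequence labeling.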
Related papers
- Tokenization as Finite-State Transduction [24.19959327497118]
We introduce a finite-state framework which can efficiently encode all possible tokenizations of a regular language.
We show that Byte-Pair Encoding (BPE) and MaxMatch (WordPiece) fit within this framework.
An application of this is to guided generation, where the outputs of a language model are constrained to match some pattern.
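The set that such a finite-state framework encodes compactly can be unfolded explicitly: every way of segmenting a string into vocabulary pieces corresponds to a path through the automaton. A minimal sketch under that reading (function name and toy vocabulary are hypothetical):

```python
def all_tokenizations(s, vocab):
    """Enumerate every segmentation of s into vocabulary pieces.
    A finite-state encoding represents this same set compactly as
    paths through an automaton; here we unfold it by recursion."""
    if not s:
        return [[]]
    results = []
    for i in range(1, len(s) + 1):
        piece = s[:i]
        if piece in vocab:
            for rest in all_tokenizations(s[i:], vocab):
                results.append([piece] + rest)
    return results

print(all_tokenizations("aab", {"a", "aa", "b", "ab"}))
# [['a', 'a', 'b'], ['a', 'ab'], ['aa', 'b']]
```

The explicit enumeration is exponential in the worst case, which is precisely why a finite-state representation, sharing common prefixes and suffixes across tokenizations, matters for constrained generation.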
arXiv Detail & Related papers (2024-10-21T07:10:07Z) - On the Challenges of Fully Incremental Neural Dependency Parsing [7.466159270333272]
Since the popularization of BiLSTMs and Transformer-based bidirectional encoders, state-of-the-art syntactic parsers have lacked incrementality.
This paper explores whether fully incremental dependency parsing with modern architectures can be competitive.
We build parsers combining strictly left-to-right neural encoders with fully incremental sequence-labeling and transition-based decoders.
arXiv Detail & Related papers (2023-09-28T08:44:08Z) - Linear-Time Modeling of Linguistic Structure: An Order-Theoretic
Perspective [97.57162770792182]
Tasks that model the relation between pairs of tokens in a string are a vital part of understanding natural language.
We show that these exhaustive comparisons can be avoided, and, moreover, the complexity can be reduced to linear by casting the relation between tokens as a partial order over the string.
Our method predicts real numbers for each token in a string in parallel and sorts the tokens accordingly, resulting in total orders of the tokens in the string.
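The sorting step described above can be sketched in a few lines: one real number is predicted per token in parallel, and sorting by those numbers yields a total order that replaces the O(n^2) pairwise comparisons with an O(n log n) sort. The function name and scores below are illustrative, not the paper's model.

```python
def order_tokens(tokens, scores):
    """Sort tokens by a per-token real number predicted in parallel.
    The induced total order encodes every pairwise relation at once:
    token i precedes token j exactly when scores[i] < scores[j]."""
    return [tok for _, tok in sorted(zip(scores, tokens))]

toks = ["a", "b", "c", "d"]
print(order_tokens(toks, [0.7, 0.1, 0.9, 0.3]))  # ['b', 'd', 'a', 'c']
```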
arXiv Detail & Related papers (2023-05-24T11:47:35Z) - On Parsing as Tagging [66.31276017088477]
We show how to reduce tetratagging, a state-of-the-art constituency tagger, to shift-reduce parsing.
We empirically evaluate our taxonomy of tagging pipelines with different choices of linearizers, learners, and decoders.
arXiv Detail & Related papers (2022-11-14T13:37:07Z) - Graph-Based Decoding for Task Oriented Semantic Parsing [16.054030490095464]
We formulate semantic parsing as a dependency parsing task, applying graph-based decoding techniques developed for syntactic parsing.
We find that our graph-based approach is competitive with sequence decoders on the standard setting, and offers significant improvements in data efficiency and settings where partially-annotated data is available.
arXiv Detail & Related papers (2021-09-09T23:22:09Z) - Dependency Parsing with Bottom-up Hierarchical Pointer Networks [0.7412445894287709]
Left-to-right and top-down transition-based algorithms are among the most accurate approaches for performing dependency parsing.
We propose two novel transition-based alternatives: an approach that parses a sentence in right-to-left order and a variant that does it from the outside in.
We empirically test the proposed neural architecture with the different algorithms on a wide variety of languages, outperforming the original approach in practically all of them.
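One plausible reading of the "outside in" order mentioned above is a traversal that alternates between the leftmost and rightmost unvisited positions. This sketch only illustrates that visiting order; the paper's actual transition system and pointer-network architecture are more involved, and the function name is hypothetical.

```python
def outside_in_order(n):
    """Visit positions 1..n from the outside in: leftmost, rightmost,
    second-leftmost, second-rightmost, and so on toward the middle."""
    left, right, order = 1, n, []
    while left <= right:
        order.append(left)
        if left != right:
            order.append(right)
        left, right = left + 1, right - 1
    return order

print(outside_in_order(5))  # [1, 5, 2, 4, 3]
```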
arXiv Detail & Related papers (2021-05-20T09:10:42Z) - Syntactic representation learning for neural network based TTS with
syntactic parse tree traversal [49.05471750563229]
We propose a syntactic representation learning method based on syntactic parse tree to automatically utilize the syntactic structure information.
Experimental results demonstrate the effectiveness of our proposed approach.
For sentences with multiple syntactic parse trees, prosodic differences can be clearly perceived in the synthesized speech.
arXiv Detail & Related papers (2020-12-13T05:52:07Z) - Fast semantic parsing with well-typedness guarantees [78.76675218975768]
AM dependency parsing is a principled method for neural semantic parsing with high accuracy across multiple graphbanks.
We describe an A* parser and a transition-based parser for AM dependency parsing, both of which guarantee well-typedness and improve parsing speed by up to 3 orders of magnitude.
arXiv Detail & Related papers (2020-09-15T21:54:01Z) - 2kenize: Tying Subword Sequences for Chinese Script Conversion [54.33749520569979]
We propose a model that can disambiguate between mappings and convert between the two scripts.
Our proposed method outperforms previous Chinese character conversion approaches by 6 points in accuracy.
arXiv Detail & Related papers (2020-05-07T10:53:05Z) - Multi-level Head-wise Match and Aggregation in Transformer for Textual
Sequence Matching [87.97265483696613]
We propose a new approach to sequence pair matching with Transformer, by learning head-wise matching representations on multiple levels.
Experiments show that our proposed approach can achieve new state-of-the-art performance on multiple tasks.
arXiv Detail & Related papers (2020-01-20T20:02:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.