Transition-Based Dependency Parsing using Perceptron Learner
- URL: http://arxiv.org/abs/2001.08279v2
- Date: Tue, 28 Jan 2020 22:09:19 GMT
- Title: Transition-Based Dependency Parsing using Perceptron Learner
- Authors: Rahul Radhakrishnan Iyer, Miguel Ballesteros, Chris Dyer, Robert
Frederking
- Abstract summary: We tackle transition-based dependency parsing using a Perceptron Learner.
Our proposed model, which adds more relevant features to the Perceptron Learner, outperforms a baseline arc-standard parser.
- Score: 34.59241394911966
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Syntactic parsing using dependency structures has become a standard technique
in natural language processing with many different parsing models, in
particular data-driven models that can be trained on syntactically annotated
corpora. In this paper, we tackle transition-based dependency parsing using a
Perceptron Learner. Our proposed model, which adds more relevant features to
the Perceptron Learner, outperforms a baseline arc-standard parser. We beat the
UAS of the MALT and LSTM parsers. We also give possible ways to address parsing
of non-projective trees.
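A minimal sketch of the approach described above, assuming a greedy arc-standard system scored by a perceptron-style linear model; the feature templates (s1, s2, b1 conjoined with the candidate action) are illustrative placeholders, not the paper's richer feature set:

```python
# Minimal sketch (not the authors' code): greedy arc-standard parsing
# scored by a perceptron-style linear model over hand-crafted features.

SHIFT, LEFT_ARC, RIGHT_ARC = "SH", "LA", "RA"

def features(stack, buffer, words, action):
    # Conjoin a few configuration atoms with the candidate action.
    # These templates are illustrative; the paper adds a richer set.
    s1 = words[stack[-1]] if stack else "<none>"
    s2 = words[stack[-2]] if len(stack) > 1 else "<none>"
    b1 = words[buffer[0]] if buffer else "<none>"
    return [f"s1={s1}&a={action}", f"s2={s2}&a={action}",
            f"b1={b1}&a={action}", f"s1|s2={s1}|{s2}&a={action}"]

def valid_actions(stack, buffer):
    acts = [SHIFT] if buffer else []
    if len(stack) > 1:                 # ROOT handling elided for brevity
        acts += [LEFT_ARC, RIGHT_ARC]
    return acts

def parse(words, weights):
    """Return heads[i] = index of token i's head (None for the root)."""
    stack, buffer = [], list(range(len(words)))
    heads = [None] * len(words)
    while buffer or len(stack) > 1:
        # Perceptron score: sum of the weights of the fired features.
        def score(a):
            return sum(weights.get(f, 0.0)
                       for f in features(stack, buffer, words, a))
        act = max(valid_actions(stack, buffer), key=score)
        if act == SHIFT:
            stack.append(buffer.pop(0))
        elif act == LEFT_ARC:          # top of stack heads the second item
            heads[stack[-2]] = stack[-1]
            del stack[-2]
        else:                          # RIGHT_ARC: second item heads the top
            heads[stack[-1]] = stack[-2]
            stack.pop()
    return heads

print(parse(["She", "ate", "pizza"], weights={}))  # untrained: arbitrary tree
```

Training (not shown) would follow the standard perceptron recipe: run an oracle over gold trees and, on each mispredicted transition, raise the weights of the features fired by the gold transition and lower those of the predicted one. This is the usual setup for such parsers, not necessarily the paper's exact configuration.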
Related papers
- ChatGPT is a Potential Zero-Shot Dependency Parser [5.726114645714751]
It remains an understudied question whether pre-trained language models can spontaneously exhibit the ability to perform dependency parsing, without introducing additional structure, in the zero-shot scenario.
In this paper, we explore the dependency parsing ability of large language models such as ChatGPT and conduct a linguistic analysis.
arXiv Detail & Related papers (2023-10-25T14:08:39Z)
- Hexatagging: Projective Dependency Parsing as Tagging [63.5392760743851]
We introduce a novel dependency parser, the hexatagger, which constructs dependency trees by tagging the words in a sentence with elements from a finite set of possible tags.
Our approach is fully parallelizable at training time, i.e., the structure-building actions needed to build a dependency parse can be predicted in parallel to each other.
We achieve state-of-the-art performance of 96.4 LAS and 97.4 UAS on the Penn Treebank test set.
arXiv Detail & Related papers (2023-06-08T18:02:07Z)
- Syntactic Substitutability as Unsupervised Dependency Syntax [31.488677474152794]
We model a more general property implicit in the definition of dependency relations, syntactic substitutability.
This property captures the fact that words at either end of a dependency can be substituted with words from the same category.
We show that increasing the number of substitutions used improves parsing accuracy on natural data.
arXiv Detail & Related papers (2022-11-29T09:01:37Z)
- On The Ingredients of an Effective Zero-shot Semantic Parser [95.01623036661468]
We analyze zero-shot learning by paraphrasing training examples of canonical utterances and programs generated from a grammar, and identify gaps between such canonical examples and real user queries.
We propose bridging these gaps using improved grammars, stronger paraphrasers, and efficient learning methods.
Our model achieves strong performance on two semantic parsing benchmarks (Scholar, Geo) with zero labeled data.
arXiv Detail & Related papers (2021-10-15T21:41:16Z)
- Learning compositional structures for semantic graph parsing [81.41592892863979]
We show how AM dependency parsing can be trained directly with a neural latent-variable model.
Our model picks up on several linguistic phenomena on its own and achieves comparable accuracy to supervised training.
arXiv Detail & Related papers (2021-06-08T14:20:07Z)
- Learning to Synthesize Data for Semantic Parsing [57.190817162674875]
We propose a generative model which models the composition of programs and maps each program to an utterance.
Owing to the simplicity of the PCFG and the use of pre-trained BART, our generative model can be learned efficiently from the data at hand.
We evaluate our method in both in-domain and out-of-domain settings of text-to-SQL parsing on the standard GeoQuery and Spider benchmarks.
arXiv Detail & Related papers (2021-04-12T21:24:02Z)
- Coordinate Constructions in English Enhanced Universal Dependencies: Analysis and Computational Modeling [1.9950682531209154]
We address the representation of coordinate constructions in Enhanced Universal Dependencies (UD).
We create a large-scale dataset of manually edited syntax graphs.
We identify several systematic errors in the original data, and propose to also propagate adjuncts.
arXiv Detail & Related papers (2021-03-16T10:24:27Z)
- Infusing Finetuning with Semantic Dependencies [62.37697048781823]
We show that, unlike syntax, semantics is not brought to the surface by today's pretrained models.
We then use convolutional graph encoders to explicitly incorporate semantic parses into task-specific finetuning.
arXiv Detail & Related papers (2020-12-10T01:27:24Z)
- Applying Occam's Razor to Transformer-Based Dependency Parsing: What Works, What Doesn't, and What is Really Necessary [9.347252855045125]
We study the choice of pre-trained embeddings and whether additional LSTM layers are beneficial in graph-based dependency parsers.
We propose a simple but widely applicable architecture and configuration, achieving new state-of-the-art results (in terms of LAS) for 10 out of 12 diverse languages.
arXiv Detail & Related papers (2020-10-23T22:58:26Z)
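Several of the results above are quoted as UAS/LAS attachment scores. As a reading aid, here is a minimal sketch of how these metrics are conventionally computed; encoding each token as a (head, label) pair is an illustrative choice, not tied to any paper above:

```python
# Sketch of the standard attachment scores. UAS counts tokens whose
# predicted head is correct; LAS additionally requires the dependency
# label to match.

def attachment_scores(gold, pred):
    """gold/pred: one (head_index, label) pair per token."""
    assert len(gold) == len(pred)
    n = len(gold)
    uas = sum(g[0] == p[0] for g, p in zip(gold, pred)) / n
    las = sum(g == p for g, p in zip(gold, pred)) / n
    return uas, las

# Example: the third token gets the right head but the wrong label.
gold = [(1, "nsubj"), (-1, "root"), (1, "obj")]
pred = [(1, "nsubj"), (-1, "root"), (1, "nmod")]
print(attachment_scores(gold, pred))  # (1.0, 0.666...)
```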