A Survey of Unsupervised Dependency Parsing
- URL: http://arxiv.org/abs/2010.01535v1
- Date: Sun, 4 Oct 2020 10:51:22 GMT
- Title: A Survey of Unsupervised Dependency Parsing
- Authors: Wenjuan Han, Yong Jiang, Hwee Tou Ng, Kewei Tu
- Abstract summary: Unsupervised dependency parsing aims to learn a dependency parser from sentences that have no annotation of their correct parse trees.
Despite its difficulty, unsupervised parsing is an interesting research direction because of its capability of utilizing almost unlimited unannotated text data.
- Score: 62.16714720135358
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Syntactic dependency parsing is an important task in natural language
processing. Unsupervised dependency parsing aims to learn a dependency parser
from sentences that have no annotation of their correct parse trees. Despite
its difficulty, unsupervised parsing is an interesting research direction
because of its capability of utilizing almost unlimited unannotated text data.
It also serves as the basis for other research in low-resource parsing. In this
paper, we survey existing approaches to unsupervised dependency parsing,
identify two major classes of approaches, and discuss recent trends. We hope
that our survey can provide insights for researchers and facilitate future
research on this topic.
Related papers
- What's Hard in English RST Parsing? Predictive Models for Error Analysis [16.927386793787463]
In this paper, we examine and model some of the factors associated with parsing difficulties in Rhetorical Structure Theory.
Our results show that as in shallow discourse parsing, the explicit/implicit distinction plays a role, but that long-distance dependencies are the main challenge.
Our final model is able to predict where errors will occur with an accuracy of 76.3% for the bottom-up parser and 76.6% for the top-down parser.
arXiv Detail & Related papers (2023-09-10T06:10:03Z) - Cascading and Direct Approaches to Unsupervised Constituency Parsing on
Spoken Sentences [67.37544997614646]
We present the first study on unsupervised spoken constituency parsing.
The goal is to determine the spoken sentences' hierarchical syntactic structure in the form of constituency parse trees.
We show that accurate segmentation alone may be sufficient for parsing spoken sentences accurately.
arXiv Detail & Related papers (2023-03-15T17:57:22Z) - Discourse Analysis via Questions and Answers: Parsing Dependency
Structures of Questions Under Discussion [57.43781399856913]
This work adopts the linguistic framework of Questions Under Discussion (QUD) for discourse analysis.
We characterize relationships between sentences as free-form questions, in contrast to exhaustive fine-grained questions.
We develop the first-of-its-kind QUD parser that derives a dependency structure of questions over full documents.
arXiv Detail & Related papers (2022-10-12T03:53:12Z) - Linguistic dependencies and statistical dependence [76.89273585568084]
We use pretrained language models to estimate probabilities of words in context.
We find that maximum-CPMI trees correspond to linguistic dependencies more often than trees extracted from non-contextual PMI estimates.
arXiv Detail & Related papers (2021-04-18T02:43:37Z) - Multilingual Neural RST Discourse Parsing [24.986030179701405]
We investigate two approaches to establishing a neural, cross-lingual discourse parser via multilingual vector representations and segment-level translation.
Experiment results show that both methods are effective even with limited training data, and achieve state-of-the-art performance on cross-lingual, document-level discourse parsing.
arXiv Detail & Related papers (2020-12-03T05:03:38Z) - Context Dependent Semantic Parsing: A Survey [56.69006903481575]
Semantic parsing is the task of translating natural language utterances into machine-readable meaning representations.
Currently, most semantic parsing methods are not able to utilize contextual information.
To address this issue, context-dependent semantic parsing has recently attracted considerable attention.
arXiv Detail & Related papers (2020-11-02T07:51:05Z) - Pareto Probing: Trading Off Accuracy for Complexity [87.09294772742737]
We argue for a probe metric that reflects the fundamental trade-off between probe complexity and performance.
Our experiments with dependency parsing reveal a wide gap in syntactic knowledge between contextual and non-contextual representations.
arXiv Detail & Related papers (2020-10-05T17:27:31Z) - A Survey of Syntactic-Semantic Parsing Based on Constituent and
Dependency Structures [14.714725860010724]
We focus on two of the most popular formalizations of parsing: constituent parsing and dependency parsing.
This article briefly reviews the representative models of constituent parsing and dependency parsing, and also dependency parsing with rich semantics.
arXiv Detail & Related papers (2020-06-19T10:21:17Z) - Is POS Tagging Necessary or Even Helpful for Neural Dependency Parsing? [22.93722845643562]
We show that POS tagging can still significantly improve parsing performance when using the Stack joint framework.
Considering that it is much cheaper to annotate POS tags than parse trees, we also investigate the utilization of large-scale heterogeneous POS tag data.
arXiv Detail & Related papers (2020-03-06T13:47:30Z)
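Several of the papers above extract a parse tree as the maximum spanning tree over pairwise word-association scores (e.g., the CPMI-tree work). The following is a minimal sketch of that extraction step, assuming a symmetric score matrix with made-up values; it is an illustration, not any paper's actual implementation:

```python
# Sketch: extracting an (undirected) dependency tree as the maximum
# spanning tree over pairwise word-association scores, in the spirit
# of the maximum-CPMI-tree approach. All scores below are hypothetical.

def max_spanning_tree(scores):
    """Prim's algorithm over a symmetric score matrix.

    Returns a set of undirected edges (i, j) with i < j that
    together form the maximum spanning tree.
    """
    n = len(scores)
    in_tree = {0}          # start the tree from word 0
    edges = set()
    while len(in_tree) < n:
        best = None
        # Pick the highest-scoring edge crossing the tree boundary.
        for i in in_tree:
            for j in range(n):
                if j not in in_tree:
                    if best is None or scores[i][j] > scores[best[0]][best[1]]:
                        best = (i, j)
        i, j = best
        edges.add((min(i, j), max(i, j)))
        in_tree.add(j)
    return edges

words = ["the", "dog", "barked"]
# Hypothetical pairwise association scores (higher = stronger link).
scores = [
    [0.0, 2.0, 0.5],
    [2.0, 0.0, 1.5],
    [0.5, 1.5, 0.0],
]
tree = max_spanning_tree(scores)
print(sorted(tree))  # [(0, 1), (1, 2)]
```

In this toy example the tree links "the"–"dog" and "dog"–"barked", matching the intuitive dependency structure; the surveyed methods differ mainly in how the score matrix is obtained (e.g., from a pretrained language model) and in whether directed trees are recovered.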
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.