Dynamic Syntax Mapping: A New Approach to Unsupervised Syntax Parsing
- URL: http://arxiv.org/abs/2312.14966v1
- Date: Mon, 18 Dec 2023 10:34:29 GMT
- Title: Dynamic Syntax Mapping: A New Approach to Unsupervised Syntax Parsing
- Authors: Buvarp Gohsh, Woods Ali, Anders Michael
- Abstract summary: This study investigates the premise that language models, specifically their attention distributions, can encapsulate syntactic dependencies.
We introduce Dynamic Syntax Mapping (DSM), an innovative approach for the induction of these structures.
Our findings reveal that increasing the number of substitutions notably improves parsing accuracy on natural language data.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The hierarchical structure of syntax is fundamental to the
intricate and systematic nature of human language. This study investigates the
premise that language models, specifically their attention distributions, can
encapsulate syntactic dependencies. We introduce Dynamic Syntax Mapping (DSM),
an innovative approach for the annotation-agnostic induction of these structures. Our
method diverges from traditional syntax models which rely on predefined
annotation schemata. Instead, we focus on a core characteristic inherent in
dependency relations: syntactic substitutability. This concept refers to the
interchangeability of words within the same syntactic category at either end of
a dependency. By leveraging this property, we generate a collection of
syntactically invariant sentences, which serve as the foundation for our
parsing framework. Our findings reveal that increasing the number of
substitutions notably improves parsing accuracy on natural language data.
Specifically, on long-distance subject-verb agreement, DSM substantially
outperforms prior methods. Furthermore, DSM's
adaptability is demonstrated through its successful application in varied
parsing scenarios, underscoring its broad applicability.
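To make the pipeline concrete, below is a minimal sketch of a DSM-style parser, assuming a BERT-style masked LM: substitutes for each position are proposed from masked-LM predictions (a rough stand-in for sampling words of the same syntactic category), attention maps are averaged over the resulting syntactically invariant variants, and a maximum spanning tree is read off as the induced parse. The model name, layer index, and helper names (propose_substitutes, attention_matrix, induce_parse) are illustrative assumptions, not the authors' exact algorithm.

```python
# Sketch of DSM-style unsupervised parsing: ensemble masked-LM attention
# over substitution variants, then extract a spanning tree. Model choice,
# layer index, MLM-based substitute proposal, and the undirected MST step
# are assumptions for illustration, not the paper's exact pipeline.
import numpy as np
import torch
from scipy.sparse.csgraph import minimum_spanning_tree
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL = "bert-base-uncased"  # assumption: any masked LM exposing attentions
tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForMaskedLM.from_pretrained(MODEL, output_attentions=True)
model.eval()

def propose_substitutes(words, i, k=3):
    """Propose k substitutes for words[i] from masked-LM predictions,
    a stand-in for drawing words of the same syntactic category."""
    masked = words[:i] + [tok.mask_token] + words[i + 1:]
    enc = tok(" ".join(masked), return_tensors="pt")
    pos = (enc.input_ids[0] == tok.mask_token_id).nonzero()[0, 0]
    with torch.no_grad():
        logits = model(**enc).logits[0, pos]
    return [tok.decode([t]).strip() for t in logits.topk(k).indices.tolist()]

def attention_matrix(words, layer=7):
    """Head-averaged attention at one layer, with [CLS]/[SEP] stripped.
    For simplicity, assumes every word is a single subtoken."""
    enc = tok(" ".join(words), return_tensors="pt")
    with torch.no_grad():
        att = model(**enc).attentions[layer][0].mean(0)  # (seq, seq)
    return att[1:-1, 1:-1].numpy()

def induce_parse(words, k=3):
    """Average attention over the sentence and its substitution variants,
    symmetrize, and read off a maximum spanning tree as the parse."""
    n = len(words)
    assert len(tok.tokenize(" ".join(words))) == n, "one subtoken per word"
    score, count = attention_matrix(words), 1
    for i in range(n):
        for sub in propose_substitutes(words, i, k):
            variant = words[:i] + [sub] + words[i + 1:]
            if len(tok.tokenize(" ".join(variant))) != n:
                continue  # skip multi-subtoken substitutes to keep alignment
            score, count = score + attention_matrix(variant), count + 1
    score = (score + score.T) / (2 * count)  # undirected edge scores
    np.fill_diagonal(score, 0.0)
    mst = minimum_spanning_tree(-score).toarray()  # max tree via negation
    return [(words[i], words[j]) for i in range(n) for j in range(n)
            if mst[i, j] != 0]

# Long-distance subject-verb agreement example from the abstract's setting.
print(induce_parse("the keys to the cabinet are on the table".split()))
```

Averaging symmetrized attention across variants encodes the substitutability intuition directly: a putative dependency scores highly only when attention between the two positions persists no matter which category-preserving substitute fills either endpoint.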
Related papers
- A Hybrid Approach To Aspect Based Sentiment Analysis Using Transfer Learning [3.30307212568497]
We propose a hybrid approach for Aspect Based Sentiment Analysis using transfer learning.
The approach focuses on generating weakly-supervised annotations by exploiting the strengths of both large language models (LLMs) and traditional syntactic dependencies.
arXiv Detail & Related papers (2024-03-25T23:02:33Z) - How Well Do Text Embedding Models Understand Syntax? [50.440590035493074]
The ability of text embedding models to generalize across a wide range of syntactic contexts remains under-explored.
Our findings reveal that existing text embedding models have not sufficiently addressed these syntactic understanding challenges.
We propose strategies to augment the generalization ability of text embedding models in diverse syntactic scenarios.
arXiv Detail & Related papers (2023-11-14T08:51:00Z) - Variational Cross-Graph Reasoning and Adaptive Structured Semantics
Learning for Compositional Temporal Grounding [143.5927158318524]
Temporal grounding is the task of locating a specific segment from an untrimmed video according to a query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We argue that the inherent structured semantics inside the videos and language is the crucial factor to achieve compositional generalization.
arXiv Detail & Related papers (2023-01-22T08:02:23Z) - Semantic Role Labeling Meets Definition Modeling: Using Natural Language
to Describe Predicate-Argument Structures [104.32063681736349]
We present an approach to describe predicate-argument structures using natural language definitions instead of discrete labels.
Our experiments and analyses on PropBank-style and FrameNet-style, dependency-based and span-based SRL also demonstrate that a flexible model with an interpretable output does not necessarily come at the expense of performance.
arXiv Detail & Related papers (2022-12-02T11:19:16Z) - Syntactic Substitutability as Unsupervised Dependency Syntax [31.488677474152794]
We model a more general property implicit in the definition of dependency relations, syntactic substitutability.
This property captures the fact that words at either end of a dependency can be substituted with words from the same category.
We show that increasing the number of substitutions used improves parsing accuracy on natural data.
arXiv Detail & Related papers (2022-11-29T09:01:37Z) - Graph Adaptive Semantic Transfer for Cross-domain Sentiment
Classification [68.06496970320595]
Cross-domain sentiment classification (CDSC) aims to use the transferable semantics learned from the source domain to predict the sentiment of reviews in the unlabeled target domain.
We present the Graph Adaptive Semantic Transfer (GAST) model, an adaptive syntactic graph embedding method that learns domain-invariant semantics from both word sequences and syntactic graphs.
arXiv Detail & Related papers (2022-05-18T07:47:01Z) - Plurality and Quantification in Graph Representation of Meaning [4.82512586077023]
Our graph language covers the essentials of natural language semantics using only monadic second-order variables.
We present a unification-based mechanism for constructing semantic graphs at a simple syntax-semantics interface.
The present graph formalism is applied to linguistic issues in distributive predication, cross-categorial conjunction, and scope permutation of quantificational expressions.
arXiv Detail & Related papers (2021-12-13T07:04:41Z) - Infusing Finetuning with Semantic Dependencies [62.37697048781823]
We show that, unlike syntax, semantics is not brought to the surface by today's pretrained models.
We then use convolutional graph encoders to explicitly incorporate semantic parses into task-specific finetuning.
arXiv Detail & Related papers (2020-12-10T01:27:24Z) - A Dependency Syntactic Knowledge Augmented Interactive Architecture for
End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model fully exploits syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn).
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-04T14:59:32Z)