A Paradigm Change for Formal Syntax: Computational Algorithms in the
Grammar of English
- URL: http://arxiv.org/abs/2205.12825v1
- Date: Tue, 24 May 2022 07:28:47 GMT
- Title: A Paradigm Change for Formal Syntax: Computational Algorithms in the
Grammar of English
- Authors: Anat Ninio
- Abstract summary: We turn to programming languages as models for a process-based syntax of English.
The combination of a functional word and a content word was chosen as the topic of modeling.
The fit of the model was tested by deriving three functional characteristics crucial for the algorithm and checking their presence in English grammar.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Language sciences rely less and less on formal syntax as their base. The
reason is probably its lack of psychological reality, an issue it has knowingly avoided.
Philosophers of science call for a paradigm shift in which explanations are by
mechanisms, as in biology. We turned to programming languages as heuristic
models for a process-based syntax of English. The combination of a functional
word and a content word was chosen as the topic of modeling. Such combinations
are very frequent, and their outputs are the important immediate constituents of
sentences. We found their parallel in Object-Oriented Programming, where an
all-methods element serves as an interface and a contentful element serves as its
implementation, defining computational objects. The fit of the model was
tested by deriving three functional characteristics crucial for the algorithm
and checking their presence in English grammar. We tested the reality of the
interface-implementation mechanism on psycholinguistic and neurolinguistic
evidence concerning processing, development and loss of syntax. The close fit
and psychological reality of the mechanism suggest that a paradigm shift to an
algorithmic theory of syntax is a possibility.
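To make the interface-implementation analogy concrete, here is a minimal sketch in TypeScript. It is not from the paper; the Referring interface, the Dog class, and all method names are illustrative assumptions. It shows how a functional word such as the determiner "the" could act as a pure interface (methods only, no content) while a content word such as "dog" supplies the implementation, the combination defining a computational object, here a noun phrase:

```typescript
// Illustrative sketch only. The interface plays the role of the
// functional word (e.g., the determiner "the"): it exposes methods
// but carries no lexical content of its own.
interface Referring {
  refer(): string;       // pick out an entity in discourse
  isDefinite(): boolean; // definiteness, contributed by the determiner
}

// The content word (e.g., "dog") implements the interface, supplying
// the lexical content behind the behavior the interface promises.
class Dog implements Referring {
  constructor(private readonly definite: boolean) {}

  refer(): string {
    return (this.definite ? "the" : "a") + " dog";
  }

  isDefinite(): boolean {
    return this.definite;
  }
}

// The functional-word + content-word combination yields a computational
// object: a noun phrase usable wherever a Referring element is expected,
// mirroring an immediate constituent of a sentence.
const nounPhrase: Referring = new Dog(true);
console.log(nounPhrase.refer()); // "the dog"
```

Under this reading, the functional word contributes only the operations a constituent must support, the content word fills them in, and the resulting object can be slotted wherever the grammar expects a referring expression.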
Related papers
- A Complexity-Based Theory of Compositionality [53.025566128892066]
In AI, compositional representations can enable a powerful form of out-of-distribution generalization.
Here, we propose a formal definition of compositionality that accounts for and extends our intuitions about compositionality.
The definition is conceptually simple, quantitative, grounded in algorithmic information theory, and applicable to any representation.
arXiv Detail & Related papers (2024-10-18T18:37:27Z)
- Mobile Sequencers [0.0]
The article is an attempt to contribute to explorations of a common origin for language and planned-collaborative action.
It gives 'semantics of change' center stage in the synthesis, from its history and recordkeeping to its development, syntax, delivery, and reception.
arXiv Detail & Related papers (2024-05-09T12:39:50Z)
- Language Evolution with Deep Learning [49.879239655532324]
Computational modeling plays an essential role in the study of language emergence.
It aims to simulate the conditions and learning processes that could trigger the emergence of a structured language.
This chapter explores another class of computational models that have recently revolutionized the field of machine learning: deep learning models.
arXiv Detail & Related papers (2024-03-18T16:52:54Z)
- Generative Models as a Complex Systems Science: How can we make sense of large language model behavior? [75.79305790453654]
Coaxing out desired behaviors from pretrained models, while avoiding undesirable ones, has redefined NLP.
We argue for a systematic effort to decompose language model behavior into categories that explain cross-task performance.
arXiv Detail & Related papers (2023-07-31T22:58:41Z)
- From Word Models to World Models: Translating from Natural Language to the Probabilistic Language of Thought [124.40905824051079]
We propose rational meaning construction, a computational framework for language-informed thinking.
We frame linguistic meaning as a context-sensitive mapping from natural language into a probabilistic language of thought.
We show that LLMs can generate context-sensitive translations that capture pragmatically-appropriate linguistic meanings.
We extend our framework to integrate cognitively-motivated symbolic modules.
arXiv Detail & Related papers (2023-06-22T05:14:00Z)
- APOLLO: A Simple Approach for Adaptive Pretraining of Language Models for Logical Reasoning [73.3035118224719]
We propose APOLLO, an adaptively pretrained language model that has improved logical reasoning abilities.
APOLLO performs comparably on ReClor and outperforms baselines on LogiQA.
arXiv Detail & Related papers (2022-12-19T07:40:02Z)
- Categorical Tools for Natural Language Processing [0.0]
This thesis develops the translation between category theory and computational linguistics.
The three chapters deal with syntax, semantics and pragmatics.
The resulting functorial models can be composed to form games where equilibria are the solutions of language processing tasks.
arXiv Detail & Related papers (2022-12-13T15:12:37Z)
- Category Theory for Quantum Natural Language Processing [0.0]
This thesis introduces quantum natural language processing (QNLP) models based on an analogy between computational linguistics and quantum mechanics.
The grammatical structure of text and sentences connects the meaning of words in the same way that entanglement structure connects the states of quantum systems.
We turn this abstract analogy into a concrete algorithm that translates the grammatical structure onto the architecture of parameterised quantum circuits.
We then use a hybrid classical-quantum algorithm to train the model so that evaluating the circuits computes the meaning of sentences in data-driven tasks.
arXiv Detail & Related papers (2022-12-13T14:38:57Z)
- Benchmarking Language Models for Code Syntax Understanding [79.11525961219591]
Pre-trained language models have demonstrated impressive performance in both natural language processing and program understanding.
In this work, we perform the first thorough benchmarking of the state-of-the-art pre-trained models for identifying the syntactic structures of programs.
Our findings point out key limitations of existing pre-training methods for programming languages, and suggest the importance of modeling code syntactic structures.
arXiv Detail & Related papers (2022-10-26T04:47:18Z)
- Generalized Optimal Linear Orders [9.010643838773477]
The sequential structure of language, and the order of words in a sentence specifically, plays a central role in human language processing.
In designing computational models of language, the de facto approach is to present sentences to machines with the words in the same order as in the original human-authored sentence.
This work questions the implicit assumption that this ordering is desirable and injects theoretical soundness into the treatment of word order in natural language processing.
arXiv Detail & Related papers (2021-08-13T13:10:15Z)
- Modelling Compositionality and Structure Dependence in Natural Language [0.12183405753834563]
Drawing on linguistics and set theory, the first half of this thesis presents a formalisation of these ideas.
We see how cognitive systems that process language must satisfy certain functional constraints.
Using advances in word-embedding techniques, a model of relational learning is simulated.
arXiv Detail & Related papers (2020-11-22T17:28:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.