The Combinatorics of \textit{Salva Veritate} Principles
- URL: http://arxiv.org/abs/2201.05173v1
- Date: Thu, 13 Jan 2022 19:00:56 GMT
- Title: The Combinatorics of \textit{Salva Veritate} Principles
- Authors: Norman E. Trushaev
- Abstract summary: Concepts of grammatical compositionality arise in many theories of both natural and artificial languages.
We propose that many instances of compositionality should entail non-trivial claims about the expressive power of languages.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Various concepts of grammatical compositionality arise in many theories of
both natural and artificial languages, and often play a key role in accounts of
the syntax-semantics interface. We propose that many instances of
compositionality should entail non-trivial combinatorial claims about the
expressive power of languages which satisfy these compositional properties. As
an example, we present a formal analysis demonstrating that a particular class
of languages which admit salva veritate substitutions - a property which we
claim to be a particularly strong example of a compositional principle - must
also satisfy a very natural combinatorial constraint identified in this paper.
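The salva veritate (truth-preserving) substitution property discussed in the abstract can be illustrated with a small sketch. The toy lexicon, the `truth_value` function, and the example terms below are illustrative assumptions, not the paper's formalism; they merely show the classical idea that co-extensional terms may be interchanged in an extensional language without changing truth values.

```python
# A minimal sketch (not from the paper) of a salva veritate substitution
# check in a toy extensional language. Lexicon and predicates are
# hypothetical examples.

# Each term denotes an individual; each predicate denotes a set of individuals.
LEXICON = {
    "hesperus": "venus",
    "phosphorus": "venus",  # co-extensional with "hesperus"
    "mars": "mars",
}
PREDICATES = {
    "is_a_planet": {"venus", "mars"},
    "is_mars": {"mars"},
}

def truth_value(term: str, predicate: str) -> bool:
    """Evaluate the atomic sentence predicate(term) extensionally."""
    return LEXICON[term] in PREDICATES[predicate]

def substitution_is_salva_veritate(t1: str, t2: str) -> bool:
    """True iff swapping t1 for t2 preserves truth in every atomic sentence."""
    return all(truth_value(t1, p) == truth_value(t2, p) for p in PREDICATES)

# Co-extensional terms are interchangeable salva veritate...
assert substitution_is_salva_veritate("hesperus", "phosphorus")
# ...while terms with different denotations are not.
assert not substitution_is_salva_veritate("hesperus", "mars")
```

In this toy setting the substitutable term pairs are exactly those with equal denotations; the paper's combinatorial constraint concerns what such substitution classes imply about the expressive power of the language as a whole.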
Related papers
- What makes Models Compositional? A Theoretical View: With Supplement [60.284698521569936]
We propose a general neuro-symbolic definition of compositional functions and their compositional complexity.
We show how various existing general and special purpose sequence processing models fit this definition and use it to analyze their compositional complexity.
arXiv Detail & Related papers (2024-05-02T20:10:27Z) - Position Paper: Generalized grammar rules and structure-based
generalization beyond classical equivariance for lexical tasks and
transduction [7.523978255716284]
We propose a general framework for building models that can generalize compositionally using the concept of Generalized Grammar Rules (GGRs)
Our framework is general enough to contain many existing works as special cases.
We present ideas on how GGRs might be implemented, and in the process draw connections to reinforcement learning and other areas of research.
arXiv Detail & Related papers (2024-02-02T18:44:37Z) - Contribución de la semántica combinatoria al desarrollo de
herramientas digitales multilingües (Contribution of Combinatorial Semantics to the Development of Multilingual Digital Tools) [0.0]
This paper describes how the field of Combinatorial Semantics has contributed to the design of three prototypes for the automatic generation of argument patterns in nominal phrases in Spanish, French and German.
It also shows the importance of knowing about the argument syntactic-semantic interface in a production situation in the context of foreign languages.
arXiv Detail & Related papers (2023-12-26T19:32:05Z) - An Encoding of Abstract Dialectical Frameworks into Higher-Order Logic [57.24311218570012]
This approach allows for the computer-assisted analysis of abstract dialectical frameworks.
Exemplary applications include the formal analysis and verification of meta-theoretical properties.
arXiv Detail & Related papers (2023-12-08T09:32:26Z) - The Role of Linguistic Priors in Measuring Compositional Generalization
of Vision-Language Models [64.43764443000003]
We identify two sources of visual-linguistic compositionality: linguistic priors and the interplay between images and texts.
We propose a new metric for compositionality without such linguistic priors.
arXiv Detail & Related papers (2023-10-04T12:48:33Z) - Im-Promptu: In-Context Composition from Image Prompts [10.079743487034762]
We investigate whether analogical reasoning can enable in-context composition over composable elements of visual stimuli.
We use Im-Promptu to train agents with different levels of compositionality, including vector representations, patch representations, and object slots.
Our experiments reveal tradeoffs between extrapolation abilities and the degree of compositionality, with non-compositional representations extending learned composition rules to unseen domains but performing poorly on tasks.
arXiv Detail & Related papers (2023-05-26T21:10:11Z) - Geometry of Language [0.0]
We present a fresh perspective on language, combining ideas from various sources, but mixed in a new synthesis.
The question is whether we can formulate an elegant formalism, a universal grammar or a mechanism which explains significant aspects of the human faculty of language.
We describe such a mechanism, which differs from existing logical and grammatical approaches by its geometric nature.
arXiv Detail & Related papers (2023-03-09T12:22:28Z) - Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-14T07:23:46Z) - Decomposing lexical and compositional syntax and semantics with deep
language models [82.81964713263483]
The activations of language transformers like GPT2 have been shown to linearly map onto brain activity during speech comprehension.
Here, we propose a taxonomy to factorize the high-dimensional activations of language models into four classes: lexical, compositional, syntactic, and semantic representations.
The results highlight two findings. First, compositional representations recruit a more widespread cortical network than lexical ones, and encompass the bilateral temporal, parietal and prefrontal cortices.
arXiv Detail & Related papers (2021-03-02T10:24:05Z) - Compositional Generalization via Semantic Tagging [81.24269148865555]
We propose a new decoding framework that preserves the expressivity and generality of sequence-to-sequence models.
We show that the proposed approach consistently improves compositional generalization across model architectures, domains, and semantic formalisms.
arXiv Detail & Related papers (2020-10-22T15:55:15Z) - Conjunctive Queries: Unique Characterizations and Exact Learnability [0.0]
We present a new efficient exact learning algorithm for a class of conjunctive queries.
We also discuss implications for constructing frontiers in the homomorphism lattice of finite structures.
arXiv Detail & Related papers (2020-08-16T02:54:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.