First order linear logic and tensor type calculus for categorial grammars
- URL: http://arxiv.org/abs/2112.15253v1
- Date: Fri, 31 Dec 2021 00:35:48 GMT
- Title: First order linear logic and tensor type calculus for categorial grammars
- Authors: Sergey Slavnov
- Abstract summary: We study the relationship between first order multiplicative linear logic (MLL1) and extended tensor type calculus (ETTC).
We identify a fragment of MLL1 that seems sufficient for many grammar representations, and establish a correspondence between ETTC and this fragment.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the relationship between first order multiplicative linear
logic (MLL1), which is known to provide representations of various categorial
grammars, and the recently introduced extended tensor type calculus (ETTC). We
identify a fragment of MLL1 that seems sufficient for many grammar
representations, and establish a correspondence between ETTC and this fragment.
ETTC can thus be seen as an alternative syntax and an intrinsic deductive
system for this fragment, together with a geometric representation. We also
give a natural deduction formulation of ETTC, which might prove convenient.
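To make the MLL1 side concrete, the following is a brief sketch of the well-known translation of Lambek categorial types into first-order formulas over string positions (a standard illustration from the literature on MLL1 representations of categorial grammars, not taken from this paper's abstract; the position indices in the transitive-verb example are illustrative):

```latex
% Translation of Lambek types into MLL1, with string positions
% x, y as first-order variables (\multimap is linear implication).
\[
\|p\|^{x,y} = p(x,y) \qquad \text{($p$ atomic)}
\]
\[
\|A/B\|^{x,y} = \forall z\,\bigl(\|B\|^{y,z} \multimap \|A\|^{x,z}\bigr)
\qquad
\|B\backslash A\|^{x,y} = \forall z\,\bigl(\|B\|^{z,x} \multimap \|A\|^{z,y}\bigr)
\]
% Example: a transitive-verb type $(np\backslash s)/np$ occupying
% string positions $(1,2)$ unfolds to
\[
\|(np\backslash s)/np\|^{1,2}
= \forall z\,\bigl(np(2,z) \multimap \forall w\,\bigl(np(w,1) \multimap s(w,z)\bigr)\bigr)
\]
```

The fragment of MLL1 identified in the paper is intended to cover formulas of this kind, for which ETTC then provides an alternative syntax.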
Related papers
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
arXiv Detail & Related papers (2024-07-15T07:11:44Z) - On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning [87.73401758641089]
Chain-of-thought (CoT) reasoning has improved the performance of modern language models (LMs)
We show that LMs can represent the same family of distributions over strings as probabilistic Turing machines.
arXiv Detail & Related papers (2024-06-20T10:59:02Z) - Linearity of Relation Decoding in Transformer Language Models [82.47019600662874]
Much of the knowledge encoded in transformer language models (LMs) may be expressed in terms of relations.
We show that, for a subset of relations, this computation is well-approximated by a single linear transformation on the subject representation.
arXiv Detail & Related papers (2023-08-17T17:59:19Z) - Making first order linear logic a generating grammar [0.0]
It is known that different categorial grammars have surface representation in a fragment of first order multiplicative linear logic (MLL1)
We show that the fragment of interest is equivalent to the recently introduced extended tensor type calculus (ETTC)
arXiv Detail & Related papers (2022-06-17T18:11:34Z) - Learning First-Order Rules with Differentiable Logic Program Semantics [12.360002779872373]
We introduce a differentiable inductive logic programming model, called differentiable first-order rule learner (DFOL)
DFOL finds the correct LPs from relational facts by searching for the interpretable matrix representations of LPs.
Experimental results indicate that DFOL is a precise, robust, scalable, and computationally cheap differentiable ILP model.
arXiv Detail & Related papers (2022-04-28T15:33:43Z) - Cobordisms and commutative categorial grammars [0.0]
We propose a concrete surface representation of abstract categorial grammars in the category of word cobordisms or cowordisms for short.
We also introduce and study linear logic grammars, directly based on cobordisms and using classical multiplicative linear logic as a typing system.
arXiv Detail & Related papers (2021-07-19T09:55:21Z) - Structured Reordering for Modeling Latent Alignments in Sequence Transduction [86.94309120789396]
We present an efficient dynamic programming algorithm performing exact marginal inference of separable permutations.
The resulting seq2seq model exhibits better systematic generalization than standard models on synthetic problems and NLP tasks.
arXiv Detail & Related papers (2021-06-06T21:53:54Z) - A Differentiable Relaxation of Graph Segmentation and Alignment for AMR Parsing [75.36126971685034]
We treat alignment and segmentation as latent variables in our model and induce them as part of end-to-end training.
Our method also approaches the performance of a model that relies on the segmentation rules of Lyu and Titov (2018), which were hand-crafted to handle individual AMR constructions.
arXiv Detail & Related papers (2020-10-23T21:22:50Z) - Montague Grammar Induction [4.321645312120979]
This framework provides the analyst fine-grained control over the assumptions that the induced grammar should conform to.
We focus on the relationship between s(emantic)-selection and c(ategory)-selection, using as input a lexicon-scale acceptability judgment dataset.
arXiv Detail & Related papers (2020-10-15T23:25:01Z) - Logical foundations for hybrid type-logical grammars [0.0]
This paper explores proof-theoretic aspects of hybrid type-logical grammars.
We prove some basic properties of the calculus, such as normalisation and the subformula property.
We present both a sequent and a proof net calculus for hybrid type-logical grammars.
arXiv Detail & Related papers (2020-09-22T08:26:14Z) - Generative Language Modeling for Automated Theorem Proving [94.01137612934842]
This work is motivated by the possibility that a major limitation of automated theorem provers compared to humans might be addressable via generation from language models.
We present an automated prover and proof assistant, GPT-f, for the Metamath formalization language, and analyze its performance.
arXiv Detail & Related papers (2020-09-07T19:50:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.