Fuzzy Generalised Quantifiers for Natural Language in Categorical
Compositional Distributional Semantics
- URL: http://arxiv.org/abs/2109.11227v1
- Date: Thu, 23 Sep 2021 09:15:15 GMT
- Title: Fuzzy Generalised Quantifiers for Natural Language in Categorical
Compositional Distributional Semantics
- Authors: Matej Dostal, Mehrnoosh Sadrzadeh, Gijs Wijnholds
- Abstract summary: We consider fuzzy versions of quantifiers along the lines of Zadeh.
We show that this category is a concrete instantiation of the compositional distributional model.
- Score: 5.2424255020469595
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent work on compositional distributional models shows that bialgebras over
finite dimensional vector spaces can be applied to treat generalised
quantifiers for natural language. That technique requires one to construct the
vector space over powersets, and therefore is computationally costly. In this
paper, we overcome this problem by considering fuzzy versions of quantifiers
along the lines of Zadeh, within the category of many valued relations. We show
that this category is a concrete instantiation of the compositional
distributional model. We show that the semantics obtained in this model is
equivalent to the semantics of the fuzzy quantifiers of Zadeh. As a result, we
are now able to treat fuzzy quantification without requiring a powerset
construction.
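As a rough illustration of the Zadeh-style fuzzy quantifiers the abstract refers to (a minimal sketch, not the paper's categorical construction): Zadeh evaluates a proportional quantifier such as "most" by taking the relative sigma-count of a fuzzy predicate and passing it through a monotone membership function. The piecewise-linear definition of `most` and the example membership degrees below are invented for illustration.

```python
# Sketch of Zadeh-style fuzzy quantification (illustrative assumptions only;
# the membership function and example degrees are not from the paper).

def sigma_count(degrees):
    """Zadeh's sigma-count: the sum of membership degrees of a fuzzy set."""
    return sum(degrees)

def most(proportion):
    """A hypothetical piecewise-linear membership function for 'most':
    0 below 0.3, 1 above 0.8, linear interpolation in between."""
    if proportion <= 0.3:
        return 0.0
    if proportion >= 0.8:
        return 1.0
    return (proportion - 0.3) / (0.8 - 0.3)

def evaluate(quantifier, restrictor, scope):
    """Truth degree of 'Q restrictor are scope' via relative sigma-counts.
    restrictor, scope: dicts mapping individuals to membership degrees."""
    # Degree to which each individual lies in both fuzzy sets (min = fuzzy 'and').
    overlap = [min(restrictor[x], scope.get(x, 0.0)) for x in restrictor]
    denom = sigma_count(restrictor.values())
    if denom == 0:
        return 0.0
    return quantifier(sigma_count(overlap) / denom)

# 'Most birds fly', with made-up membership degrees:
birds = {"sparrow": 1.0, "penguin": 1.0, "bat": 0.3}
fliers = {"sparrow": 1.0, "penguin": 0.0, "bat": 0.9}
truth = evaluate(most, birds, fliers)
```

The point of the paper is that such degrees can be computed directly over many-valued relations, with no intermediate powerset construction.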
Related papers
- What makes Models Compositional? A Theoretical View: With Supplement
We propose a general neuro-symbolic definition of compositional functions and their compositional complexity.
We show how various existing general and special purpose sequence processing models fit this definition and use it to analyze their compositional complexity.
arXiv Detail & Related papers (2024-05-02T20:10:27Z)
- Quantization of Large Language Models with an Overdetermined Basis
We introduce an algorithm for data quantization based on the principles of Kashin representation.
Our findings demonstrate that Kashin Quantization achieves competitive or superior quality in model performance.
arXiv Detail & Related papers (2024-04-15T12:38:46Z)
- Enriching Diagrams with Algebraic Operations
We extend diagrammatic reasoning in monoidal categories with algebraic operations and equations.
We show how this construction can be used for diagrammatic reasoning of noise in quantum systems.
arXiv Detail & Related papers (2023-10-17T14:12:39Z)
- Linear Spaces of Meanings: Compositional Structures in Vision-Language Models
We investigate compositional structures in data embeddings from pre-trained vision-language models (VLMs)
We first present a framework for understanding compositional structures from a geometric perspective.
We then explain what these structures entail probabilistically in the case of VLM embeddings, providing intuitions for why they arise in practice.
arXiv Detail & Related papers (2023-02-28T08:11:56Z)
- Structural generalization is hard for sequence-to-sequence models
Sequence-to-sequence (seq2seq) models have been successful across many NLP tasks.
Recent work on compositional generalization has shown that seq2seq models achieve very low accuracy in generalizing to linguistic structures that were not seen in training.
arXiv Detail & Related papers (2022-10-24T09:03:03Z)
- Learning Algebraic Recombination for Compositional Generalization
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- Language Modeling with Reduced Densities
We show that sequences of symbols from a finite alphabet, such as those found in a corpus of text, form a category enriched over probabilities.
We then address a second fundamental question: How can this information be stored and modeled in a way that preserves the categorical structure?
arXiv Detail & Related papers (2020-07-08T00:41:53Z)
- Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections
Sequential data such as time series, video, or text can be challenging to analyse.
At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning.
We use a classical mathematical object -- the tensor algebra -- to capture such dependencies.
arXiv Detail & Related papers (2020-06-12T09:24:35Z)
- Linguists Who Use Probabilistic Models Love Them: Quantification in Functional Distributional Semantics
I show how the previous formulation gives trivial truth values when a precise quantifier is used with vague predicates.
I propose an improved account, avoiding this problem by treating a vague predicate as a distribution over precise predicates.
I explain how the generic quantifier can be both pragmatically complex and yet computationally simpler than precise quantifiers.
arXiv Detail & Related papers (2020-06-04T16:48:45Z)
- Towards logical negation for compositional distributional semantics
The categorical compositional distributional model of meaning gives the composition of words into phrases and sentences pride of place.
This paper takes some steps towards providing this operator, modelling it as a version of projection onto the subspace orthogonal to a word.
arXiv Detail & Related papers (2020-05-11T08:51:30Z)
- Categorical Vector Space Semantics for Lambek Calculus with a Relevant Modality
We develop a categorical distributional semantics for Lambek Calculus with a Relevant Modality, !L*.
We instantiate this category to finite dimensional vector spaces and linear maps via "quantisation" functors.
We apply the model to construct categorical and concrete semantic interpretations for the motivating example of !L*: the derivation of a phrase with a parasitic gap.
arXiv Detail & Related papers (2020-05-06T18:58:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.