Information algebras of coherent sets of gambles
- URL: http://arxiv.org/abs/2102.13368v1
- Date: Fri, 26 Feb 2021 09:36:39 GMT
- Title: Information algebras of coherent sets of gambles
- Authors: Juerg Kohlas, Arianna Casanova, Marco Zaffalon
- Abstract summary: We show that coherent sets of gambles can be embedded into the algebraic structure of an information algebra.
This leads to a new perspective on the algebraic and logical structure of desirability.
- Score: 1.697342683039794
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we show that coherent sets of gambles can be embedded into the
algebraic structure of an information algebra. This leads, firstly, to a new
perspective on the algebraic and logical structure of desirability and,
secondly, connects desirability, and hence imprecise probabilities, to other
formalisms in computer science that share the same underlying structure. Both
the domain-free and the labeled view of the information algebra of coherent
sets of gambles are presented, considering a special case of possibility space.
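As a quick reference for the objects the abstract talks about, here is a minimal sketch using the standard desirability axioms and notation ($\mathcal{L}(\Omega)$, posi, the natural-extension closure); it is a plausible rendering of the setting, and the paper's exact formulation may differ.

```latex
% Sketch of the standard desirability setting; notation is assumed, not taken
% verbatim from the paper.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $\Omega$ be a possibility space and $\mathcal{L}(\Omega)$ the set of gambles
(bounded real-valued functions) on $\Omega$, with
$\mathcal{L}^{+}(\Omega) = \{ f \in \mathcal{L}(\Omega) : f \geq 0,\ f \neq 0 \}$.
A set $D \subseteq \mathcal{L}(\Omega)$ is \emph{coherent} when
\begin{align}
  &0 \notin D, \qquad \mathcal{L}^{+}(\Omega) \subseteq D, \\
  &f \in D,\ \lambda > 0 \ \Rightarrow\ \lambda f \in D, \qquad
  f, g \in D \ \Rightarrow\ f + g \in D,
\end{align}
so $D$ is a convex cone that contains the positive gambles and excludes $0$.
Writing $\operatorname{posi}(K)$ for the set of positive linear combinations of
elements of $K$ and $\mathcal{C}(K) = \operatorname{posi}\!\left(\mathcal{L}^{+}(\Omega) \cup K\right)$
for the induced closure, combining two pieces of information can be taken to be
\begin{equation}
  D_{1} \cdot D_{2} = \mathcal{C}(D_{1} \cup D_{2}),
\end{equation}
and it is this combination, together with a suitable extraction
(marginalization) operation, that the information-algebra axioms capture.

\end{document}
```

Under this reading, combination is commutative and associative, and combining a coherent set with itself gives it back, which is the kind of behaviour the information-algebra axioms require.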
Related papers
- Compositional Structures in Neural Embedding and Interaction Decompositions [101.40245125955306]
We describe a basic correspondence between linear algebraic structures within vector embeddings in artificial neural networks.
We introduce a characterization of compositional structures in terms of "interaction decompositions".
We establish necessary and sufficient conditions for the presence of such structures within the representations of a model.
arXiv Detail & Related papers (2024-07-12T02:39:50Z)
- Enriching Diagrams with Algebraic Operations [49.1574468325115]
We extend diagrammatic reasoning in monoidal categories with algebraic operations and equations.
We show how this construction can be used for diagrammatic reasoning of noise in quantum systems.
arXiv Detail & Related papers (2023-10-17T14:12:39Z)
- Linear Spaces of Meanings: Compositional Structures in Vision-Language Models [110.00434385712786]
We investigate compositional structures in data embeddings from pre-trained vision-language models (VLMs).
We first present a framework for understanding compositional structures from a geometric perspective.
We then explain what these structures entail probabilistically in the case of VLM embeddings, providing intuitions for why they arise in practice.
arXiv Detail & Related papers (2023-02-28T08:11:56Z)
- Quantum computing with anyons: an $F$-matrix and braid calculator [0.0]
We introduce a pentagon equation solver, available as part of SageMath, and use it to construct braid group representations associated to certain anyon systems.
We present anyons abstractly as sets of labels together with a collection of data satisfying a number of axioms.
In the language of RFCs, our solver can produce $F$-matrices for anyon systems corresponding to multiplicity-free fusion rings.
arXiv Detail & Related papers (2022-12-01T19:31:17Z)
- Knowledgebra: An Algebraic Learning Framework for Knowledge Graph [15.235089177507897]
Knowledge graph (KG) representation learning aims to encode entities and relations into dense continuous vector spaces such that the knowledge contained in a dataset can be consistently represented.
We developed a mathematical language for KGs based on an observation of their inherent algebraic structure, which we termed Knowledgebra.
We implemented an instantiation model, SemE, using simple matrix semigroups, which exhibits state-of-the-art performance on standard datasets.
arXiv Detail & Related papers (2022-04-15T04:53:47Z)
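The Knowledgebra entry above appeals to matrix semigroups. As a generic illustration of that idea only (entities as vectors, relations as square matrices, composition of relations as matrix multiplication), and not of the SemE model itself, consider the sketch below; every name and dimension here is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # illustrative embedding dimension

# Entities as vectors, relations as square matrices. Square matrices are
# closed and associative under multiplication, so the relation embeddings
# generate a semigroup, which is the algebraic structure being exploited.
entities = {name: rng.normal(size=DIM) for name in ("alice", "bob", "acme")}
relations = {name: rng.normal(size=(DIM, DIM)) for name in ("works_at", "knows")}

def score(head: str, rel: str, tail: str) -> float:
    """Plausibility of the triple (head, rel, tail); larger is more plausible."""
    h, M, t = entities[head], relations[rel], entities[tail]
    return -float(np.linalg.norm(h @ M - t))

def compose(rel_a: str, rel_b: str) -> np.ndarray:
    """Matrix for the composite relation 'rel_a followed by rel_b'."""
    return relations[rel_a] @ relations[rel_b]

print(score("alice", "works_at", "acme"))
# Associativity, the defining semigroup law, holds up to floating point:
A, B, C = relations["works_at"], relations["knows"], rng.normal(size=(DIM, DIM))
print(np.allclose((A @ B) @ C, A @ (B @ C)))
```

In an actual model the vectors and matrices would be trained so that true triples score higher than corrupted ones; here they are random, since the point is only the algebra.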
- Algebras of Sets and Coherent Sets of Gambles [1.697342683039794]
We show how to construct an information algebra of coherent sets of gambles defined on general possibility spaces.
This paper also details how propositional logic is naturally embedded into the theory of imprecise probabilities.
arXiv Detail & Related papers (2021-05-27T08:14:38Z)
- Information algebras of coherent sets of gambles in general possibility spaces [1.697342683039794]
We show that coherent sets of gambles can be embedded into the algebraic structure of an information algebra.
This leads to a new perspective on the algebraic and logical structure of desirability.
arXiv Detail & Related papers (2021-05-25T16:18:39Z)
- Compositional Processing Emerges in Neural Networks Solving Math Problems [100.80518350845668]
Recent progress in artificial neural networks has shown that when large models are trained on enough linguistic data, grammatical structure emerges in their representations.
We extend this work to the domain of mathematical reasoning, where it is possible to formulate precise hypotheses about how meanings should be composed.
Our work shows that neural networks are not only able to infer something about the structured relationships implicit in their training data, but can also deploy this knowledge to guide the composition of individual meanings into composite wholes.
arXiv Detail & Related papers (2021-05-19T07:24:42Z)
- Lattice Representation Learning [6.427169570069738]
We introduce theory and algorithms for learning discrete representations that take values on a lattice embedded in a Euclidean space.
Lattice representations possess an interesting combination of properties: a) they can be computed explicitly using lattice quantization, yet they can be learned efficiently using the ideas we introduce.
This article will focus on laying the groundwork for exploring and exploiting the first two properties, including a new mathematical result linking expressions used during training and inference time and experimental validation on two popular datasets.
arXiv Detail & Related papers (2020-06-24T16:05:11Z)
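The Lattice Representation Learning entry above leans on lattice quantization being explicitly computable. The snippet below shows only that basic operation for the simplest possible lattice, a scaled copy of the integer lattice Z^n; it is an assumption-laden illustration, not the paper's learning procedure.

```python
import numpy as np

def quantize_to_lattice(x: np.ndarray, step: float = 0.5) -> np.ndarray:
    """Map each coordinate of x to the nearest point of the lattice step * Z^n.

    For the integer lattice, nearest-point search is just coordinate-wise
    rounding; richer lattices need dedicated nearest-point algorithms.
    """
    return step * np.round(x / step)

rng = np.random.default_rng(1)
x = rng.normal(size=(3, 8))   # a batch of continuous representations
z = quantize_to_lattice(x)    # their discrete lattice counterparts
# Quantization error is bounded by step / 2 in every coordinate.
print(float(np.max(np.abs(x - z))))
```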
- Pairwise Supervision Can Provably Elicit a Decision Boundary [84.58020117487898]
Similarity learning is the problem of eliciting useful representations by predicting the relationship between a pair of patterns.
We show that similarity learning is capable of solving binary classification by directly eliciting a decision boundary.
arXiv Detail & Related papers (2020-06-11T05:35:16Z)
- A Theory of Usable Information Under Computational Constraints [103.5901638681034]
We propose a new framework for reasoning about information in complex systems.
Our foundation is based on a variational extension of Shannon's information theory.
We show that by incorporating computational constraints, $\mathcal{V}$-information can be reliably estimated from data.
arXiv Detail & Related papers (2020-02-25T06:09:30Z)
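As background for the last entry, here is how $\mathcal{V}$-information is usually defined in that framework, written from memory of the cited paper's setting and therefore to be checked against the original.

```latex
% Sketch of the usual V-information definitions; the precise conditions on the
% predictive family V follow the cited paper.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Fix a \emph{predictive family} $\mathcal{V}$ of allowed predictors $f$, where
$f[x]$ is a distribution over outcomes $y$ and $f[\varnothing]$ is a prediction
made without side information. The conditional $\mathcal{V}$-entropies are
\begin{equation}
  H_{\mathcal{V}}(Y \mid X) = \inf_{f \in \mathcal{V}}
    \mathbb{E}_{x,y}\!\left[ -\log f[x](y) \right],
  \qquad
  H_{\mathcal{V}}(Y \mid \varnothing) = \inf_{f \in \mathcal{V}}
    \mathbb{E}_{y}\!\left[ -\log f[\varnothing](y) \right],
\end{equation}
and the $\mathcal{V}$-information from $X$ to $Y$ is
\begin{equation}
  I_{\mathcal{V}}(X \to Y) = H_{\mathcal{V}}(Y \mid \varnothing) - H_{\mathcal{V}}(Y \mid X),
\end{equation}
which reduces to Shannon mutual information when $\mathcal{V}$ is unrestricted,
and which becomes estimable from data once $\mathcal{V}$ is computationally
constrained.

\end{document}
```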
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.