The Compositional Structure of Bayesian Inference
- URL: http://arxiv.org/abs/2305.06112v2
- Date: Thu, 20 Jul 2023 09:13:06 GMT
- Title: The Compositional Structure of Bayesian Inference
- Authors: Dylan Braithwaite, Jules Hedges, Toby St Clere Smithe
- Abstract summary: Bayes' rule tells us how to invert a causal process in order to update our beliefs in light of new evidence.
We study the structure of this compositional rule, noting that it relates to the lens pattern in functional programming.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayes' rule tells us how to invert a causal process in order to update our
beliefs in light of new evidence. If the process is believed to have a complex
compositional structure, we may observe that the inversion of the whole can be
computed piecewise in terms of the component processes. We study the structure
of this compositional rule, noting that it relates to the lens pattern in
functional programming. Working in a suitably general axiomatic presentation of
a category of Markov kernels, we see how we can think of Bayesian inversion as
a particular instance of a state-dependent morphism in a fibred category. We
discuss the compositional nature of this, formulated as a functor on the
underlying category, and explore how this can be used for a more type-driven
approach to statistical inference.
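To make the composition rule concrete, here is a minimal Haskell sketch for finite discrete Markov kernels (Kleisli maps for a finite distribution monad, one concrete instance of the axiomatic setting the abstract describes). All names (Dist, Kernel, invert, invertComposite) are illustrative, not taken from the paper; the sketch checks that inverting a composite kernel directly agrees with the lens-like piecewise rule: invert the second kernel at the pushforward prior, then the first at the original prior.

```haskell
import qualified Data.Map.Strict as M

-- A finite distribution: outcomes weighted by probability.
type Dist a = M.Map a Double

-- A Markov kernel from X to Y.
type Kernel x y = x -> Dist y

-- Push a distribution through a kernel (monadic bind).
bind :: Ord b => Dist a -> Kernel a b -> Dist b
bind d k = M.fromListWith (+)
  [ (b, p * q) | (a, p) <- M.toList d, (b, q) <- M.toList (k a) ]

-- Kernel composition, and pushforward of a prior along a kernel.
comp :: Ord z => Kernel x y -> Kernel y z -> Kernel x z
comp f g x = bind (f x) g

push :: Ord y => Dist x -> Kernel x y -> Dist y
push = bind

-- Bayes' rule: invert a kernel X -> Y at a prior on X,
-- yielding the posterior kernel Y -> X.
invert :: (Ord x, Ord y) => Dist x -> Kernel x y -> Kernel y x
invert prior k y = normalize $ M.fromListWith (+)
  [ (x, px * M.findWithDefault 0 y (k x)) | (x, px) <- M.toList prior ]

normalize :: Dist a -> Dist a
normalize d = M.map (/ sum (M.elems d)) d

-- The piecewise (lens-like) inversion of g . f at a prior:
-- invert g at the pushforward prior, then invert f at the original prior.
invertComposite :: (Ord x, Ord y, Ord z)
                => Dist x -> Kernel x y -> Kernel y z -> Kernel z x
invertComposite prior f g z =
  bind (invert (push prior f) g z) (invert prior f)

main :: IO ()
main = do
  let prior = M.fromList [(0 :: Int, 0.5), (1, 0.5)]  -- uniform prior on X
      f x = M.fromList [(x, 0.8), (1 - x, 0.2)]       -- noisy channel X -> Y
      g y = M.fromList [(y, 0.9), (1 - y, 0.1)]       -- noisy channel Y -> Z
  print (invert prior (comp f g) 1)    -- monolithic inversion of g . f
  print (invertComposite prior f g 1)  -- piecewise inversion; these agree
```

Running main prints the same posterior twice (about 0.74 mass on x = 1 for these toy channels), illustrating the claim that the inversion of the whole can be computed piecewise in terms of the component processes.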
Related papers
- Self-Attention as a Parametric Endofunctor: A Categorical Framework for Transformer Architectures [0.0]
We develop a category-theoretic framework focusing on the linear components of self-attention.
We show that the query, key, and value maps naturally define a parametric 1-morphism in the 2-category $\mathbf{Para}(\mathbf{Vect})$.
Stacking multiple self-attention layers corresponds to constructing the free monad on this endofunctor.
arXiv Detail & Related papers (2025-01-06T11:14:18Z)
- A Compositional Atlas for Algebraic Circuits [35.95450187283255]
We show that a large class of queries correspond to a combination of basic operators over semirings: aggregation, product, and elementwise mapping.
Applying our analysis, we derive novel tractability conditions for many such compositional queries.
arXiv Detail & Related papers (2024-12-07T00:51:46Z)
- Compositional Structures in Neural Embedding and Interaction Decompositions [101.40245125955306]
We describe a basic correspondence between linear algebraic structures within vector embeddings in artificial neural networks.
We introduce a characterization of compositional structures in terms of "interaction decompositions".
We establish necessary and sufficient conditions for the presence of such structures within the representations of a model.
arXiv Detail & Related papers (2024-07-12T02:39:50Z)
- What makes Models Compositional? A Theoretical View: With Supplement [60.284698521569936]
We propose a general neuro-symbolic definition of compositional functions and their compositional complexity.
We show how various existing general and special purpose sequence processing models fit this definition and use it to analyze their compositional complexity.
arXiv Detail & Related papers (2024-05-02T20:10:27Z)
- Nonparametric Partial Disentanglement via Mechanism Sparsity: Sparse Actions, Interventions and Sparse Temporal Dependencies [58.179981892921056]
This work introduces a novel principle for disentanglement we call mechanism sparsity regularization.
We propose a representation learning method that induces disentanglement by simultaneously learning the latent factors.
We show that the latent factors can be recovered by regularizing the learned causal graph to be sparse.
arXiv Detail & Related papers (2024-01-10T02:38:21Z)
- HiPerformer: Hierarchically Permutation-Equivariant Transformer for Time Series Forecasting [56.95572957863576]
We propose a hierarchically permutation-equivariant model that considers both the relationship among components in the same group and the relationship among groups.
The experiments conducted on real-world data demonstrate that the proposed method outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2023-05-14T05:11:52Z)
- Linear Spaces of Meanings: Compositional Structures in Vision-Language Models [110.00434385712786]
We investigate compositional structures in data embeddings from pre-trained vision-language models (VLMs).
We first present a framework for understanding compositional structures from a geometric perspective.
We then explain what these structures entail probabilistically in the case of VLM embeddings, providing intuitions for why they arise in practice.
arXiv Detail & Related papers (2023-02-28T08:11:56Z)
- Mathematical Foundations for a Compositional Account of the Bayesian Brain [0.0]
We use the tools of contemporary applied category theory to supply functorial semantics for approximate inference.
We define fibrations of statistical games and classify various problems of statistical inference as corresponding sections.
We construct functors which explain the compositional structure of predictive coding neural circuits under the free energy principle.
arXiv Detail & Related papers (2022-12-23T18:58:17Z)
- Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined heuristics.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
arXiv Detail & Related papers (2022-11-11T21:58:15Z)