Mathematical Foundations for a Compositional Account of the Bayesian
Brain
- URL: http://arxiv.org/abs/2212.12538v3
- Date: Tue, 19 Dec 2023 15:25:42 GMT
- Title: Mathematical Foundations for a Compositional Account of the Bayesian
Brain
- Authors: Toby St Clere Smithe
- Abstract summary: We use the tools of contemporary applied category theory to supply functorial semantics for approximate inference.
We define fibrations of statistical games and classify various problems of statistical inference as corresponding sections.
We construct functors which explain the compositional structure of predictive coding neural circuits under the free energy principle.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: This dissertation reports some first steps towards a compositional account of
active inference and the Bayesian brain. Specifically, we use the tools of
contemporary applied category theory to supply functorial semantics for
approximate inference. To do so, we define on the 'syntactic' side the new
notion of Bayesian lens and show that Bayesian updating composes according to
the compositional lens pattern. Using Bayesian lenses, and inspired by
compositional game theory, we define fibrations of statistical games and
classify various problems of statistical inference as corresponding sections:
the chain rule of the relative entropy is formalized as a strict section, while
maximum likelihood estimation and the free energy give lax sections. In the
process, we introduce a new notion of 'copy-composition'.
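For readers who think in code, the lens pattern invoked here can be sketched as a pair of maps that compose as below. This is a generic Python illustration, not the dissertation's measure-theoretic construction; the names Lens, forward, and backward are assumptions made for the sketch.

```python
# A minimal sketch of the lens pattern underlying Bayesian lenses.
# The forward map might push a prior along a channel; the backward
# map might perform the corresponding update given an observation.
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

X = TypeVar("X")
Y = TypeVar("Y")
Z = TypeVar("Z")

@dataclass
class Lens(Generic[X, Y]):
    forward: Callable[[X], Y]      # e.g. prediction along a channel
    backward: Callable[[X, Y], X]  # e.g. update of the prior on new evidence

def compose(outer: "Lens[Y, Z]", inner: "Lens[X, Y]") -> "Lens[X, Z]":
    # Forwards compose left-to-right; backwards thread the observation
    # back through the intermediate stage. This bidirectionality is
    # exactly what the lens pattern packages up.
    return Lens(
        forward=lambda x: outer.forward(inner.forward(x)),
        backward=lambda x, z: inner.backward(
            x, outer.backward(inner.forward(x), z)
        ),
    )
```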
On the 'semantic' side, we present a new formalization of general open
dynamical systems (particularly: deterministic, stochastic, and random; and
discrete- and continuous-time) as certain coalgebras of polynomial functors,
which we show collect into monoidal opindexed categories (or, alternatively,
into algebras for multicategories of generalized polynomial functors). We use
these opindexed categories to define monoidal bicategories of cilia: dynamical
systems which control lenses, and which supply the target for our functorial
semantics. Accordingly, we construct functors which explain the bidirectional
compositional structure of predictive coding neural circuits under the free
energy principle, thereby giving a formal mathematical underpinning to the
bidirectionality observed in the cortex. Along the way, we explain how to
compose rate-coded neural circuits using an algebra for a multicategory of
linear circuit diagrams, showing subsequently that this is subsumed by lenses
and polynomial functors.
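As rough intuition for the coalgebraic formalization, a discrete-time deterministic open system can be sketched as a state space with an output ('expose') map and an input-dependent update map. The sketch below is illustrative only, under the assumption of fixed input and output types; it elides the polynomial-functor, stochastic, and continuous-time structure, and the names OpenSystem, expose, and update are not from the dissertation.

```python
# A minimal sketch of a discrete-time open dynamical system: a state,
# a readout, and an input-dependent transition (a simple coalgebra).
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

S = TypeVar("S")  # state
O = TypeVar("O")  # output
I = TypeVar("I")  # input

@dataclass
class OpenSystem(Generic[S, O, I]):
    expose: Callable[[S], O]     # readout of the current state
    update: Callable[[S, I], S]  # advance the state given an input

def run(system: OpenSystem, state, inputs):
    """Unroll the system along a stream of inputs, collecting outputs."""
    trace = []
    for i in inputs:
        trace.append(system.expose(state))
        state = system.update(state, i)
    return trace, state

# Example: a leaky integrator that exposes its own state.
leaky = OpenSystem(expose=lambda s: s, update=lambda s, i: 0.9 * s + i)
outputs, final = run(leaky, 0.0, [1.0, 0.0, 0.0])
```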
Related papers
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
We introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
Second, we propose to deploy topological densification when fine-tuning relative representations: a topological regularization loss that encourages clustering within classes.
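As a rough sketch of the underlying idea (not this paper's normalization procedure), a relative representation replaces an embedding by its cosine similarities to a set of anchors, which already yields invariance to rotations and isotropic rescaling; all names below are illustrative.

```python
# A minimal sketch of relative representations via cosine similarity
# to anchors. The paper's additional normalization (handling
# non-isotropic rescalings and permutations) is not reproduced here.
import numpy as np

def relative_representation(z: np.ndarray, anchors: np.ndarray) -> np.ndarray:
    """Represent each row of z by its cosine similarities to the anchors."""
    z_n = z / np.linalg.norm(z, axis=-1, keepdims=True)
    a_n = anchors / np.linalg.norm(anchors, axis=-1, keepdims=True)
    return z_n @ a_n.T

rng = np.random.default_rng(0)
emb, anchors = rng.normal(size=(5, 16)), rng.normal(size=(8, 16))
rel = relative_representation(emb, anchors)           # shape (5, 8)
scaled = relative_representation(3.0 * emb, anchors)  # isotropic rescaling
assert np.allclose(rel, scaled)  # cosine similarities are unchanged
```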
arXiv Detail & Related papers (2024-09-17T08:09:22Z)
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
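For intuition, a minimal GCP-style pipeline pools per-image descriptors into a covariance matrix and normalizes it with a matrix function, here the matrix square root. The sketch below is an assumption-laden illustration of that pipeline, not the paper's method.

```python
# A minimal sketch of Global Covariance Pooling with matrix
# square-root normalization, computed by eigendecomposition.
import numpy as np

def gcp_sqrt(features: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """features: (n, d) high-level descriptors from one input.
    Returns the square root of their covariance as the pooled feature."""
    x = features - features.mean(axis=0, keepdims=True)
    cov = x.T @ x / (x.shape[0] - 1) + eps * np.eye(x.shape[1])
    w, v = np.linalg.eigh(cov)  # symmetric PSD: real eigendecomposition
    return (v * np.sqrt(np.clip(w, 0.0, None))) @ v.T

pooled = gcp_sqrt(np.random.default_rng(0).normal(size=(49, 8)))
```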
arXiv Detail & Related papers (2024-07-15T07:11:44Z)
- A Recursive Bateson-Inspired Model for the Generation of Semantic Formal Concepts from Spatial Sensory Data [77.34726150561087]
This paper presents a new symbolic-only method for the generation of hierarchical concept structures from complex sensory data.
The approach is based on Bateson's notion of difference as the key to the genesis of an idea or a concept.
The model is able to produce fairly rich yet human-readable conceptual representations without training.
arXiv Detail & Related papers (2023-07-16T15:59:13Z)
- The Compositional Structure of Bayesian Inference [0.0]
Bayes' rule tells us how to invert a causal process in order to update our beliefs in light of new evidence.
We study the structure of this compositional rule, noting that it relates to the lens pattern in functional programming.
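A tiny discrete example of that inversion, with dictionaries standing in for probability channels; the helper invert is purely illustrative and not from the paper.

```python
# Bayes' rule as channel inversion: from a prior P(x) and a forward
# channel P(y|x), compute the backward channel P(x|y).
def invert(prior: dict, channel: dict) -> dict:
    """prior: {x: P(x)}; channel: {x: {y: P(y|x)}}.
    Returns the posterior channel {y: {x: P(x|y)}}."""
    joint = {(x, y): prior[x] * p
             for x, ys in channel.items() for y, p in ys.items()}
    marginal = {}
    for (x, y), p in joint.items():
        marginal[y] = marginal.get(y, 0.0) + p
    return {y: {x: joint.get((x, y), 0.0) / marginal[y] for x in prior}
            for y in marginal}

posterior = invert(
    {"rain": 0.3, "dry": 0.7},
    {"rain": {"wet": 0.9, "not_wet": 0.1},
     "dry": {"wet": 0.2, "not_wet": 0.8}},
)
# posterior["wet"]["rain"] == 0.27 / (0.27 + 0.14), about 0.659
```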
arXiv Detail & Related papers (2023-05-10T12:57:42Z)
- Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined heuristics.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
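For intuition, canonicalization makes any downstream predictor invariant by first mapping inputs to a canonical form. The sketch below uses a fixed PCA-style heuristic for 2-D point clouds, precisely the kind of predefined rule the paper proposes to replace with a learned network; all names are illustrative.

```python
# A minimal sketch of invariance via canonicalization.
import numpy as np

def canonicalize(points: np.ndarray) -> np.ndarray:
    """Rotate a centred 2-D point cloud into its principal-axis frame.
    A fixed heuristic (with a residual sign ambiguity); the paper's
    point is that a small network can learn this function instead."""
    x = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt.T

def invariant_predict(predictor, points):
    # Any predictor becomes (approximately) rotation-invariant
    # when it only ever sees canonicalized inputs.
    return predictor(canonicalize(points))
```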
arXiv Detail & Related papers (2022-11-11T21:58:15Z)
- Multiparameter Persistent Homology - Generic Structures and Quantum Computing [0.0]
This article is an application of commutative algebra to the study of persistent homology in topological data analysis.
The generic structure of such resolutions and the classifying spaces are studied using results spanning several decades of research.
arXiv Detail & Related papers (2022-10-20T17:30:20Z)
- Compositional Active Inference II: Polynomial Dynamics. Approximate Inference Doctrines [0.0]
We develop the necessary theory of dynamical inference using the language of functors.
We then describe 'externally algebraized' statistical games, and use them to construct two approximate inference doctrines.
The former produces systems which optimize the posteriors of Gaussian models, while the latter produces systems which additionally optimize the parameters (or 'weights') which determine their predictions.
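As a rough illustration of a system that optimizes the posterior of a Gaussian model, one can gradient-descend a quadratic free energy in one dimension, predictive-coding style. The code below is a generic sketch under stated assumptions (identity prediction map, unit-style variances), not the paper's doctrine.

```python
# Gradient descent on F(mu) = (y - mu)^2/(2*sy) + (mu - m0)^2/(2*s0),
# the negative log joint of a 1-D Gaussian model with identity prediction.
def free_energy_step(mu, y, m0, s0=1.0, sy=1.0, lr=0.1):
    grad = -(y - mu) / sy + (mu - m0) / s0  # sensory + prior prediction errors
    return mu - lr * grad

mu = 0.0
for _ in range(200):
    mu = free_energy_step(mu, y=2.0, m0=0.0)
# mu converges to the posterior mean (s0*y + sy*m0)/(s0 + sy) = 1.0
```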
arXiv Detail & Related papers (2022-08-25T15:58:33Z)
- The Many-Worlds Calculus [0.0]
We propose a colored PROP to model computation in this framework.
The model can support regular tests, probabilistic and non-deterministic branching, as well as quantum branching.
We prove the language to be universal, and the equational theory to be complete with respect to this semantics.
arXiv Detail & Related papers (2022-06-21T10:10:26Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
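The homomorphism idea can be illustrated with a toy interpreter that maps each syntactic operation to a semantic one, so that meaning composes exactly as syntax does; this is illustrative only, whereas LeAR learns such a mapping end-to-end.

```python
# A toy homomorphism from a syntactic algebra (expression trees)
# to a semantic algebra (numbers with arithmetic operations).
SEMANTICS = {"plus": lambda a, b: a + b, "times": lambda a, b: a * b}

def interpret(tree):
    """tree: a number (leaf) or a triple (op, left, right)."""
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    # Interpret subtrees first, then apply the semantic operation:
    # the structure of the meaning mirrors the structure of the syntax.
    return SEMANTICS[op](interpret(left), interpret(right))

assert interpret(("plus", 2, ("times", 3, 4))) == 14
```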
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- Categories of Brègman operations and epistemic (co)monads [0.0]
We construct a categorical framework for nonlinear postquantum inference, with embeddings of convex closed sets of suitable reflexive Banach spaces as objects.
It provides a nonlinear convex analytic analogue of Čencov's programme of study of categories of linear positive maps between spaces of states.
We show that the Brègmanian approach provides some special cases of this setting.
arXiv Detail & Related papers (2021-03-13T23:10:29Z)
- Finite-Function-Encoding Quantum States [52.77024349608834]
We introduce finite-function-encoding (FFE) states which encode arbitrary $d$-valued logic functions.
We investigate some of their structural properties.
arXiv Detail & Related papers (2020-12-01T13:53:23Z)