Categorical Stochastic Processes and Likelihood
- URL: http://arxiv.org/abs/2005.04735v5
- Date: Sun, 9 Jan 2022 13:05:22 GMT
- Title: Categorical Stochastic Processes and Likelihood
- Authors: Dan Shiebler
- Abstract summary: We take a Category Theoretic perspective on the relationship between probabilistic modeling and function approximation.
We define two extensions of function composition to stochastic process subordination and show how they relate to the category Stoch and other Markov Categories.
We conclude with a demonstration of how the Maximum Likelihood Estimation procedure defines an identity-on-objects functor from the category of statistical models to the category of learners.
- Score: 1.14219428942199
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work we take a Category Theoretic perspective on the relationship
between probabilistic modeling and function approximation. We begin by defining
two extensions of function composition to stochastic process subordination: one
based on the co-Kleisli category under the comonad (Omega x -) and one based on
the parameterization of a category with a Lawvere theory. We show how these
extensions relate to the category Stoch and other Markov Categories. Next, we
apply the Para construction to extend stochastic processes to parameterized
statistical models and we define a way to compose the likelihood functions of
these models. We conclude with a demonstration of how the Maximum Likelihood
Estimation procedure defines an identity-on-objects functor from the category
of statistical models to the category of Learners. Code to accompany this paper
can be found at
https://github.com/dshieble/Categorical_Stochastic_Processes_and_Likelihood
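To make the two constructions in the abstract more concrete, here is a minimal, illustrative sketch in Python with NumPy. It is not the paper's implementation (that code lives in the linked repository), and the names CoKleisli, gaussian_log_likelihood, and mle_update are hypothetical choices for this sketch. The first part composes two noisy maps of type (omega, x) -> y by feeding both the same noise sample, loosely mirroring co-Kleisli composition for the comonad (Omega x -); the second part treats a fixed-variance Gaussian as a parameterized statistical model and reads off its closed-form maximum likelihood estimate as a learner-style update step.

```python
import numpy as np


class CoKleisli:
    """A stochastic process viewed as a deterministic map (omega, x) -> y."""

    def __init__(self, fn):
        self.fn = fn  # fn takes (omega, x) and returns y

    def __call__(self, omega, x):
        return self.fn(omega, x)

    def then(self, other):
        # (other . self)(omega, x) = other(omega, self(omega, x)):
        # both arrows read the SAME noise sample omega, which is the
        # behaviour co-Kleisli composition for (Omega x -) captures.
        return CoKleisli(lambda omega, x: other(omega, self(omega, x)))


# Two noisy maps driven by a shared noise sample omega = (omega_1, omega_2).
add_noise = CoKleisli(lambda omega, x: x + omega[0])
scale_and_jitter = CoKleisli(lambda omega, x: 2.0 * x + omega[1])
pipeline = add_noise.then(scale_and_jitter)

rng = np.random.default_rng(0)
omega = rng.normal(size=2)   # one draw from the sample space Omega
print(pipeline(omega, 1.0))  # one realization of the composite process


# A Para-style parameterized model: parameter theta = (mu,), data xs,
# with the variance held fixed so the MLE has a closed form.
def gaussian_log_likelihood(theta, xs, sigma=1.0):
    mu = theta[0]
    n = len(xs)
    return (-0.5 * np.sum((xs - mu) ** 2) / sigma ** 2
            - n * np.log(sigma * np.sqrt(2.0 * np.pi)))


def mle_update(theta, xs):
    # The "learner" side of the sketch: for this model the update ignores
    # the old parameter and jumps straight to the sample mean, which
    # maximizes the likelihood above.
    return (float(np.mean(xs)),)


xs = rng.normal(loc=3.0, scale=1.0, size=100)
theta0 = (0.0,)
theta1 = mle_update(theta0, xs)
print(theta1)
print(gaussian_log_likelihood(theta1, xs) >= gaussian_log_likelihood(theta0, xs))  # True
```

The design point of the sketch is that composition reuses a single noise sample omega rather than drawing fresh randomness for each arrow; how this relates to independent sampling in Stoch and to the Para construction is worked out precisely in the paper itself.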
Related papers
- Embedding-based statistical inference on generative models [10.948308354932639]
We extend results related to embedding-based representations of generative models to classical statistical inference settings.
We demonstrate that using the perspective space as the basis for a notion of similarity is effective for multiple model-level inference tasks.
arXiv Detail & Related papers (2024-10-01T22:28:39Z)
- Explaining Datasets in Words: Statistical Models with Natural Language Parameters [66.69456696878842]
We introduce a family of statistical models -- including clustering, time series, and classification models -- parameterized by natural language predicates.
We apply our framework to a wide range of problems: taxonomizing user chat dialogues, characterizing how they evolve over time, and finding categories where one language model outperforms another.
arXiv Detail & Related papers (2024-09-13T01:40:20Z)
- String Diagrams with Factorized Densities [0.0]
Both probabilistic programs and causal models define a joint probability density over a set of random variables.
This work builds on prior work on Markov categories of probabilistic mappings to define a category whose morphisms combine a joint density, factorized over each sample space, with a deterministic mapping from samples to return values.
arXiv Detail & Related papers (2023-05-04T02:30:44Z)
- Mathematical Foundations for a Compositional Account of the Bayesian Brain [0.0]
We use the tools of contemporary applied category theory to supply functorial semantics for approximate inference.
We define fibrations of statistical games and classify various problems of statistical inference as corresponding sections.
We construct functors which explain the compositional structure of predictive coding neural circuits under the free energy principle.
arXiv Detail & Related papers (2022-12-23T18:58:17Z)
- Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined canonicalization functions.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
arXiv Detail & Related papers (2022-11-11T21:58:15Z)
- Language Model Cascades [72.18809575261498]
Repeated interactions at test-time with a single model, or the composition of multiple models together, further expands capabilities.
Cases with control flow and dynamic structure require techniques from probabilistic programming.
We formalize several existing techniques from this perspective, including scratchpads / chain of thought, verifiers, STaR, selection-inference, and tool use.
arXiv Detail & Related papers (2022-07-21T07:35:18Z)
- A Probabilistic Generative Model of Free Categories [1.7679374058425343]
This paper defines a probabilistic generative model of morphisms in free monoidal categories over domain-specific generating objects and morphisms.
Acyclic wiring diagrams can model specifications for morphisms, which the model can use to generate morphisms.
A concrete experiment shows that the free category prior achieves competitive reconstruction performance on the Omniglot dataset.
arXiv Detail & Related papers (2022-05-09T20:35:08Z)
- Discovering Relationships between Object Categories via Universal Canonical Maps [80.07703460198198]
We tackle the problem of learning the geometry of multiple categories of deformable objects jointly.
Recent work has shown that it is possible to learn a unified dense pose predictor for several categories of related objects.
We show that improved correspondences can be learned automatically as a natural byproduct of learning category-specific dense pose predictors.
arXiv Detail & Related papers (2021-06-17T18:38:18Z)
- Argmax Flows and Multinomial Diffusion: Towards Non-Autoregressive Language Models [76.22217735434661]
This paper introduces two new classes of generative models for categorical data: Argmax Flows and Multinomial Diffusion.
We demonstrate that our models perform competitively on language modelling and modelling of image segmentation maps.
arXiv Detail & Related papers (2021-02-10T11:04:17Z)
- Pairwise Supervision Can Provably Elicit a Decision Boundary [84.58020117487898]
Similarity learning is the problem of eliciting useful representations by predicting the relationship between a pair of patterns.
We show that similarity learning is capable of solving binary classification by directly eliciting a decision boundary.
arXiv Detail & Related papers (2020-06-11T05:35:16Z)
- An Epistemic Approach to the Formal Specification of Statistical Machine Learning [1.599072005190786]
We introduce a formal model for supervised learning based on a Kripke model.
We then formalize various notions of the classification performance, robustness, and fairness of statistical classifiers.
arXiv Detail & Related papers (2020-04-27T12:16:45Z)