A Concise Mathematical Description of Active Inference in Discrete Time
- URL: http://arxiv.org/abs/2406.07726v2
- Date: Wed, 25 Sep 2024 17:59:18 GMT
- Title: A Concise Mathematical Description of Active Inference in Discrete Time
- Authors: Jesse van Oostrum, Carlotta Langer, Nihat Ay
- Abstract summary: The main part of the paper serves as a basic introduction to the topic, including a detailed example illustrating the theory on action selection.
In the appendix the more subtle mathematical details are discussed.
This part is aimed at readers who have already studied the active inference literature but struggle to make sense of the mathematical details and derivations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we present a concise mathematical description of active inference in discrete time. The main part of the paper serves as a basic introduction to the topic, including a detailed example illustrating the theory on action selection. In the appendix the more subtle mathematical details are discussed. This part is aimed at readers who have already studied the active inference literature but struggle to make sense of the mathematical details and derivations. Throughout the whole manuscript, special attention has been paid to adopting notation that is both precise and in line with standard mathematical texts. All equations and derivations are linked to specific equation numbers in other popular texts on the topic. Furthermore, Python code is provided that implements the action selection mechanism described in this paper and is compatible with pymdp environments.
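The paper's released pymdp-compatible implementation is not reproduced here; the following is a minimal NumPy sketch of the generic discrete-time action selection mechanism the abstract refers to: infer a posterior over hidden states, score each action by a one-step expected free energy (risk plus ambiguity), and pass the negated scores through a softmax. The matrices A, B, C, D follow standard active inference conventions, but the toy dimensions, the precision value of 16, and all helper names are illustrative assumptions rather than the authors' code.
```python
import numpy as np

def softmax(x):
    """Normalised exponential; subtracting the max keeps it numerically stable."""
    e = np.exp(x - x.max())
    return e / e.sum()

def infer_states(obs_idx, A, prior):
    """Exact Bayesian posterior over hidden states given one observation index."""
    likelihood = A[obs_idx, :]                  # P(o = obs_idx | s) for every state s
    posterior = likelihood * prior
    return posterior / posterior.sum()

def expected_free_energy(qs, A, B, log_C):
    """One-step expected free energy G(u) = risk + ambiguity for each action u."""
    num_actions = B.shape[2]
    G = np.zeros(num_actions)
    H_A = -(A * np.log(A + 1e-16)).sum(axis=0)  # entropy of P(o | s) for each state
    for u in range(num_actions):
        qs_next = B[:, :, u] @ qs               # predicted state distribution under u
        qo_next = A @ qs_next                   # predicted outcome distribution under u
        risk = qo_next @ (np.log(qo_next + 1e-16) - log_C)  # KL[q(o|u) || C]
        ambiguity = qs_next @ H_A               # expected outcome entropy
        G[u] = risk + ambiguity
    return G

# Toy generative model (sizes are assumptions): 3 states, 3 outcomes, 2 actions.
rng = np.random.default_rng(0)
A = rng.dirichlet(np.ones(3), size=3).T             # likelihood P(o | s); columns sum to 1
B = np.stack([np.eye(3), np.roll(np.eye(3), 1, axis=0)], axis=2)  # transitions P(s' | s, u)
log_C = np.log(softmax(np.array([2.0, 0.0, -2.0]))) # log preferences over outcomes
D = np.ones(3) / 3                                  # prior over initial states

qs = infer_states(1, A, D)                          # observe outcome index 1
G = expected_free_energy(qs, A, B, log_C)
q_u = softmax(-16.0 * G)                            # action posterior, precision gamma = 16
action = rng.choice(len(q_u), p=q_u)                # sample an action
print("G per action:", G, "q(u):", q_u, "sampled action:", action)
```
For multi-step policies the same risk and ambiguity terms are accumulated over the policy horizon before the softmax; the paper and the pymdp package handle that general case.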
Related papers
- The Weak Form Is Stronger Than You Think [0.0]
The weak form is a well-studied, widely-utilized mathematical tool.
Recent advances in weak form versions of equation learning, parameter estimation, and coarse graining offer surprising noise robustness, accuracy, and computational efficiency.
arXiv Detail & Related papers (2024-09-10T13:59:17Z)
- Machine learning and information theory concepts towards an AI Mathematician [77.63761356203105]
The current state-of-the-art in artificial intelligence is impressive, especially in terms of mastery of language, but not so much in terms of mathematical reasoning.
This essay builds on the idea that current deep learning mostly succeeds at system 1 abilities.
It takes an information-theoretical posture to ask questions about what constitutes an interesting mathematical statement.
arXiv Detail & Related papers (2024-03-07T15:12:06Z)
- Inference via Interpolation: Contrastive Representations Provably Enable Planning and Inference [110.47649327040392]
Given time series data, how can we answer questions like "what will happen in the future?" and "how did we get here?"
We show how these questions can have compact, closed form solutions in terms of learned representations.
arXiv Detail & Related papers (2024-03-06T22:27:30Z)
- Abstraction boundaries and spec driven development in pure mathematics [0.0]
In this article we discuss how abstraction boundaries can help tame complexity in mathematical research.
We argue that the use of an interactive theorem prover introduces additional qualitative benefits in the implementation of these ideas.
arXiv Detail & Related papers (2023-09-26T11:59:32Z)
- Towards a Holistic Understanding of Mathematical Questions with Contrastive Pre-training [65.10741459705739]
We propose a novel contrastive pre-training approach for mathematical question representations, namely QuesCo.
We first design two-level question augmentations, including content-level and structure-level, which generate literally diverse question pairs with similar purposes.
Then, to fully exploit hierarchical information of knowledge concepts, we propose a knowledge hierarchy-aware rank strategy.
arXiv Detail & Related papers (2023-01-18T14:23:29Z)
- Spectral theorem for dummies: A pedagogical discussion on quantum probability and random variable theory [0.0]
John von Neumann's spectral theorem for self-adjoint operators is a cornerstone of quantum mechanics.
This paper presents a plain-spoken formulation of this theorem in terms of Dirac's bra and ket notation.
arXiv Detail & Related papers (2022-11-23T07:05:47Z)
- Semantic Representations of Mathematical Expressions in a Continuous Vector Space [0.0]
This work describes an approach for representing mathematical expressions in a continuous vector space.
We use the encoder of a sequence-to-sequence architecture, trained on visually different but mathematically equivalent expressions, to generate vector representations.
arXiv Detail & Related papers (2022-10-08T22:33:39Z)
- JiuZhang: A Chinese Pre-trained Language Model for Mathematical Problem Understanding [74.12405417718054]
This paper aims to advance the mathematical intelligence of machines by presenting the first Chinese mathematical pre-trained language model (PLM).
Unlike the texts in standard NLP tasks, mathematical texts are difficult to understand, since they involve mathematical terminology, symbols, and formulas in the problem statement.
We design a novel curriculum pre-training approach for improving the learning of mathematical PLMs, consisting of both basic and advanced courses.
arXiv Detail & Related papers (2022-06-13T17:03:52Z)
- Self-adjoint extension schemes and modern applications to quantum Hamiltonians [55.2480439325792]
This monograph contains revised and enlarged material from lecture notes of undergraduate and graduate courses and seminars delivered by both authors in recent years, on a subject that is central both in abstract operator theory and in applications to quantum mechanics.
A number of models are discussed which are today receiving new or renewed interest in mathematical physics, in particular from the point of view of realising certain operators of interest self-adjointly.
arXiv Detail & Related papers (2022-01-25T09:45:16Z)
- Fact-driven Logical Reasoning for Machine Reading Comprehension [82.58857437343974]
We are motivated to cover both commonsense and temporary knowledge clues hierarchically.
Specifically, we propose a general formalism of knowledge units by extracting backbone constituents of the sentence.
We then construct a supergraph on top of the fact units, allowing for the benefit of sentence-level (relations among fact groups) and entity-level interactions.
arXiv Detail & Related papers (2021-05-21T13:11:13Z)
- Natural Language Premise Selection: Finding Supporting Statements for Mathematical Text [3.42658286826597]
We propose a new NLP task, natural premise selection, which is used to retrieve supporting definitions and propositions for a given mathematical statement.
We also make available a dataset, NL-PS, which can be used to evaluate different approaches for the natural premise selection task.
arXiv Detail & Related papers (2020-04-30T17:08:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.