Universal Learning of Stochastic Dynamics for Exact Belief Propagation using Bernstein Normalizing Flows
- URL: http://arxiv.org/abs/2509.15533v1
- Date: Fri, 19 Sep 2025 02:36:35 GMT
- Title: Universal Learning of Stochastic Dynamics for Exact Belief Propagation using Bernstein Normalizing Flows
- Authors: Peter Amorese, Morteza Lahijanian
- Abstract summary: This paper establishes the theoretical foundations for a class of models that satisfy both properties. The proposed approach combines the expressiveness of normalizing flows for density estimation with the analytical tractability of Bernstein polynomials.
- Score: 10.143540866021542
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Predicting the distribution of future states in a stochastic system, known as belief propagation, is fundamental to reasoning under uncertainty. However, nonlinear dynamics often make analytical belief propagation intractable, requiring approximate methods. When the system model is unknown and must be learned from data, a key question arises: can we learn a model that (i) universally approximates general nonlinear stochastic dynamics, and (ii) supports analytical belief propagation? This paper establishes the theoretical foundations for a class of models that satisfy both properties. The proposed approach combines the expressiveness of normalizing flows for density estimation with the analytical tractability of Bernstein polynomials. Empirical results show the efficacy of our learned model over state-of-the-art data-driven methods for belief propagation, especially for highly non-linear systems with non-additive, non-Gaussian noise.
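The core idea named in the abstract, using Bernstein polynomials as analytically tractable flow transforms, can be illustrated with a minimal sketch. This is not the authors' implementation; the functions, the coefficient vector `theta`, and the uniform base density are illustrative assumptions. A Bernstein polynomial with nondecreasing coefficients on [0, 1] is a monotone (hence invertible) map, so it can serve as a one-dimensional normalizing-flow transform whose derivative, and therefore the change-of-variables density, is available in closed form:

```python
import math

def bernstein_basis(n, k, x):
    """Bernstein basis polynomial B_{k,n}(x) = C(n,k) x^k (1-x)^(n-k)."""
    return math.comb(n, k) * x**k * (1 - x)**(n - k)

def bernstein_map(theta, x):
    """Flow transform T(x) = sum_k theta_k B_{k,n}(x) on [0, 1].
    T is nondecreasing whenever the coefficients theta are nondecreasing,
    so the map is invertible on [theta[0], theta[-1]]."""
    n = len(theta) - 1
    return sum(t * bernstein_basis(n, k, x) for k, t in enumerate(theta))

def bernstein_map_deriv(theta, x):
    """Closed-form derivative: T'(x) = n * sum_k (theta_{k+1} - theta_k) B_{k,n-1}(x)."""
    n = len(theta) - 1
    return n * sum((theta[k + 1] - theta[k]) * bernstein_basis(n - 1, k, x)
                   for k in range(n))

def log_density_after_flow(theta, x, log_p_in):
    """Change of variables: if y = T(x) and x ~ p_X, then
    log p_Y(T(x)) = log p_X(x) - log T'(x), with no numerical quadrature."""
    return log_p_in(x) - math.log(bernstein_map_deriv(theta, x))

if __name__ == "__main__":
    theta = [0.0, 0.2, 0.9, 1.0]          # nondecreasing -> monotone map
    log_uniform = lambda x: 0.0            # base density: Uniform(0, 1)
    for x in (0.1, 0.5, 0.9):
        print(x, bernstein_map(theta, x), log_density_after_flow(theta, x, log_uniform))
```

Because T and T' are polynomials in the Bernstein basis, compositions and expectations stay within a polynomial family, which is the kind of closed-form structure the paper exploits for analytical belief propagation; the sketch above only shows the one-step change of variables.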
Related papers
- Latent-Variable Learning of SPDEs via Wiener Chaos [2.0901018134712297]
We study the problem of learning the law of stochastic partial differential equations (SPDEs) with additive Gaussian forcing from observations. Our approach combines a spectral Galerkin projection with a truncated Wiener chaos expansion, yielding a separation between evolution and forcing domains. This reduces the infinite-dimensional SPDE to a finite system of parametrized ordinary differential equations governing latent temporal dynamics.
arXiv Detail & Related papers (2026-02-12T10:19:43Z) - Overcoming Dimensional Factorization Limits in Discrete Diffusion Models through Quantum Joint Distribution Learning [79.65014491424151]
We propose a quantum Discrete Denoising Diffusion Probabilistic Model (QD3PM) that enables joint probability learning through diffusion and denoising in exponentially large Hilbert spaces. This paper establishes a new theoretical paradigm in generative models by leveraging the quantum advantage in joint distribution learning.
arXiv Detail & Related papers (2025-05-08T11:48:21Z) - Dynamics of Open Quantum Systems with Initial System-Environment Correlations via Stochastic Unravelings [0.0]
In open quantum systems, the reduced dynamics is typically described starting from the assumption that the system and the environment are initially uncorrelated. For the uncorrelated scenario, stochastic unravelings are a powerful tool to simulate the dynamics, but so far they have not been used in the most general case, in which correlations are initially present. In our work, we employ the bath positive (B+), or one-sided positive decomposition, formalism as a starting point to generalize unravelings in the presence of initial correlations.
arXiv Detail & Related papers (2025-02-18T12:26:32Z) - Bayesian Inference for Consistent Predictions in Overparameterized Nonlinear Regression [0.0]
This study explores the predictive properties of overparameterized nonlinear regression within the Bayesian framework.
Posterior contraction is established for generalized linear and single-neuron models with Lipschitz continuous activation functions.
The proposed method was validated via numerical simulations and a real data application.
arXiv Detail & Related papers (2024-04-06T04:22:48Z) - Gaussian process learning of nonlinear dynamics [0.0]
We propose a new method that learns nonlinear dynamics through a Bayesian inference of characterizing model parameters.
We will discuss the applicability of the proposed method to several typical scenarios for dynamical systems.
arXiv Detail & Related papers (2023-12-19T14:27:26Z) - Causal Modeling with Stationary Diffusions [89.94899196106223]
We learn differential equations whose stationary densities model a system's behavior under interventions.
We show that they generalize to unseen interventions on their variables, often better than classical approaches.
Our inference method is based on a new theoretical result that expresses a stationarity condition on the diffusion's generator in a reproducing kernel Hilbert space.
arXiv Detail & Related papers (2023-10-26T14:01:17Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Nonlinear Independent Component Analysis for Continuous-Time Signals [85.59763606620938]
We study the classical problem of recovering a multidimensional source process from observations of mixtures of this process.
We show that this recovery is possible for many popular models of processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function.
arXiv Detail & Related papers (2021-02-04T20:28:44Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the role of stochasticity in its success is still unclear. We show that heavy-tailed behavior commonly arises in the parameters due to multiplicative noise in discrete-time updates. A detailed analysis is conducted describing key factors, including step size and data, and similar results are exhibited on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - Bayesian differential programming for robust systems identification under uncertainty [14.169588600819546]
This paper presents a machine learning framework for Bayesian systems identification from noisy, sparse and irregular observations of nonlinear dynamical systems.
The proposed method takes advantage of recent developments in differentiable programming to propagate gradient information through ordinary differential equation solvers.
The use of sparsity-promoting priors enables the discovery of interpretable and parsimonious representations for the underlying latent dynamics.
arXiv Detail & Related papers (2020-04-15T00:51:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.