Learning minimal representations of stochastic processes with
variational autoencoders
- URL: http://arxiv.org/abs/2307.11608v2
- Date: Fri, 4 Aug 2023 12:40:59 GMT
- Title: Learning minimal representations of stochastic processes with
variational autoencoders
- Authors: Gabriel Fernández-Fernández, Carlo Manzo, Maciej Lewenstein,
Alexandre Dauphin, Gorka Muñoz-Gil
- Abstract summary: We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing such processes.
- Score: 52.99137594502433
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stochastic processes have found numerous applications in science, as they are
broadly used to model a variety of natural phenomena. Due to their intrinsic
randomness and uncertainty, they are however difficult to characterize. Here,
we introduce an unsupervised machine learning approach to determine the minimal
set of parameters required to effectively describe the dynamics of a stochastic
process. Our method builds upon an extended $\beta$-variational autoencoder
architecture. By means of simulated datasets corresponding to paradigmatic
diffusion models, we showcase its effectiveness in extracting the minimal
relevant parameters that accurately describe these dynamics. Furthermore, the
method enables the generation of new trajectories that faithfully replicate the
expected stochastic behavior. Overall, our approach enables the autonomous
discovery of unknown parameters describing stochastic processes, hence
enhancing our comprehension of complex phenomena across various fields.
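The abstract describes the architecture only at a high level. As a rough illustration of the core idea, the sketch below applies a generic β-VAE to fixed-length, one-dimensional trajectories: the β-weighted KL term pressures the encoder to use as few informative latent dimensions as possible, and inspecting which latent posteriors have not collapsed to the prior indicates the minimal set of parameters. The layer sizes, latent dimension, value of β, and the Brownian-motion toy data are illustrative assumptions, not the extended architecture or datasets used in the paper.

```python
# Minimal beta-VAE sketch for 1D stochastic trajectories (PyTorch).
# Architecture details (layer sizes, latent dimension, beta) are illustrative
# assumptions, not the exact configuration used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TrajectoryBetaVAE(nn.Module):
    def __init__(self, traj_len=200, latent_dim=4, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(traj_len, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.mu = nn.Linear(hidden, latent_dim)      # posterior mean
        self.logvar = nn.Linear(hidden, latent_dim)  # posterior log-variance
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, traj_len),
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
        return self.decoder(z), mu, logvar

def beta_vae_loss(x, x_hat, mu, logvar, beta=4.0):
    # Reconstruction term plus a beta-weighted KL divergence; a large beta
    # pushes uninformative latent dimensions to collapse to the prior,
    # leaving only the dimensions needed to describe the dynamics.
    recon = F.mse_loss(x_hat, x, reduction="sum") / x.shape[0]
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / x.shape[0]
    return recon + beta * kl

# Example training step on toy data: 64 Brownian-motion trajectories stand in
# for the simulated diffusion datasets used in the paper.
model = TrajectoryBetaVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.cumsum(torch.randn(64, 200), dim=1)
opt.zero_grad()
x_hat, mu, logvar = model(x)
loss = beta_vae_loss(x, x_hat, mu, logvar)
loss.backward()
opt.step()
```

Sampling the trained decoder at chosen latent values would then generate new trajectories, analogous to the generative use of the model described in the abstract.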
Related papers
- Towards Learning Stochastic Population Models by Gradient Descent [0.0]
We show that simultaneous estimation of parameters and structure poses major challenges for optimization procedures.
We demonstrate accurate estimation of models but find that enforcing the inference of parsimonious, interpretable models drastically increases the difficulty.
arXiv Detail & Related papers (2024-04-10T14:38:58Z)
- Stochastic parameter reduced-order model based on hybrid machine learning approaches [4.378407481656902]
This paper constructs a Convolutional Autoencoder-Reservoir Computing-Normalizing Flow algorithm framework.
The framework is used to characterize the evolution of latent state variables.
In this way, a data-driven reduced-order model is constructed to describe the complex system and its dynamic behavior.
arXiv Detail & Related papers (2024-03-24T06:52:37Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
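As a hedged sketch of the general surrogate-plus-autodiff pattern this summary describes (not the paper's actual framework), the snippet below freezes a small neural network standing in for a model pre-trained on simulated Hamiltonian data, then back-propagates through it to fit unknown parameters to measured data. The network shape, parameter count, and data are placeholders.

```python
# Generic surrogate + autodiff sketch: a frozen neural network mimics a
# simulator, and gradient descent through it recovers unknown parameters.
# All names, shapes, and data below are illustrative placeholders.
import torch
import torch.nn as nn

surrogate = nn.Sequential(          # stands in for a network pre-trained on
    nn.Linear(2, 64), nn.Tanh(),    # simulated data from a model Hamiltonian
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 100),             # predicted scattering signal
)
for p in surrogate.parameters():    # the surrogate itself stays fixed
    p.requires_grad_(False)

measured = torch.randn(100)                             # placeholder "experimental" data
params = torch.tensor([0.5, 0.5], requires_grad=True)   # unknown model parameters
opt = torch.optim.Adam([params], lr=1e-2)

for _ in range(500):
    opt.zero_grad()
    loss = torch.mean((surrogate(params) - measured) ** 2)
    loss.backward()                  # gradients flow through the frozen surrogate
    opt.step()
```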
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Latent Variable Representation for Reinforcement Learning [131.03944557979725]
It remains unclear theoretically and empirically how latent variable models may facilitate learning, planning, and exploration to improve the sample efficiency of model-based reinforcement learning.
We provide a representation view of the latent variable models for state-action value functions, which allows both tractable variational learning algorithm and effective implementation of the optimism/pessimism principle.
In particular, we propose a computationally efficient planning algorithm with UCB exploration by incorporating kernel embeddings of latent variable models.
arXiv Detail & Related papers (2022-12-17T00:26:31Z)
- A Causality-Based Learning Approach for Discovering the Underlying Dynamics of Complex Systems from Partial Observations with Stochastic Parameterization [1.2882319878552302]
This paper develops a new iterative learning algorithm for complex turbulent systems with partial observations.
It alternates between identifying model structures, recovering unobserved variables, and estimating parameters.
Numerical experiments show that the new algorithm succeeds in identifying the model structure and providing suitable parameterizations for many complex nonlinear systems.
arXiv Detail & Related papers (2022-08-19T00:35:03Z)
- MACE: An Efficient Model-Agnostic Framework for Counterfactual Explanation [132.77005365032468]
We propose a novel framework for Model-Agnostic Counterfactual Explanation (MACE).
In our MACE approach, we propose an RL-based method for finding good counterfactual examples and a gradient-less descent method for improving proximity.
Experiments on public datasets validate its effectiveness, with better validity, sparsity, and proximity.
arXiv Detail & Related papers (2022-05-31T04:57:06Z)
- On Contrastive Representations of Stochastic Processes [53.21653429290478]
Learning representations of stochastic processes is an emerging problem in machine learning.
We show that our methods are effective for learning representations of periodic functions, 3D objects and dynamical processes.
arXiv Detail & Related papers (2021-06-18T11:00:24Z)
- ImitationFlow: Learning Deep Stable Stochastic Dynamic Systems by Normalizing Flows [29.310742141970394]
We introduce ImitationFlow, a novel deep generative model for learning complex, globally stable, nonlinear stochastic dynamics.
We show the effectiveness of our method with both standard datasets and a real robot experiment.
arXiv Detail & Related papers (2020-10-25T14:49:46Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the role its stochasticity plays in that success is still unclear.
We show that heavy tails commonly arise in the parameters as a consequence of multiplicative noise.
A detailed analysis of key factors, including step size and data, finds that all exhibit similar behavior on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- Automatic Differentiation and Continuous Sensitivity Analysis of Rigid Body Dynamics [15.565726546970678]
We introduce a differentiable physics simulator for rigid body dynamics.
In the context of trajectory optimization, we introduce a closed-loop model-predictive control algorithm.
arXiv Detail & Related papers (2020-01-22T03:54:00Z)