Model-Free Stochastic Process Modeling and Optimization using Normalizing Flows
- URL: http://arxiv.org/abs/2409.17632v1
- Date: Thu, 26 Sep 2024 08:28:14 GMT
- Title: Model-Free Stochastic Process Modeling and Optimization using Normalizing Flows
- Authors: Eike Cramer
- Abstract summary: This work proposes using conditional normalizing flows as discrete-time models to learn the dynamics of chemical processes.
The normalizing flow yields stable simulations over long time horizons and high-quality results in stochastic and probabilistic MPC formulations for open-loop control.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Real-world chemical processes often exhibit stochastic dynamics with non-trivial correlations and state-dependent fluctuations. However, most process models simply add stationary noise terms to a deterministic prediction, which can lead to inaccurate predictions. This work proposes using conditional normalizing flows as discrete-time models (DTMs) to learn the stochastic dynamics of chemical processes. Normalizing flows learn an explicit expression of the system states' probability density function (PDF) given prior states and control inputs. The resulting model naturally allows for formulating stochastic and probabilistic setpoint-tracking objectives and chance constraints. In applications to a continuous reactor and a reactor cascade, the normalizing flow yields stable simulations over long time horizons and high-quality results in stochastic and probabilistic MPC formulations for open-loop control. Furthermore, a chance-constrained optimization finds reliable startup controls for the reactor cascade with stochastic reactions. In conclusion, the conditional normalizing flow presents an excellent choice for modeling nonlinear stochastic dynamics.
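As a rough illustration of the modeling idea, the sketch below implements a one-dimensional conditional flow reduced to a single affine layer: the hand-picked conditioner cond_params stands in for the paper's learned networks, and the change-of-variables formula gives the explicit one-step density p(x_{k+1} | x_k, u_k).

```python
import numpy as np

# Sketch of a conditional normalizing flow as a discrete-time model
# p(x_{k+1} | x_k, u_k), reduced to a single affine layer. cond_params
# is a hand-picked stand-in for the learned conditioner networks.

def cond_params(x_prev, u):
    mu = 0.9 * x_prev + 0.3 * u          # state/control-dependent mean
    sigma = 0.1 + 0.05 * abs(x_prev)     # state-dependent noise level
    return mu, sigma

def log_prob(x_next, x_prev, u):
    # Change of variables: z = (x - mu)/sigma, log|dz/dx| = -log(sigma).
    mu, sigma = cond_params(x_prev, u)
    z = (x_next - mu) / sigma
    return -0.5 * (z**2 + np.log(2 * np.pi)) - np.log(sigma)

def sample(x_prev, u, rng):
    # Invert the flow: push base noise z ~ N(0, 1) through it.
    mu, sigma = cond_params(x_prev, u)
    return mu + sigma * rng.standard_normal()

rng = np.random.default_rng(0)
x_next = sample(1.0, 0.5, rng)
print(x_next, log_prob(x_next, 1.0, 0.5))
```

Because the density is explicit, setpoint-tracking objectives and chance constraints can be written directly in terms of log_prob and samples of the predicted state.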
Related papers
- Asymptotically Optimal Change Detection for Unnormalized Pre- and Post-Change Distributions [65.38208224389027]
This paper addresses the problem of detecting changes when only unnormalized pre- and post-change distributions are accessible.
Our approach is based on estimating the Cumulative Sum (CUSUM) statistic, which is known to produce optimal performance.
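A minimal CUSUM sketch under simplifying assumptions (known unit-variance Gaussian pre- and post-change densities, whereas the paper estimates the ratio when only unnormalized densities are available):

```python
import numpy as np

# CUSUM sketch: S_t = max(0, S_{t-1} + l_t) with l_t the log-likelihood
# ratio of post- vs. pre-change densities. Known unit-variance Gaussians
# are assumed here for brevity.

def loglik_ratio(x, mu0=0.0, mu1=1.0):
    return ((x - mu0)**2 - (x - mu1)**2) / 2.0

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 200),   # pre-change
                       rng.normal(1.0, 1.0, 200)])  # post-change

S, threshold = 0.0, 10.0
for t, x in enumerate(data):
    S = max(0.0, S + loglik_ratio(x))
    if S > threshold:
        print("change declared at t =", t)  # true change point: t = 200
        break
```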
arXiv Detail & Related papers (2024-10-18T17:13:29Z)
- Time-changed normalizing flows for accurate SDE modeling [5.402030962296633]
We propose a novel transformation of dynamic normalizing flows, based on time deformation of a Brownian motion.
This approach enables us to effectively model some SDEs that cannot be modeled otherwise.
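A toy rendering of the time-deformation idea, with a hand-picked monotone clock tau standing in for the learned deformation:

```python
import numpy as np

# Time-changed Brownian motion B(tau(t)): increments over a grid are
# Gaussian with variance tau(t_{k+1}) - tau(t_k) for a monotone clock
# tau. The paper learns the deformation; tau = t^2 is a toy choice.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 201)
tau = t**2                                     # hypothetical time change
dB = rng.standard_normal(t.size - 1) * np.sqrt(np.diff(tau))
path = np.concatenate([[0.0], np.cumsum(dB)])  # B(tau(t)) on the grid
```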
arXiv Detail & Related papers (2023-12-22T13:57:29Z)
- Data-driven Modeling and Inference for Bayesian Gaussian Process ODEs via Double Normalizing Flows [28.62579476863723]
We introduce normalizing flows to reparameterize the ODE vector field, resulting in a data-driven prior distribution.
We also apply normalizing flows to the posterior inference of GP ODEs to resolve the issue of strong mean-field assumptions.
We validate the effectiveness of our approach on simulated dynamical systems and real-world human motion data.
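A hypothetical one-dimensional version of vector-field reparameterization: an invertible map g (here sinh, a stand-in for a learned flow) pushes a base field f0 to an induced field in the transformed coordinates.

```python
import numpy as np

# Reparameterizing a vector field with an invertible map: if y = g(x)
# and dx/dt = f0(x), then dy/dt = g'(g^{-1}(y)) * f0(g^{-1}(y)).
# The paper learns g as a normalizing flow; g = sinh is a toy stand-in.

f0 = lambda x: -x                       # base vector field
g, g_inv, g_prime = np.sinh, np.arcsinh, np.cosh

def f(y):
    x = g_inv(y)
    return g_prime(x) * f0(x)           # induced field in y-coordinates

y, dt = g(2.0), 0.01                    # Euler integration in y-space
for _ in range(500):
    y += dt * f(y)
print(y)                                # decays toward g(0) = 0
```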
arXiv Detail & Related papers (2023-09-17T09:28:47Z)
- User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We can sample conditionally on nonlinear user-defined events at inference time, and the method matches data statistics even when sampling from the tails of the distribution.
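A toy sketch of conditioning a score-based sampler on a user-defined event: under the (assumed) guidance decomposition, the conditional score is the unconditional score plus the gradient of a smoothed event log-likelihood.

```python
import numpy as np

# Event-conditioned Langevin sampling for a 1-D Gaussian "model":
# approximate the conditional score for the event E = {x > a} by adding
# the gradient of a soft event indicator to the unconditional score.
# The paper builds an analogous approximation inside a diffusion model.

score_prior = lambda x: -x                        # d/dx log N(x; 0, 1)

def score_event(x, a=2.0, temp=0.1):
    # d/dx log sigmoid((x - a)/temp), a smoothed indicator of the event.
    return 1.0 / (temp * (1.0 + np.exp((x - a) / temp)))

rng = np.random.default_rng(0)
x, eps = 0.0, 1e-2
for _ in range(5000):                             # Langevin dynamics
    x += eps * (score_prior(x) + score_event(x))
    x += np.sqrt(2 * eps) * rng.standard_normal()
print(x)                                          # lands near/above x = 2
```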
arXiv Detail & Related papers (2023-06-13T03:42:03Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
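A minimal instance of the CFM regression objective with independent couplings and a linear velocity model standing in for a neural network; OT-CFM would additionally pair (x0, x1) by minibatch optimal transport before forming the same target.

```python
import numpy as np

# Conditional flow matching with independent couplings: sample t, x0, x1,
# form x_t = (1 - t) x0 + t x1, and regress a velocity model onto the
# conditional target u_t = x1 - x0.

rng = np.random.default_rng(0)
theta = np.zeros(3)                        # v(t, x) = theta . [x, t, 1]

for _ in range(5000):
    x0 = rng.standard_normal()             # source sample ~ N(0, 1)
    x1 = 3.0 + rng.standard_normal()       # target sample ~ N(3, 1)
    t = rng.uniform()
    xt = (1 - t) * x0 + t * x1
    feats = np.array([xt, t, 1.0])
    err = feats @ theta - (x1 - x0)        # residual vs. target velocity
    theta -= 0.01 * err * feats            # SGD step on the CFM loss
print(theta)                               # learned linear velocity model
```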
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
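The linearization behind such noise-inspired regularizers, sketched for a scalar function: the expected effect of small input noise matches a deterministic Jacobian penalty (LMNT's exact construction differs; this only shows the principle).

```python
import numpy as np

# For small input noise e ~ N(0, s^2), E[(f(x + e) - f(x))^2] is
# approximately s^2 * f'(x)^2, so many sampled noise realizations can
# be replaced by one deterministic Jacobian penalty.

f = lambda x: np.tanh(3.0 * x)
x, s = 0.5, 1e-2
rng = np.random.default_rng(0)

mc = np.mean((f(x + s * rng.standard_normal(100000)) - f(x))**2)
jac = 3.0 / np.cosh(3.0 * x)**2          # closed-form f'(x)
print(mc, (s * jac)**2)                  # the two estimates agree closely
```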
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Flow-based Spatio-Temporal Structured Prediction of Motion Dynamics [21.24885597341643]
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and interdimensional correlations.
We propose MotionFlow as a novel approach that autoregressively conditions the output distribution on the temporal input features.
We apply our method to different tasks, including trajectory prediction, motion prediction, time series forecasting, and binary segmentation.
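A schematic autoregressive rollout with a per-step conditional flow; the conditioner below is a hypothetical placeholder for MotionFlow's learned networks.

```python
import numpy as np

# Autoregressive sampling: each frame is drawn from p(x_t | x_{t-1})
# by pushing base noise through a condition-dependent affine transform.

rng = np.random.default_rng(0)
x, traj = 0.0, []
for _ in range(50):
    mu, sigma = 0.95 * x, 0.1                 # hypothetical conditioner
    x = mu + sigma * rng.standard_normal()    # inverse flow on base noise
    traj.append(x)
print(traj[:5])
```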
arXiv Detail & Related papers (2021-04-09T14:30:35Z)
- Gaussian Process-based Min-norm Stabilizing Controller for Control-Affine Systems with Uncertain Input Effects and Dynamics [90.81186513537777]
We propose a novel compound kernel that captures the control-affine nature of the problem.
We show that the resulting optimization problem is convex and call it the Gaussian Process-based Control Lyapunov Function Second-Order Cone Program (GP-CLF-SOCP).
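For orientation, a nominal (known-dynamics) min-norm CLF controller for a scalar control-affine system, which the paper generalizes by replacing the dynamics with GP posteriors to obtain the SOCP; the toy dynamics below are assumptions.

```python
# Nominal min-norm CLF controller for scalar dx/dt = f(x) + g(x) u:
# minimize u^2 subject to LfV + LgV*u <= -gamma*V, solved in closed form.

def min_norm_u(x, gamma=1.0):
    f, g = x**3, 1.0                  # toy control-affine dynamics
    V, dV = 0.5 * x**2, x             # Lyapunov candidate V(x) = x^2/2
    a = dV * f + gamma * V            # constraint reads: a + b*u <= 0
    b = dV * g
    return 0.0 if a <= 0.0 else -a / b

print(min_norm_u(1.5))                # nonzero control only when needed
```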
arXiv Detail & Related papers (2020-11-14T01:27:32Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success is still unclear.
We show that multiplicative noise, as it commonly arises due to variance in minibatching, results in heavy-tailed stationary behaviour in the parameters.
A detailed analysis describes how key factors, including step size, batch size, and data, influence the heavy-tailed behaviour, with similar results observed on state-of-the-art neural network models.
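The mechanism in a toy Kesten recursion: Gaussian multiplicative noise with E[log|a_t|] < 0 (so a stationary law exists) but P(|a_t| > 1) > 0 yields a power-law-tailed stationary distribution; the paper analyzes the analogous mechanism in stochastic optimization.

```python
import numpy as np

# Kesten recursion x_{t+1} = a_t x_t + b_t: power-law stationary tails
# emerge despite light-tailed (Gaussian) multiplicative and additive noise.

rng = np.random.default_rng(0)
x = np.zeros(20000)
for _ in range(2000):
    a = rng.normal(0.9, 0.5, x.size)          # multiplicative noise
    b = rng.normal(0.0, 1.0, x.size)          # additive noise
    x = a * x + b
print(np.percentile(np.abs(x), [50, 99, 99.9]))  # fast-growing quantiles
```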
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows [40.9137348900942]
We propose a novel type of flow driven by a differential deformation of the Wiener process.
As a result, we obtain a rich time series model whose observable process inherits many of the appealing properties of its base process.
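A toy version of the construction: the observable process is an invertible map applied to a Wiener base process, so sample paths inherit continuity from the base (g below is a fixed stand-in for the learned flow).

```python
import numpy as np

# Observable process as an invertible map of a Wiener base process,
# X_t = g(W_t, t). Here g is a toy map; the paper learns it as a
# dynamic normalizing flow.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 201)
dW = rng.standard_normal(t.size - 1) * np.sqrt(np.diff(t))
W = np.concatenate([[0.0], np.cumsum(dW)])
X = np.sinh(W) + 0.5 * t                 # hypothetical invertible-in-W map
```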
arXiv Detail & Related papers (2020-02-24T20:13:43Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows can be used to learn the transformation of a simple prior distribution into a complex target distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
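A schematic SNF forward pass alternating deterministic affine layers with stochastic Langevin blocks; the full method anneals the target and tracks path weights for exact reweighting, both omitted in this toy version.

```python
import numpy as np

# Deterministic invertible layers alternate with stochastic (Langevin)
# blocks targeting the (here fixed) density N(4, 1).

rng = np.random.default_rng(0)
score = lambda x: -(x - 4.0)             # d/dx log N(x; 4, 1)

x = rng.standard_normal(1000)            # samples from the N(0, 1) prior
for _ in range(5):
    x = 1.1 * x + 0.4                    # deterministic affine layer
    eps = 0.05
    for _ in range(20):                  # stochastic (Langevin) block
        x += eps * score(x) + np.sqrt(2 * eps) * rng.standard_normal(x.size)
print(x.mean(), x.std())                 # approaches mean 4, std ~ 1
```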
arXiv Detail & Related papers (2020-02-16T23:29:32Z)