Least Squares and Marginal Log-Likelihood Model Predictive Control using Normalizing Flows
- URL: http://arxiv.org/abs/2409.17632v2
- Date: Wed, 14 May 2025 07:02:59 GMT
- Title: Least Squares and Marginal Log-Likelihood Model Predictive Control using Normalizing Flows
- Authors: Eike Cramer
- Abstract summary: This work proposes using conditional normalizing flows as discrete-time models to learn stochastic dynamics. In a reactor study, the normalizing flow MPC reduces the setpoint error in open- and closed-loop cases to half that of a nominal controller.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Real-world (bio)chemical processes often exhibit stochastic dynamics with non-trivial correlations and state-dependent fluctuations. Model predictive control (MPC) often must consider these fluctuations to achieve reliable performance. However, most process models simply add stationary noise terms to a deterministic prediction. This work proposes using conditional normalizing flows as discrete-time models to learn stochastic dynamics. Normalizing flows learn the probability density function (PDF) of the states explicitly, given prior states and control inputs. In addition to standard least squares (LSQ) objectives, this work derives a marginal log-likelihood (MLL) objective based on the explicit PDF and Markov chain simulations. In a reactor study, the normalizing flow MPC reduces the setpoint error in open and closed-loop cases to half that of a nominal controller. Furthermore, the chance constraints lead to fewer constraint violations than the nominal controller. The MLL objective yields slightly more stable results than the LSQ, particularly for small scenario sets.
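The core modeling idea in the abstract, a conditional normalizing flow that gives an explicit PDF of the next state given the current state and control input, can be illustrated with a minimal sketch. This is not the paper's architecture: the single affine conditioner, the linear maps `W_mu`/`W_s`, and the dimensions are all illustrative assumptions; a real model would stack learned conditional flow layers and train by maximizing the log-likelihood below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical minimal conditional affine flow:
#   x_next = mu(x, u) + exp(s(x, u)) * z,   z ~ N(0, I)
# The conditioner (a plain linear map here) stands in for the paper's
# conditional normalizing flow; only the change-of-variables logic is generic.
dim_x, dim_u = 2, 1
W_mu = rng.normal(size=(dim_x, dim_x + dim_u)) * 0.1
b_mu = np.zeros(dim_x)
W_s = rng.normal(size=(dim_x, dim_x + dim_u)) * 0.1
b_s = np.zeros(dim_x)

def conditioner(x, u):
    h = np.concatenate([x, u])
    return W_mu @ h + b_mu, W_s @ h + b_s  # mean shift, log-scale

def sample_next(x, u):
    # One step of the Markov chain simulation used for scenario generation.
    mu, s = conditioner(x, u)
    z = rng.standard_normal(dim_x)
    return mu + np.exp(s) * z

def log_prob(x_next, x, u):
    # Explicit PDF via change of variables:
    #   log p(x_next | x, u) = log N(z; 0, I) - sum(s),  z = (x_next - mu) e^{-s}
    mu, s = conditioner(x, u)
    z = (x_next - mu) * np.exp(-s)
    log_base = -0.5 * np.sum(z**2) - 0.5 * dim_x * np.log(2 * np.pi)
    return log_base - np.sum(s)

x, u = np.zeros(dim_x), np.zeros(dim_u)
x_next = sample_next(x, u)
lp = log_prob(x_next, x, u)
```

Repeatedly calling `sample_next` rolls out scenario trajectories, and summing `log_prob` along simulated trajectories is the kind of quantity an MLL-style objective would evaluate, while an LSQ objective would only penalize deviations of the sampled states from the setpoint.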
Related papers
- Closing the Loop: A Control-Theoretic Framework for Provably Stable Time Series Forecasting with LLMs [22.486083545585984]
Large Language Models (LLMs) have recently shown exceptional potential in time series forecasting. Existing approaches typically employ a naive autoregressive generation strategy. We propose F-LLM, a novel closed-loop framework.
arXiv Detail & Related papers (2026-02-13T09:35:12Z) - FALCON: Few-step Accurate Likelihoods for Continuous Flows [78.37361800856583]
We propose Few-step Accurate Likelihoods for Continuous Flows (FALCON), which allows for few-step sampling with a likelihood accurate enough for importance sampling applications. We show FALCON outperforms state-of-the-art normalizing flow models for molecular Boltzmann sampling and is two orders of magnitude faster than the equivalently performing CNF model.
arXiv Detail & Related papers (2025-12-10T18:47:25Z) - Flipping Against All Odds: Reducing LLM Coin Flip Bias via Verbalized Rejection Sampling [59.133428586090226]
Large language models (LLMs) can often accurately describe probability distributions using natural language. This mismatch limits their use in tasks requiring reliability, such as Monte Carlo methods, agent-based simulations, and randomized decision-making. We introduce Verbalized Rejection Sampling (VRS), a natural-language adaptation of classical rejection sampling.
arXiv Detail & Related papers (2025-06-11T17:59:58Z) - Physics-aware generative models for turbulent fluid flows through energy-consistent stochastic interpolants [0.0]
Generative models have demonstrated remarkable success in domains such as text, image, and video.
In this work, we explore the application of generative models to fluid dynamics, specifically for turbulence simulation.
We propose a novel generative model based on interpolants, which enables probabilistic forecasting while incorporating physical constraints.
arXiv Detail & Related papers (2025-04-08T09:29:01Z) - Gaussian Mixture Flow Matching Models [51.976452482535954]
Diffusion models approximate the denoising distribution as a Gaussian and predict its mean, whereas flow matching models reparameterize the Gaussian mean as flow velocity. They underperform in few-step sampling due to discretization error and tend to produce over-saturated colors under classifier-free guidance (CFG). We introduce a novel probabilistic guidance scheme that mitigates the over-saturation issues of CFG and improves image generation quality.
arXiv Detail & Related papers (2025-04-07T17:59:42Z) - Asymptotically Optimal Change Detection for Unnormalized Pre- and Post-Change Distributions [65.38208224389027]
This paper addresses the problem of detecting changes when only unnormalized pre- and post-change distributions are accessible.
Our approach is based on the estimation of the Cumulative Sum (CUSUM) statistic, which is known to produce optimal performance.
arXiv Detail & Related papers (2024-10-18T17:13:29Z) - Marginalization Consistent Probabilistic Forecasting of Irregular Time Series via Mixture of Separable flows [4.489135297410294]
Probabilistic forecasting models for joint distributions of targets in irregular time series with missing values are a heavily under-researched area in machine learning. We propose MOSES (Marginalization Consistent Mixture of Separable Flows), a model that parametrizes a mixture of several latent Gaussian processes combined with separable univariate normalizing flows. Experiments on four datasets show that MOSES achieves both accurate joint and marginal predictions, surpassing all other marginalization-consistent baselines; it trails slightly behind ProFITi in joint prediction but is vastly superior when predicting marginal distributions.
arXiv Detail & Related papers (2024-06-11T13:28:43Z) - Time-changed normalizing flows for accurate SDE modeling [5.402030962296633]
We propose a novel transformation of dynamic normalizing flows, based on time deformation of a Brownian motion.
This approach enables us to effectively model some SDEs, that cannot be modeled otherwise.
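The time-deformation idea summarized above, evaluating a Brownian motion on a deformed clock, can be sketched directly. The deformation `tau` below is a fixed toy choice; the paper instead learns the deformation with a dynamic normalizing flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Time-changed Brownian motion: B(tau(t)) for a monotone clock tau.
# Increments of Brownian motion over [tau(t_i), tau(t_{i+1})] are Gaussian
# with variance equal to the elapsed deformed time.
def time_changed_bm(ts, tau, rng):
    taus = tau(ts)
    increments = rng.standard_normal(len(ts) - 1) * np.sqrt(np.diff(taus))
    return np.concatenate([[0.0], np.cumsum(increments)])

ts = np.linspace(0.0, 1.0, 101)
tau = lambda t: t ** 2          # illustrative clock that accelerates over time
path = time_changed_bm(ts, tau, rng)
```

With `tau(t) = t**2` the process moves slowly at first and fluctuates more strongly later; a learned, state-dependent `tau` is what lets the approach capture SDEs a standard flow cannot.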
arXiv Detail & Related papers (2023-12-22T13:57:29Z) - Data-driven Modeling and Inference for Bayesian Gaussian Process ODEs
via Double Normalizing Flows [28.62579476863723]
We introduce normalizing flows to reparameterize the ODE vector field, resulting in a data-driven prior distribution.
We also apply normalizing flows to the posterior inference of GP ODEs to resolve the issue of strong mean-field assumptions.
We validate the effectiveness of our approach on simulated dynamical systems and real-world human motion data.
arXiv Detail & Related papers (2023-09-17T09:28:47Z) - When Does Confidence-Based Cascade Deferral Suffice? [69.28314307469381]
Cascades are a classical strategy to enable inference cost to vary adaptively across samples.
A deferral rule determines whether to invoke the next classifier in the sequence, or to terminate prediction.
Despite being oblivious to the structure of the cascade, confidence-based deferral often works remarkably well in practice.
arXiv Detail & Related papers (2023-07-06T04:13:57Z) - User-defined Event Sampling and Uncertainty Quantification in Diffusion
Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and our samples match data statistics even when sampling from the tails of the distribution.
arXiv Detail & Related papers (2023-06-13T03:42:03Z) - Locality-constrained autoregressive cum conditional normalizing flow for
lattice field theory simulations [0.0]
The local action integral leads to simplifications of the input domain of conditional normalizing flows.
We find that the autocorrelation times of l-ACNF models outperform an equivalent normalizing flow model on the full lattice by orders of magnitude.
arXiv Detail & Related papers (2023-04-04T13:55:51Z) - Improving and generalizing flow-based generative models with minibatch
optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
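The "stable regression objective" mentioned above can be illustrated with a minimal sketch of the standard CFM loss under the common linear-interpolant formulation. The placeholder model, data, and dimensions are assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Conditional flow matching regression: regress a velocity model v_theta(x_t, t)
# onto the conditional target velocity (x1 - x0) along the straight-line
# interpolant x_t = (1 - t) x0 + t x1.
def cfm_loss(v_theta, x0, x1, t):
    x_t = (1.0 - t)[:, None] * x0 + t[:, None] * x1
    target = x1 - x0
    pred = v_theta(x_t, t)
    return np.mean(np.sum((pred - target) ** 2, axis=1))

d, n = 2, 128
x0 = rng.standard_normal((n, d))          # samples from the base (noise) distribution
x1 = rng.standard_normal((n, d)) + 3.0    # samples standing in for data
t = rng.uniform(size=n)                   # interpolation times

zero_model = lambda x_t, t: np.zeros_like(x_t)
loss = cfm_loss(zero_model, x0, x1, t)
```

The OT-CFM variant changes how `(x0, x1)` pairs are coupled (via minibatch optimal transport) but keeps the same simple regression form, which is what makes training stable compared to simulation-based CNF objectives.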
arXiv Detail & Related papers (2023-02-01T14:47:17Z) - Stabilizing Machine Learning Prediction of Dynamics: Noise and
Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems. In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z) - A Physics-informed Deep Learning Approach for Minimum Effort Stochastic
Control of Colloidal Self-Assembly [9.791617215182598]
The control objective is formulated in terms of steering the state PDFs from a prescribed initial probability measure towards a prescribed terminal probability measure with minimum control effort.
We derive the conditions of optimality for the associated optimal control problem.
The performance of the proposed solution is demonstrated via numerical simulations on a benchmark colloidal self-assembly problem.
arXiv Detail & Related papers (2022-08-19T07:01:57Z) - Hybrid Gaussian Process Modeling Applied to Economic Stochastic Model
Predictive Control of Batch Processes [0.0]
While plant models can often be determined from first principles, parts of the model are difficult to derive using physical laws alone.
This paper exploits GPs to model the parts of the dynamic system that are difficult to describe using first principles.
It is vital to account for this uncertainty in the control algorithm, to prevent constraint violations and performance deterioration.
arXiv Detail & Related papers (2021-08-14T00:01:42Z) - Flow-based Spatio-Temporal Structured Prediction of Motion Dynamics [21.24885597341643]
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and interdimensional correlations.
We propose MotionFlow as a novel approach that autoregressively normalizes the output on the temporal input features.
We apply our method to different tasks, including motion prediction, time series forecasting, and binary segmentation.
arXiv Detail & Related papers (2021-04-09T14:30:35Z) - Gaussian Process-based Min-norm Stabilizing Controller for
Control-Affine Systems with Uncertain Input Effects and Dynamics [90.81186513537777]
We propose a novel compound kernel that captures the control-affine nature of the problem.
We show that this resulting optimization problem is convex, and we call it Gaussian Process-based Control Lyapunov Function Second-Order Cone Program (GP-CLF-SOCP).
arXiv Detail & Related papers (2020-11-14T01:27:32Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but its precise role in that success is still unclear. We show that heavy tails commonly arise in the parameters due to multiplicative noise. A detailed analysis covering key factors, including step size and data, shows similar results on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - SUMO: Unbiased Estimation of Log Marginal Probability for Latent
Variable Models [80.22609163316459]
We introduce an unbiased estimator of the log marginal likelihood and its gradients for latent variable models based on randomized truncation of infinite series.
We show that models trained using our estimator give better test-set likelihoods than a standard importance-sampling based approach for the same average computational cost.
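The randomized-truncation (Russian roulette) principle behind the estimator can be shown on a toy infinite series; this is not the paper's log-marginal-likelihood series, just the generic unbiasedness trick: truncate at a random index and reweight each term by the inverse tail probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unbiased estimate of sum_k a_k from finitely many terms:
#   E[ sum_{k<=K} a_k / P(K >= k) ] = sum_k a_k
# Illustrated on a_k = 2^{-k}, whose exact sum is 2.
def russian_roulette_estimate(term, p=0.5):
    k, estimate, tail_prob = 0, 0.0, 1.0
    while True:
        estimate += term(k) / tail_prob
        if rng.uniform() < p:          # stop with probability p
            return estimate
        tail_prob *= (1.0 - p)         # now P(K >= k + 1)
        k += 1

term = lambda k: 0.5 ** k
samples = [russian_roulette_estimate(term) for _ in range(20000)]
mean = np.mean(samples)   # should be close to the exact sum, 2
```

Each individual estimate uses only a few terms, yet the average is unbiased; SUMO applies the same construction to an infinite series whose limit is the log marginal likelihood.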
arXiv Detail & Related papers (2020-04-01T11:49:30Z) - Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows [40.9137348900942]
We propose a novel type of flow driven by a differential deformation of the Wiener process.
As a result, we obtain a rich time series model whose observable process inherits many of the appealing properties of its base process.
arXiv Detail & Related papers (2020-02-24T20:13:43Z) - Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows can be used to learn the transformation of a simple prior distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks including applications to molecular sampling systems in equilibrium.
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.