Predict-Project-Renoise: Sampling Diffusion Models under Hard Constraints
- URL: http://arxiv.org/abs/2601.21033v1
- Date: Wed, 28 Jan 2026 20:50:19 GMT
- Title: Predict-Project-Renoise: Sampling Diffusion Models under Hard Constraints
- Authors: Omer Rochman-Sharabi, Gilles Louppe
- Abstract summary: We introduce a constrained sampling framework that enforces hard constraints, such as physical laws or observational consistency, at generation time. Our approach defines a constrained forward process that diffuses only over the feasible set of constraint-satisfying samples, inducing constrained marginal distributions. We propose Predict-Project-Renoise (PPR), an iterative algorithm that samples from the constrained marginals by alternating between denoising predictions, projecting onto the feasible set, and renoising.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural emulators based on diffusion models show promise for scientific applications, but vanilla models cannot guarantee physical accuracy or constraint satisfaction. We address this by introducing a constrained sampling framework that enforces hard constraints, such as physical laws or observational consistency, at generation time. Our approach defines a constrained forward process that diffuses only over the feasible set of constraint-satisfying samples, inducing constrained marginal distributions. To reverse this, we propose Predict-Project-Renoise (PPR), an iterative algorithm that samples from the constrained marginals by alternating between denoising predictions, projecting onto the feasible set, and renoising. Experiments on 2D distributions, PDEs, and global weather forecasting demonstrate that PPR reduces constraint violations by over an order of magnitude while improving sample consistency and better matching the true constrained distribution compared to baselines.
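The predict-project-renoise loop described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's code: it assumes an affine constraint Ax = b with a closed-form Euclidean projection, and uses the exact posterior-mean denoiser for a standard-normal prior as a stand-in for a trained diffusion model. All function names (`project_affine`, `ppr_sample`) are illustrative.

```python
import numpy as np

def project_affine(x, A, b):
    """Euclidean projection of x onto the affine feasible set {x : A x = b}."""
    lam = np.linalg.solve(A @ A.T, A @ x - b)
    return x - A.T @ lam

def ppr_sample(denoise, project, sigmas, dim, rng):
    """Predict-Project-Renoise: at each noise level, predict the clean
    sample, project it onto the feasible set, then renoise at the next
    (smaller) noise level."""
    x = sigmas[0] * rng.standard_normal(dim)
    for sigma, sigma_next in zip(sigmas[:-1], sigmas[1:]):
        x0_hat = denoise(x, sigma)                           # predict
        x0_hat = project(x0_hat)                             # project
        x = x0_hat + sigma_next * rng.standard_normal(dim)   # renoise
    return project(denoise(x, sigmas[-1]))

# Toy setup: standard-normal prior, whose optimal denoiser under Gaussian
# noise of scale sigma is the posterior mean x / (1 + sigma^2).
rng = np.random.default_rng(0)
dim = 4
A = np.array([[1.0, 1.0, 0.0, 0.0]])   # hard constraint: x0 + x1 = 1
b = np.array([1.0])
denoise = lambda x, sigma: x / (1.0 + sigma**2)
sigmas = [2.0, 1.0, 0.5, 0.1, 0.01]

sample = ppr_sample(denoise, lambda x: project_affine(x, A, b),
                    sigmas, dim, rng)
violation = np.abs(A @ sample - b).max()
print(violation)  # final projection makes this ~0 (float precision)
```

Because the last step projects the final denoised prediction, the returned sample satisfies the constraint exactly up to floating-point error, regardless of how accurate the denoiser is.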
Related papers
- Sharp Convergence Rates for Masked Diffusion Models [53.117058231393834]
We develop a total-variation based analysis for the Euler method that overcomes limitations. Our results relax assumptions on score estimation, improve parameter dependencies, and establish convergence guarantees. Overall, our analysis introduces a direct TV-based error decomposition along the CTMC trajectory and a decoupling-based path-wise analysis for FHS.
arXiv Detail & Related papers (2026-02-26T00:47:51Z) - Learnable Chernoff Baselines for Inference-Time Alignment [64.81256817158851]
We introduce Learnable Chernoff Baselines as a method for efficiently and approximately sampling from exponentially tilted kernels. We establish total-variation guarantees to the ideal aligned model, and demonstrate in both continuous and discrete diffusion settings that LCB sampling closely matches ideal rejection sampling.
arXiv Detail & Related papers (2026-02-08T00:09:40Z) - Flow-Based Conformal Predictive Distributions [1.2691047660244335]
We show that a differentiable nonconformity score induces a deterministic flow on the output space whose trajectories converge to the boundary of the corresponding conformal prediction set. This leads to a computationally efficient, training-free method for sampling conformal boundaries in arbitrary dimensions. We evaluate the approach on PDE inverse problems, precipitation downscaling, climate model debiasing, and hurricane trajectory forecasting.
arXiv Detail & Related papers (2026-02-07T17:26:50Z) - Conditional Diffusion Guidance under Hard Constraint: A Stochastic Analysis Approach [7.504703549763421]
We study conditional generation in diffusion models under hard constraints, where generated samples must satisfy prescribed events with probability one. We develop a principled conditional diffusion guidance framework based on Doob's h-transform, martingale representation and quadratic variation process. We provide non-asymptotic guarantees for the resulting conditional sampler in both total variation and Wasserstein distances.
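For context, the h-transform construction this summary refers to is standard: a diffusion $\mathrm{d}X_t = f(X_t, t)\,\mathrm{d}t + g(t)\,\mathrm{d}W_t$ is conditioned on an event $E$ by tilting its drift with $h(x, t) = \mathbb{P}(E \mid X_t = x)$. (The specific guarantees above are the paper's own; the formula below is only the textbook form of the transform.)

```latex
\mathrm{d}X_t = \Big[ f(X_t, t) + g(t)^2 \,\nabla_x \log h(X_t, t) \Big]\,\mathrm{d}t + g(t)\,\mathrm{d}W_t
```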
arXiv Detail & Related papers (2026-02-05T10:46:20Z) - DistDF: Time-Series Forecasting Needs Joint-Distribution Wasserstein Alignment [92.70019102733453]
Training time-series forecast models requires aligning the conditional distribution of model forecasts with that of the label sequence. We propose DistDF, which achieves alignment by alternately minimizing a discrepancy between the conditional forecast and label distributions.
arXiv Detail & Related papers (2025-10-28T16:09:59Z) - Chance-constrained Flow Matching for High-Fidelity Constraint-aware Generation [46.932479632530764]
Chance-constrained Flow Matching integrates optimization into the sampling process, enabling effective enforcement of hard constraints. Experiments show that CCFM outperforms current state-of-the-art constrained generative models in modeling complex physical systems.
arXiv Detail & Related papers (2025-09-29T17:56:52Z) - Inference-Time Scaling of Diffusion Language Models with Particle Gibbs Sampling [70.8832906871441]
We study how to steer generation toward desired rewards without retraining the models. Prior methods typically resample or filter within a single denoising trajectory, optimizing rewards step-by-step without trajectory-level refinement. We introduce particle Gibbs sampling for diffusion language models (PG-DLM), a novel inference-time algorithm enabling trajectory-level refinement while preserving generation perplexity.
arXiv Detail & Related papers (2025-07-11T08:00:47Z) - Minimax Optimality of the Probability Flow ODE for Diffusion Models [8.15094483029656]
This work develops the first end-to-end theoretical framework for deterministic ODE-based samplers. We propose a smooth regularized score estimator that simultaneously controls both the $L^2$ score error and the associated mean Jacobian error. We demonstrate that the resulting sampler achieves the minimax rate in total variation distance, modulo logarithmic factors.
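The deterministic sampler analyzed here is the standard probability flow ODE, which shares its marginals $p_t$ with the stochastic reverse diffusion (this is the textbook form, not a result specific to this paper):

```latex
\frac{\mathrm{d}x_t}{\mathrm{d}t} = f(x_t, t) - \tfrac{1}{2}\, g(t)^2 \,\nabla_x \log p_t(x_t)
```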
arXiv Detail & Related papers (2025-03-12T17:51:29Z) - Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependencies for general score-mismatched diffusion samplers. We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions. This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Controllable Generation via Locally Constrained Resampling [77.48624621592523]
We propose a tractable probabilistic approach that performs Bayesian conditioning to draw samples subject to a constraint.
Our approach considers the entire sequence, leading to a more globally optimal constrained generation than current greedy methods.
We show that our approach is able to steer the model's outputs away from toxic generations, outperforming similar approaches to detoxification.
arXiv Detail & Related papers (2024-10-17T00:49:53Z) - Physics-Informed Diffusion Models [0.0]
We present a framework that unifies generative modeling and partial differential equation fulfillment. Our approach reduces the residual error by up to two orders of magnitude compared to previous work in a fluid flow case study.
arXiv Detail & Related papers (2024-03-21T13:52:55Z) - Generative Modeling with Denoising Auto-Encoders and Langevin Sampling [88.83704353627554]
We show that both DAE and DSM provide estimates of the score of the smoothed population density.
We then apply our results to the homotopy method of arXiv:1907.05600 and provide theoretical justification for its empirical success.
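The link between denoising auto-encoders and the smoothed score that this entry refers to is Tweedie's formula: writing $p_\sigma = p * \mathcal{N}(0, \sigma^2 I)$ for the smoothed density and $D_\sigma(x) = \mathbb{E}[x_0 \mid x]$ for the optimal denoiser, the score is recovered from the denoiser's residual (standard identity, not specific to this paper):

```latex
\nabla_x \log p_\sigma(x) = \frac{D_\sigma(x) - x}{\sigma^2}
```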
arXiv Detail & Related papers (2020-01-31T23:50:03Z)