Chance-constrained Flow Matching for High-Fidelity Constraint-aware Generation
- URL: http://arxiv.org/abs/2509.25157v1
- Date: Mon, 29 Sep 2025 17:56:52 GMT
- Title: Chance-constrained Flow Matching for High-Fidelity Constraint-aware Generation
- Authors: Jinhao Liang, Yixuan Sun, Anirban Samaddar, Sandeep Madireddy, Ferdinando Fioretto,
- Abstract summary: Chance-constrained Flow Matching (CCFM) integrates optimization into the sampling process, enabling effective enforcement of hard constraints. Experiments show that CCFM outperforms current state-of-the-art constrained generative models in modeling complex physical systems.
- Score: 46.932479632530764
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generative models excel at synthesizing high-fidelity samples from complex data distributions, but they often violate hard constraints arising from physical laws or task specifications. A common remedy is to project intermediate samples onto the feasible set; however, repeated projection can distort the learned distribution and induce a mismatch with the data manifold. Thus, recent multi-stage procedures attempt to defer projection to clean samples during sampling, but they increase algorithmic complexity and accumulate errors across steps. This paper addresses these challenges by proposing a novel training-free method, Chance-constrained Flow Matching (CCFM), that integrates stochastic optimization into the sampling process, enabling effective enforcement of hard constraints while maintaining high-fidelity sample generation. Importantly, CCFM guarantees feasibility in the same manner as conventional repeated projection, yet, despite operating directly on noisy intermediate samples, it is theoretically equivalent to projecting onto the feasible set defined by clean samples. This yields a sampler that mitigates distributional distortion. Empirical experiments show that CCFM outperforms current state-of-the-art constrained generative models in modeling complex physical systems governed by partial differential equations and molecular docking problems, delivering higher feasibility and fidelity.
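The abstract contrasts repeated projection of noisy intermediates (which distorts the learned distribution) with projecting the feasible set defined by clean samples. The core mechanic can be illustrated with a minimal sketch: at each Euler step, form the predicted clean sample, project it onto the feasible set, and integrate toward the projected endpoint. This is an illustrative toy, not the CCFM algorithm itself; the analytic `velocity` field, the box feasible set, and all function names are assumptions standing in for a learned flow-matching model and a general constraint set.

```python
import numpy as np

def project_box(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^d (a simple feasible set)."""
    return np.clip(x, lo, hi)

def velocity(x, t, target):
    """Toy velocity field driving x toward a fixed target along the linear
    interpolation path; stands in for a learned model v_theta(x, t)."""
    return (target - x) / max(1.0 - t, 1e-3)

def sample_projected_clean(x0, target, n_steps=100):
    """Euler sampler that, at each step, projects the predicted clean sample
    x1_hat = x_t + (1 - t) * v onto the feasible set, then integrates toward
    the projected endpoint (sketch of the 'project clean predictions' idea,
    not the actual CCFM optimization)."""
    x, dt = x0.copy(), 1.0 / n_steps
    for i in range(n_steps):
        t = i * dt
        v = velocity(x, t, target)
        x1_hat = project_box(x + (1.0 - t) * v)      # feasible clean prediction
        v_proj = (x1_hat - x) / max(1.0 - t, 1e-3)   # velocity toward it
        x = x + dt * v_proj
    return x

rng = np.random.default_rng(0)
x0 = rng.normal(size=3)
target = np.array([2.0, -3.0, 0.5])  # partly outside the box: constraint active
x1 = sample_projected_clean(x0, target)
```

Because the projection is applied to the predicted clean sample rather than to the noisy state `x`, the intermediate trajectory is never clipped directly; only the endpoint it is steered toward is feasible, which is the property the abstract credits with mitigating distributional distortion.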
Related papers
- Sharp Convergence Rates for Masked Diffusion Models [53.117058231393834]
We develop a total-variation based analysis for the Euler method that overcomes limitations. Our results relax assumptions on score estimation, improve parameter dependencies, and establish convergence guarantees. Overall, our analysis introduces a direct TV-based error decomposition along the CTMC trajectory and a decoupling-based path-wise analysis for FHS.
arXiv Detail & Related papers (2026-02-26T00:47:51Z) - Corrected Samplers for Discrete Flow Models [36.348940136801296]
A line of recent work has studied samplers for discrete diffusion models, such as tau-leaping and the Euler solver. We establish non-asymptotic discretization error bounds for these samplers without any restriction on transition rates or source distributions. We rigorously show that the location-corrected sampler has lower complexity than existing parallel samplers.
arXiv Detail & Related papers (2026-01-30T03:53:22Z) - Predict-Project-Renoise: Sampling Diffusion Models under Hard Constraints [5.539946449743145]
We introduce a constrained sampling framework that enforces hard constraints, such as physical laws or observational consistency, at generation time. Our approach defines a constrained forward process that diffuses only over the feasible set of constraint-satisfying samples, inducing constrained marginal distributions. We propose Predict-Project-Renoise (PPR), an iterative algorithm that samples from the constrained marginals by alternating between denoising predictions, projecting onto the feasible set, and renoising.
arXiv Detail & Related papers (2026-01-28T20:50:19Z) - Combating Noisy Labels through Fostering Self- and Neighbor-Consistency [120.4394402099635]
Label noise is pervasive in various real-world scenarios, posing challenges in supervised deep learning. We propose a noise-robust method named Jo-SNC (Joint sample selection and model regularization based on Self- and Neighbor-Consistency). We design a self-adaptive, data-driven thresholding scheme to adjust per-class selection thresholds.
arXiv Detail & Related papers (2026-01-19T07:55:29Z) - One-shot Conditional Sampling: MMD meets Nearest Neighbors [3.6831672200803993]
We introduce Conditional Generator using MMD (CGMMD), a novel framework for conditional sampling. A key feature of CGMMD is its ability to produce conditional samples in a single forward pass of the generator. We show that CGMMD performs competitively on synthetic tasks involving complex conditional densities.
arXiv Detail & Related papers (2025-09-29T21:04:50Z) - Constrained Discrete Diffusion [61.81569616239755]
This paper introduces Constrained Discrete Diffusion (CDD), a novel integration of differentiable constraint optimization within the diffusion process. CDD directly imposes constraints into the discrete diffusion sampling process, resulting in a training-free and effective approach.
arXiv Detail & Related papers (2025-03-12T19:48:12Z) - Generative Uncertainty in Diffusion Models [17.06573336804057]
We propose a Bayesian framework for estimating generative uncertainty of synthetic samples. We show that the proposed generative uncertainty effectively identifies poor-quality samples and significantly outperforms existing uncertainty-based methods.
arXiv Detail & Related papers (2025-02-28T10:56:39Z) - Constrained Synthesis with Projected Diffusion Models [47.56192362295252]
This paper introduces an approach that endows generative diffusion processes with the ability to satisfy, and certify compliance with, constraints and physical principles.
The proposed method recasts the traditional generative diffusion process as a constrained distribution problem to ensure adherence to constraints.
arXiv Detail & Related papers (2024-02-05T22:18:16Z) - Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z) - Breaking the Spurious Causality of Conditional Generation via Fairness Intervention with Corrective Sampling [77.15766509677348]
Conditional generative models often inherit spurious correlations from the training dataset.
This can result in label-conditional distributions that are imbalanced with respect to another latent attribute.
We propose a general two-step strategy to mitigate this issue.
arXiv Detail & Related papers (2022-12-05T08:09:33Z) - Selectively increasing the diversity of GAN-generated samples [8.980453507536017]
We propose a novel method to selectively increase the diversity of GAN-generated samples.
We show the superiority of our method in a synthetic benchmark as well as a real-life scenario simulating data from the Zero Degree Calorimeter of the ALICE experiment at CERN.
arXiv Detail & Related papers (2022-07-04T16:27:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.