Gradient-Free Generation for Hard-Constrained Systems
- URL: http://arxiv.org/abs/2412.01786v2
- Date: Mon, 03 Mar 2025 23:56:51 GMT
- Title: Gradient-Free Generation for Hard-Constrained Systems
- Authors: Chaoran Cheng, Boran Han, Danielle C. Maddix, Abdul Fatir Ansari, Andrew Stuart, Michael W. Mahoney, Yuyang Wang
- Abstract summary: Existing constrained generative models rely heavily on gradient information, which is often sparse or computationally expensive in some fields. We introduce a novel framework for adapting pre-trained, unconstrained flow-matching models to satisfy constraints exactly in a zero-shot manner.
- Score: 41.558608119074755
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generative models that satisfy hard constraints are critical in many scientific and engineering applications, where physical laws or system requirements must be strictly respected. Many existing constrained generative models, especially those developed for computer vision, rely heavily on gradient information, which is often sparse or computationally expensive in some fields, e.g., partial differential equations (PDEs). In this work, we introduce a novel framework for adapting pre-trained, unconstrained flow-matching models to satisfy constraints exactly in a zero-shot manner, without requiring expensive gradient computations or fine-tuning. Our framework, ECI sampling, alternates between extrapolation (E), correction (C), and interpolation (I) stages during each iterative step of flow-matching sampling to ensure accurate integration of constraint information while preserving the validity of the generation. We demonstrate the effectiveness of our approach across various PDE systems, showing that ECI-guided generation strictly adheres to physical constraints and accurately captures complex distribution shifts induced by these constraints. Empirical results demonstrate that our framework consistently outperforms baseline approaches in various zero-shot constrained generation tasks and also achieves competitive results in regression tasks without additional fine-tuning.
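The alternating extrapolate/correct/interpolate pattern described in the abstract can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the velocity field `v`, the mean-value constraint, and the `project` step are hypothetical stand-ins chosen so the example is self-contained.

```python
import numpy as np

def eci_sample(v, project, x0, n_steps=100):
    """ECI-style sampling sketch: at every flow-matching step,
    Extrapolate an estimate of the clean sample, Correct it onto the
    constraint set, then Interpolate back toward the next time step."""
    x = x0.copy()
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = i * dt
        # E: extrapolate the current state to an estimate of the final sample
        x1_hat = x + (1.0 - t) * v(x, t)
        # C: correct the estimate so it satisfies the hard constraint exactly
        x1_hat = project(x1_hat)
        # I: interpolate between the initial noise and the corrected estimate
        x = (1.0 - (t + dt)) * x0 + (t + dt) * x1_hat
    return x

rng = np.random.default_rng(0)
x0 = rng.standard_normal(64)              # initial noise
v = lambda x, t: -x0 + 1.0                # toy linear-path velocity (hypothetical)
project = lambda x: x - x.mean() + 1.0    # toy hard constraint: mean(x) == 1
sample = eci_sample(v, project, x0)
```

Because the correction step is the last operation applied at the final interpolation time, the generated sample satisfies the constraint exactly rather than approximately.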
Related papers
- Stochastic and Non-local Closure Modeling for Nonlinear Dynamical Systems via Latent Score-based Generative Models [0.0]
We propose a latent score-based generative AI framework for learning stochastic and non-local closure models in nonlinear dynamical systems. This work addresses a key challenge of modeling complex multiscale dynamical systems without a clear scale separation.
arXiv Detail & Related papers (2025-06-25T19:04:02Z) - Physics-Constrained Flow Matching: Sampling Generative Models with Hard Constraints [0.6990493129893112]
Deep generative models have recently been applied to physical systems governed by partial differential equations (PDEs). Existing methods often rely on soft penalties or architectural biases that fail to guarantee hard constraints. We propose Physics-Constrained Flow Matching, a zero-shot inference framework that enforces arbitrary nonlinear constraints in pretrained flow-based generative models.
arXiv Detail & Related papers (2025-06-04T17:12:37Z) - Semi-Explicit Neural DAEs: Learning Long-Horizon Dynamical Systems with Algebraic Constraints [2.66269503676104]
We propose a method that explicitly enforces algebraic constraints by projecting each ODE step onto the constraint manifold. PNODEs consistently outperform baselines across six benchmark problems, achieving a mean constraint violation error below $10^{-10}$. These results show that constraint projection offers a simple strategy for learning physically consistent long-horizon dynamics.
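The step-then-project idea in this summary can be sketched with a Newton-style projection onto the constraint manifold. The dynamics (a planar rotation field) and the unit-circle constraint are illustrative assumptions, not the paper's benchmarks.

```python
import numpy as np

def project(x, g, jac, n_newton=5):
    """Project x onto {x : g(x) = 0} via Newton iterations
    x <- x - J^T (J J^T)^{-1} g(x)."""
    for _ in range(n_newton):
        r = g(x)
        J = jac(x)
        x = x - J.T @ np.linalg.solve(J @ J.T, r)
    return x

def projected_ode_step(x, f, dt, g, jac):
    # explicit Euler step, then projection back onto the constraint manifold
    return project(x + dt * f(x), g, jac)

# toy setup: rotation field with algebraic constraint ||x||^2 - 1 = 0;
# plain Euler drifts off the circle, the projection removes that drift
f = lambda x: np.array([-x[1], x[0]])
g = lambda x: np.array([x @ x - 1.0])
jac = lambda x: (2.0 * x)[None, :]

x = np.array([1.0, 0.0])
for _ in range(500):
    x = projected_ode_step(x, f, 0.01, g, jac)
```

Each Newton iteration reduces the constraint violation quadratically, so a handful of iterations per step keeps the trajectory on the manifold to machine precision over long horizons.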
arXiv Detail & Related papers (2025-05-26T20:31:15Z) - Generative Modeling of Random Fields from Limited Data via Constrained Latent Flow Matching [0.0]
Deep generative models are promising tools for science and engineering, but their reliance on abundant, high-quality data limits applicability. We present a novel framework for generative modeling of random fields that incorporates domain knowledge to supplement limited, sparse, and indirect data.
arXiv Detail & Related papers (2025-05-19T11:47:44Z) - Paving the way for scientific foundation models: enhancing generalization and robustness in PDEs with constraint-aware pre-training [49.8035317670223]
A scientific foundation model (SciFM) is emerging as a promising tool for learning transferable representations across diverse domains.
We propose incorporating PDE residuals into pre-training either as the sole learning signal or in combination with data loss to compensate for limited or infeasible training data.
Our results show that pre-training with PDE constraints significantly enhances generalization, outperforming models trained solely on solution data.
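A PDE residual used as a pre-training signal can be sketched with finite differences. The choice of the 1D Poisson equation $u'' = f$, the grid, and the weighting `lam` are illustrative assumptions; the actual SciFM losses are not specified here.

```python
import numpy as np

def pde_residual_loss(u, f, dx):
    """Mean-squared residual of u'' = f on a uniform 1D grid,
    using a second-order central finite difference at interior points."""
    u_xx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return np.mean((u_xx - f[1:-1]) ** 2)

def pretraining_loss(u_pred, u_data, f, dx, lam=1.0):
    # data loss plus PDE-residual penalty; setting the data term aside
    # makes the residual the sole learning signal, as in the summary
    data = np.mean((u_pred - u_data) ** 2)
    return data + lam * pde_residual_loss(u_pred, f, dx)

# sanity check: u(x) = x^2 satisfies u'' = 2 exactly
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
u = x ** 2
f = np.full_like(x, 2.0)
```

For this quadratic solution the central difference is exact, so the residual term vanishes up to floating-point error.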
arXiv Detail & Related papers (2025-03-24T19:12:39Z) - Constrained Discrete Diffusion [61.81569616239755]
This paper introduces Constrained Discrete Diffusion (CDD), a novel integration of differentiable constraint optimization within the diffusion process. CDD imposes constraints directly on the discrete diffusion sampling process, resulting in a training-free and effective approach.
arXiv Detail & Related papers (2025-03-12T19:48:12Z) - Training-Free Constrained Generation With Stable Diffusion Models [45.138721047543214]
We propose a novel approach to integrate stable diffusion models with constrained optimization frameworks.
We demonstrate the effectiveness of this approach through material science experiments requiring adherence to precise morphometric properties.
arXiv Detail & Related papers (2025-02-08T16:11:17Z) - Efficient Fine-Tuning and Concept Suppression for Pruned Diffusion Models [93.76814568163353]
We propose a novel bilevel optimization framework for pruned diffusion models.
This framework consolidates the fine-tuning and unlearning processes into a unified phase.
It is compatible with various pruning and concept unlearning methods.
arXiv Detail & Related papers (2024-12-19T19:13:18Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Constrained Synthesis with Projected Diffusion Models [47.56192362295252]
This paper introduces an approach that equips generative diffusion processes with the ability to satisfy and certify compliance with constraints and physical principles.
The proposed method recasts the traditional sampling process of generative diffusion as a constrained-distribution problem to ensure adherence to constraints.
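The projection idea behind constrained diffusion sampling can be sketched as follows. The deterministic toy "denoiser", the affine constraint, and the schedule are hypothetical stand-ins; real projected diffusion operates on a learned reverse process.

```python
import numpy as np

def project_affine(x, a, b):
    # Euclidean projection onto the hyperplane {x : a @ x = b}
    return x - a * (a @ x - b) / (a @ a)

def projected_sampling(denoise, x, a, b, n_steps=50):
    """Each reverse-diffusion update is followed by a projection onto the
    constraint set, so every iterate -- including the final sample --
    satisfies the constraint exactly."""
    for t in reversed(range(n_steps)):
        x = denoise(x, t)
        x = project_affine(x, a, b)
    return x

rng = np.random.default_rng(1)
x = rng.standard_normal(8)
mu = np.arange(8.0)
denoise = lambda x, t: x + 0.2 * (mu - x)  # toy deterministic denoiser
a = np.ones(8)
b = 4.0                                    # constraint: entries must sum to 4
out = projected_sampling(denoise, x, a, b)
```

Because the projection has a closed form for affine constraints, no gradient of the constraint through the model is needed, matching the training-free spirit of these methods.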
arXiv Detail & Related papers (2024-02-05T22:18:16Z) - CoCoGen: Physically-Consistent and Conditioned Score-based Generative Models for Forward and Inverse Problems [1.0923877073891446]
This work extends the reach of generative models into physical problem domains.
We present an efficient approach to promote consistency with the underlying PDE.
We showcase the potential and versatility of score-based generative models in various physics tasks.
arXiv Detail & Related papers (2023-12-16T19:56:10Z) - Neural Fields with Hard Constraints of Arbitrary Differential Order [61.49418682745144]
We develop a series of approaches for enforcing hard constraints on neural fields.
The constraints can be specified as a linear operator applied to the neural field and its derivatives.
Our approaches are demonstrated in a wide range of real-world applications.
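Linear-operator constraints on a field and its derivatives can be sketched as one linear system on the field's coefficients, since differentiation is linear. The polynomial basis and the specific constraints below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def enforce_linear_constraints(c, A, d):
    """Minimal-norm correction of coefficients c so that A @ c = d holds
    exactly. Rows of A can encode the field's value, derivatives, or any
    other linear operator applied to the field."""
    return c - A.T @ np.linalg.solve(A @ A.T, A @ c - d)

# field u(x) = sum_k c[k] * x^k (a cubic); constrain u(0) = 1 and u'(0) = 0
c = np.array([0.3, -2.0, 0.5, 1.0])        # unconstrained coefficients
A = np.array([[1.0, 0.0, 0.0, 0.0],        # row encoding u(0)
              [0.0, 1.0, 0.0, 0.0]])       # row encoding u'(0)
d = np.array([1.0, 0.0])
c_hard = enforce_linear_constraints(c, A, d)
```

The correction is an exact projection, so the constrained field satisfies both conditions to machine precision regardless of how the unconstrained coefficients were produced.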
arXiv Detail & Related papers (2023-06-15T08:33:52Z) - Non-adversarial training of Neural SDEs with signature kernel scores [4.721845865189578]
State-of-the-art performance for irregular time series generation has been previously obtained by training these models adversarially as GANs.
In this paper, we introduce a novel class of scoring rules on pathspace based on signature kernels.
arXiv Detail & Related papers (2023-05-25T17:31:18Z) - Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization [61.26619639722804]
We propose a conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms.
The proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques.
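The one-sample pattern can be sketched as a Frank-Wolfe (conditional gradient) loop over the probability simplex with a SAG-style table of stored component gradients. The quadratic objective, step-size schedule, and simplex domain are illustrative assumptions, not the paper's setting or rates.

```python
import numpy as np

def sag_frank_wolfe(y, n_iters=2000, seed=0):
    """Stochastic conditional gradient sketch for min over the simplex of
    f(x) = sum_i 0.5 * ||x - y_i||^2: each iteration draws ONE index,
    refreshes that component's stored gradient, and uses the running
    average of stored gradients in the linear minimization oracle."""
    rng = np.random.default_rng(seed)
    n, dim = y.shape
    x = np.full(dim, 1.0 / dim)             # start at the simplex barycenter
    table = x - y                           # stored per-component gradients
    g_avg = table.mean(axis=0)
    for t in range(n_iters):
        i = rng.integers(n)
        g_avg += (x - y[i] - table[i]) / n  # refresh one entry of the average
        table[i] = x - y[i]
        s = np.eye(dim)[np.argmin(g_avg)]   # LMO on the simplex: best vertex
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * s   # convex combination stays feasible
    return x

y = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.1, 0.8]])
x = sag_frank_wolfe(y)                      # optimum is mean(y), inside the simplex
```

Every update is a convex combination of simplex points, so feasibility is maintained without any projection, which is the usual appeal of conditional gradient methods.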
arXiv Detail & Related papers (2022-02-26T19:10:48Z) - A data-driven peridynamic continuum model for upscaling molecular dynamics [3.1196544696082613]
We propose a learning framework to extract, from molecular dynamics data, an optimal Linear Peridynamic Solid (LPS) model.
We provide sufficient well-posedness conditions for discretized LPS models with sign-changing influence functions.
This framework guarantees that the resulting model is mathematically well-posed, physically consistent, and that it generalizes well to settings that are different from the ones used during training.
arXiv Detail & Related papers (2021-08-04T07:07:47Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science, as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations: physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z) - A Survey of Constrained Gaussian Process Regression: Approaches and Implementation Challenges [0.0]
We provide an overview of several classes of Gaussian process constraints, including positivity or bound constraints, monotonicity and convexity constraints, differential equation constraints, and boundary condition constraints.
We compare the strategies behind each approach as well as the differences in implementation, concluding with a discussion of the computational challenges introduced by constraints.
arXiv Detail & Related papers (2020-06-16T17:03:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.