Hard Constraint Guided Flow Matching for Gradient-Free Generation of PDE Solutions
- URL: http://arxiv.org/abs/2412.01786v1
- Date: Mon, 02 Dec 2024 18:36:26 GMT
- Title: Hard Constraint Guided Flow Matching for Gradient-Free Generation of PDE Solutions
- Authors: Chaoran Cheng, Boran Han, Danielle C. Maddix, Abdul Fatir Ansari, Andrew Stuart, Michael W. Mahoney, Yuyang Wang
- Abstract summary: We introduce a novel framework for adapting pre-trained, unconstrained flow-matching models to satisfy constraints exactly in a zero-shot manner without requiring expensive computations or fine-tuning.
Our framework, ECI sampling, alternates between extrapolation (E), correction (C), and interpolation (I) stages to ensure accurate integration of constraint information while preserving the validity of the generation.
We demonstrate the effectiveness of our approach across various PDE systems, showing that ECI-guided generation strictly adheres to physical constraints and accurately captures complex distribution shifts induced by these constraints.
- Score: 41.558608119074755
- Abstract: Generative models that satisfy hard constraints are crucial in many scientific and engineering applications where physical laws or system requirements must be strictly respected. However, many existing constrained generative models, especially those developed for computer vision, rely heavily on gradient information, which is often sparse or computationally expensive in fields like partial differential equations (PDEs). In this work, we introduce a novel framework for adapting pre-trained, unconstrained flow-matching models to satisfy constraints exactly in a zero-shot manner without requiring expensive gradient computations or fine-tuning. Our framework, ECI sampling, alternates between extrapolation (E), correction (C), and interpolation (I) stages during each iterative step of flow-matching sampling to ensure accurate integration of constraint information while preserving the validity of the generation. We demonstrate the effectiveness of our approach across various PDE systems, showing that ECI-guided generation strictly adheres to physical constraints and accurately captures complex distribution shifts induced by these constraints. Empirical results demonstrate that our framework consistently outperforms baseline approaches on various zero-shot constrained generation tasks and also achieves competitive results on regression tasks without additional fine-tuning.
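The abstract describes the ECI alternation only at a high level. The following is a minimal, hypothetical Python sketch of how such a loop could look for a linear (optimal-transport) flow-matching path; the names `v_theta` and `project_constraint`, the Euler discretization, and the specific interpolation rule are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

def eci_sample(v_theta, project_constraint, x0, n_steps=100):
    """Sketch of a gradient-free ECI-style flow-matching sampling loop.

    v_theta(x, t): pre-trained, unconstrained velocity field (assumed given).
    project_constraint(x): maps a sample exactly onto the hard-constraint set,
        e.g. overwriting prescribed boundary values of a PDE solution (assumed given).
    x0: sample from the tractable prior (e.g. Gaussian noise).
    """
    x_t = x0.copy()
    ts = np.linspace(0.0, 1.0, n_steps + 1)
    for i in range(n_steps):
        t, t_next = ts[i], ts[i + 1]
        # (E) Extrapolation: one-step estimate of the clean sample x_1
        #     implied by the current state and the learned velocity field.
        x1_hat = x_t + (1.0 - t) * v_theta(x_t, t)
        # (C) Correction: enforce the hard constraint exactly on the estimate,
        #     with no gradient computation (e.g. a value projection).
        x1_corr = project_constraint(x1_hat)
        # (I) Interpolation: place the corrected estimate back on the
        #     flow-matching path at the next time step (linear path assumed).
        x_t = (1.0 - t_next) * x0 + t_next * x1_corr
    return x_t
```

At the final step the interpolation weight reaches one, so the returned sample coincides with a constraint-corrected estimate; in this reading, `project_constraint` might, for instance, overwrite the boundary grid points of a discretized solution with the prescribed boundary condition.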
Related papers
- Training-Free Constrained Generation With Stable Diffusion Models [45.138721047543214]
We propose a novel approach to integrate stable diffusion models with constrained optimization frameworks.
We demonstrate the effectiveness of this approach through material science experiments requiring adherence to precise morphometric properties.
arXiv Detail & Related papers (2025-02-08T16:11:17Z) - Efficient Fine-Tuning and Concept Suppression for Pruned Diffusion Models [93.76814568163353]
We propose a novel bilevel optimization framework for pruned diffusion models.
This framework consolidates the fine-tuning and unlearning processes into a unified phase.
It is compatible with various pruning and concept unlearning methods.
arXiv Detail & Related papers (2024-12-19T19:13:18Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - CoCoGen: Physically-Consistent and Conditioned Score-based Generative Models for Forward and Inverse Problems [1.0923877073891446]
This work extends the reach of generative models into physical problem domains.
We present an efficient approach to promote consistency with the underlying PDE.
We showcase the potential and versatility of score-based generative models in various physics tasks.
arXiv Detail & Related papers (2023-12-16T19:56:10Z) - Neural Fields with Hard Constraints of Arbitrary Differential Order [61.49418682745144]
We develop a series of approaches for enforcing hard constraints on neural fields.
The constraints can be specified as a linear operator applied to the neural field and its derivatives.
Our approaches are demonstrated in a wide range of real-world applications.
arXiv Detail & Related papers (2023-06-15T08:33:52Z) - Non-adversarial training of Neural SDEs with signature kernel scores [4.721845865189578]
State-of-the-art performance for irregular time series generation has been previously obtained by training these models adversarially as GANs.
In this paper, we introduce a novel class of scoring rules on pathspace based on signature kernels.
arXiv Detail & Related papers (2023-05-25T17:31:18Z) - Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization [61.26619639722804]
We propose a conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms.
The proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques.
arXiv Detail & Related papers (2022-02-26T19:10:48Z) - Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z) - A data-driven peridynamic continuum model for upscaling molecular dynamics [3.1196544696082613]
We propose a learning framework to extract, from molecular dynamics data, an optimal Linear Peridynamic Solid (LPS) model.
We provide sufficient well-posedness conditions for discretized LPS models with sign-changing influence functions.
This framework guarantees that the resulting model is mathematically well-posed, physically consistent, and that it generalizes well to settings that are different from the ones used during training.
arXiv Detail & Related papers (2021-08-04T07:07:47Z) - A Survey of Constrained Gaussian Process Regression: Approaches and Implementation Challenges [0.0]
We provide an overview of several classes of Gaussian process constraints, including positivity or bound constraints, monotonicity and convexity constraints, differential equation constraints, and boundary condition constraints.
We compare the strategies behind each approach as well as the differences in implementation, concluding with a discussion of the computational challenges introduced by constraints.
arXiv Detail & Related papers (2020-06-16T17:03:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.