Fast PDE-constrained optimization via self-supervised operator learning
- URL: http://arxiv.org/abs/2110.13297v1
- Date: Mon, 25 Oct 2021 22:17:15 GMT
- Title: Fast PDE-constrained optimization via self-supervised operator learning
- Authors: Sifan Wang, Mohamed Aziz Bhouri, Paris Perdikaris
- Abstract summary: Design and optimal control problems are among the fundamental, ubiquitous tasks we face in science and engineering.
In this work we leverage physics-informed deep operator networks (DeepONets) to build fast and differentiable surrogates for rapidly solving PDE-constrained optimization problems.
DeepONets can minimize high-dimensional cost functionals in a matter of seconds, yielding a significant speed up compared to traditional adjoint PDE solvers.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Design and optimal control problems are among the fundamental, ubiquitous
tasks we face in science and engineering. In both cases, we aim to represent
and optimize an unknown (black-box) function that associates a
performance/outcome to a set of controllable variables through an experiment.
In cases where the experimental dynamics can be described by partial
differential equations (PDEs), such problems can be mathematically translated
into PDE-constrained optimization tasks, which quickly become intractable as
the number of control variables and the cost of experiments increases. In this
work we leverage physics-informed deep operator networks (DeepONets) -- a
self-supervised framework for learning the solution operator of parametric PDEs
-- to build fast and differentiable surrogates for rapidly solving
PDE-constrained optimization problems, even in the absence of any paired
input-output training data. The effectiveness of the proposed framework will be
demonstrated across different applications involving continuous functions as
control or design variables, including time-dependent optimal control of heat
transfer, and drag minimization of obstacles in Stokes flow. In all cases, we
observe that DeepONets can minimize high-dimensional cost functionals in a
matter of seconds, yielding a significant speed up compared to traditional
adjoint PDE solvers that are typically costly and limited to relatively
low-dimensional control/design parametrizations.
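The workflow the abstract describes can be sketched in a few lines: a differentiable surrogate maps a discretized control to a PDE solution, and a cost functional is minimized by gradient descent through the surrogate instead of repeated adjoint solves. In this minimal illustration the "surrogate" is a fixed linear smoothing operator standing in for a trained DeepONet; all names, shapes, and the target function are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Toy PDE-constrained optimization through a differentiable surrogate.
# G(a) = K @ a plays the role of a trained operator network mapping a
# control a(x) to a solution u(x); we minimize J(a) = 0.5*||G(a) - u*||^2.

m = 32                                  # control discretization
x = np.linspace(0.0, 1.0, m)

# Surrogate: a row-normalized Gaussian smoothing kernel. Because G is
# linear, its gradient is K^T; a real DeepONet would use autodiff.
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)
K /= K.sum(axis=1, keepdims=True)

u_target = np.sin(2 * np.pi * x)        # desired "PDE solution"

def cost(a):
    r = K @ a - u_target
    return 0.5 * float(r @ r)

def grad(a):
    # Exact gradient of the cost through the linear surrogate.
    return K.T @ (K @ a - u_target)

a = np.zeros(m)
j0 = cost(a)
for _ in range(500):                    # plain gradient descent on the control
    a -= 1.0 * grad(a)
j1 = cost(a)
```

Every optimization step here is a matrix-vector product, which is the source of the speedup the abstract claims over adjoint solvers: the expensive PDE solve is amortized into the (offline) training of the surrogate.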
Related papers
- PhysicsCorrect: A Training-Free Approach for Stable Neural PDE Simulations [4.7903561901859355]
We present PhysicsCorrect, a training-free correction framework that enforces PDE consistency at each prediction step. Our key innovation is an efficient caching strategy that precomputes the Jacobian and its pseudoinverse during an offline warm-up phase. Across three representative PDE systems, PhysicsCorrect reduces prediction errors by up to 100x while adding negligible inference time.
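The caching idea summarized above can be illustrated with a linear residual: for a PDE-consistency residual R(u), the Jacobian J of R and its pseudoinverse are computed once offline, and each network prediction is then nudged toward the constraint manifold with a cheap Newton-like step u ← u − J⁺R(u). This is a sketch under the assumption of a linear residual R(u) = Au − b (so one step lands exactly on the constraint); the operator sizes are illustrative.

```python
import numpy as np

# Offline/online split: precompute the pseudoinverse once, then apply
# a single cheap correction to every prediction at inference time.

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))         # 4 linear "PDE" constraints on 8 dofs
b = rng.standard_normal(4)

J_pinv = np.linalg.pinv(A)              # offline warm-up: cache J^+

def correct(u):
    # Newton-like projection step: u <- u - J^+ R(u), with R(u) = A u - b.
    return u - J_pinv @ (A @ u - b)

u_pred = rng.standard_normal(8)         # stand-in for a neural prediction
before = np.linalg.norm(A @ u_pred - b)
after = np.linalg.norm(A @ correct(u_pred) - b)
```

For a nonlinear residual the same cached J⁺ gives an approximate projection, which is why the correction adds negligible inference cost: no Jacobian is recomputed online.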
arXiv Detail & Related papers (2025-07-03T01:22:57Z)
- CodePDE: An Inference Framework for LLM-driven PDE Solver Generation [57.15474515982337]
Partial differential equations (PDEs) are fundamental to modeling physical systems. Traditional numerical solvers rely on expert knowledge to implement and are computationally expensive. We introduce CodePDE, the first inference framework for generating PDE solvers using large language models.
arXiv Detail & Related papers (2025-05-13T17:58:08Z)
- PIED: Physics-Informed Experimental Design for Inverse Problems [41.18793004122601]
This work presents PIED, the first framework that uses PINNs in a fully differentiable architecture to perform continuous optimization of design parameters for inverse problems (IPs) in one-shot deployments.
We empirically show that PIED significantly outperforms existing experimental design (ED) methods for solving IPs on noisy simulated data and even real-world experimental data.
arXiv Detail & Related papers (2025-03-10T08:53:11Z)
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs).
We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations.
We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z)
- Learning a Neural Solver for Parametric PDE to Enhance Physics-Informed Methods [14.791541465418263]
We propose learning a solver, i.e., solving partial differential equations (PDEs) using a physics-informed iterative algorithm trained on data.
Our method learns to condition a gradient descent algorithm that automatically adapts to each PDE instance.
We demonstrate the effectiveness of our method through empirical experiments on multiple datasets.
arXiv Detail & Related papers (2024-10-09T12:28:32Z)
- Scaling physics-informed hard constraints with mixture-of-experts [0.0]
We develop a scalable approach to enforce hard physical constraints using Mixture-of-Experts (MoE).
MoE imposes the constraint over smaller domains, each of which is solved by an "expert" through differentiable optimization.
Compared to standard differentiable optimization, our scalable approach achieves greater accuracy in the neural PDE solver setting.
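The domain-decomposition idea in this summary can be sketched as follows: instead of one global constrained solve, the field is split into chunks and each "expert" enforces its local constraint independently. Here each expert's differentiable optimization is replaced by its closed-form analogue, a minimum-norm least-squares projection onto a local linear constraint A_k u_k = b_k; the operators and sizes are toy stand-ins chosen for illustration.

```python
import numpy as np

# Mixture-of-experts hard constraints, toy version: each expert
# projects its chunk of the field onto its own local constraint set.

rng = np.random.default_rng(1)
n_experts, chunk = 4, 16
u = rng.standard_normal(n_experts * chunk)   # unconstrained prediction

def project_chunk(u_k, A_k, b_k):
    # Minimum-norm correction delta so that A_k (u_k + delta) = b_k.
    delta = np.linalg.pinv(A_k) @ (b_k - A_k @ u_k)
    return u_k + delta

residuals = []
out = np.empty_like(u)
for k in range(n_experts):
    A_k = rng.standard_normal((3, chunk))    # 3 local constraints per chunk
    b_k = rng.standard_normal(3)
    u_k = project_chunk(u[k * chunk:(k + 1) * chunk], A_k, b_k)
    out[k * chunk:(k + 1) * chunk] = u_k
    residuals.append(np.linalg.norm(A_k @ u_k - b_k))

max_residual = max(residuals)
```

The scalability claim follows from the decomposition: each expert solves a small local problem, and the experts can run in parallel rather than one large coupled optimization.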
arXiv Detail & Related papers (2024-02-20T22:45:00Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and can interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z)
- Physics-constrained Unsupervised Learning of Partial Differential Equations using Meshes [1.066048003460524]
Graph neural networks show promise in accurately representing irregularly meshed objects and learning their dynamics.
In this work, we represent meshes naturally as graphs, process these using Graph Networks, and formulate our physics-based loss to provide an unsupervised learning framework for partial differential equations (PDEs).
Our framework will enable the application of PDE solvers in interactive settings, such as model-based control of soft-body deformations.
arXiv Detail & Related papers (2022-03-30T19:22:56Z)
- Neural Stochastic Dual Dynamic Programming [99.80617899593526]
We introduce a trainable neural model that learns to map problem instances to a piece-wise linear value function.
$\nu$-SDDP can significantly reduce problem-solving cost without sacrificing solution quality.
arXiv Detail & Related papers (2021-12-01T22:55:23Z)
- Solving PDE-constrained Control Problems Using Operator Learning [14.30832827446317]
We introduce surrogate models for PDE solution operators with special regularizers.
Our framework can be applied to both data-driven and data-free cases.
arXiv Detail & Related papers (2021-11-09T03:41:55Z)
- Speeding up Computational Morphogenesis with Online Neural Synthetic Gradients [51.42959998304931]
A wide range of modern science and engineering applications are formulated as optimization problems with a system of partial differential equations (PDEs) as constraints.
These PDE-constrained optimization problems are typically solved in a standard discretize-then-optimize approach.
We propose a general framework to speed up PDE-constrained optimization using online neural synthetic gradients (ONSG) with a novel two-scale optimization scheme.
arXiv Detail & Related papers (2021-04-25T22:43:51Z)
- Learning to Control PDEs with Differentiable Physics [102.36050646250871]
We present a novel hierarchical predictor-corrector scheme which enables neural networks to learn to understand and control complex nonlinear physical systems over long time frames.
We demonstrate that our method successfully develops an understanding of complex physical systems and learns to control them for tasks involving PDEs.
arXiv Detail & Related papers (2020-01-21T11:58:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.