Mathematics of Digital Twins and Transfer Learning for PDE Models
- URL: http://arxiv.org/abs/2501.06400v1
- Date: Sat, 11 Jan 2025 01:14:15 GMT
- Title: Mathematics of Digital Twins and Transfer Learning for PDE Models
- Authors: Yifei Zong, Alexandre Tartakovsky
- Abstract summary: We define a digital twin (DT) of a physical system governed by partial differential equations (PDEs). We construct DTs using the Karhunen-Loève Neural Network (KL-NN) surrogate model and transfer learning (TL).
- Score: 49.1574468325115
- Abstract: We define a digital twin (DT) of a physical system governed by partial differential equations (PDEs) as a model for real-time simulation and control of the system behavior under changing conditions. We construct DTs using the Karhunen-Loève Neural Network (KL-NN) surrogate model and transfer learning (TL). The surrogate model allows fast inference and differentiability with respect to control parameters for control and optimization. TL is used to retrain the model for new conditions with minimal additional data. We employ the moment equations to analyze TL and identify parameters that can be transferred to new conditions. The proposed analysis also guides the control variable selection in DT to facilitate efficient TL. For linear PDE problems, the non-transferable parameters in the KL-NN surrogate model can be exactly estimated from a single solution of the PDE corresponding to the mean values of the control variables under new target conditions. Retraining an ML model with a single solution sample is known as one-shot learning, and our analysis shows that one-shot TL is exact for linear PDEs. For nonlinear PDE problems, transferring any parameters introduces errors. For a nonlinear diffusion PDE model, we find that for a relatively small range of control variables, some surrogate model parameters can be transferred without introducing a significant error, some can be approximately estimated from the mean-field equation, and the rest can be found by solving a linear residual least-squares problem or an ordinary linear least-squares problem if a small labeled dataset for new conditions is available. The former approach results in one-shot TL, while the latter is an example of few-shot TL. Both methods are approximate for nonlinear PDEs.
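To make the construction concrete, the following Python/NumPy sketch assembles a KL (POD) basis from a snapshot ensemble, fits a controls-to-coefficients map, and performs the one-shot projection step described above. The synthetic fields, variable names, and the affine least-squares head standing in for the neural network are all illustrative simplifications, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_grid, n_samples, n_controls, n_modes = 200, 64, 3, 8

# Hypothetical training ensemble: solutions u(x; theta) of a PDE for sampled
# control parameters theta (the fields below are synthetic stand-ins).
theta = rng.uniform(-1.0, 1.0, size=(n_samples, n_controls))
x = np.linspace(0.0, 1.0, n_grid)
U = np.stack([t[0] * np.sin(np.pi * x) + t[1] * np.cos(2 * np.pi * x)
              + 0.1 * t[2] * x for t in theta], axis=1)       # (n_grid, n_samples)

# Karhunen-Loeve (POD) modes of the mean-centered ensemble via SVD.
u_mean = U.mean(axis=1, keepdims=True)
Phi, _, _ = np.linalg.svd(U - u_mean, full_matrices=False)
Phi = Phi[:, :n_modes]                                        # retained KL modes

# KL coefficients of each snapshot, and a map controls -> coefficients.
# The paper trains a neural-network head here; an affine least-squares fit
# is our simplification to keep the sketch short.
C = (Phi.T @ (U - u_mean)).T                                  # (n_samples, n_modes)
A = np.hstack([theta, np.ones((n_samples, 1))])
W, *_ = np.linalg.lstsq(A, C, rcond=None)

def surrogate(t):
    """Fast inference: control parameters -> full field via the KL expansion."""
    c = np.concatenate([t, [1.0]]) @ W
    return u_mean[:, 0] + Phi @ c

# One-shot TL intuition for linear PDEs: keep the transferable parts (the modes)
# and re-estimate the non-transferable coefficients from a single solution
# u_star computed at the mean control values of the new target conditions.
u_star = u_mean[:, 0] + Phi @ rng.normal(size=n_modes)        # placeholder sample
c_star, *_ = np.linalg.lstsq(Phi, u_star - u_mean[:, 0], rcond=None)
```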
Related papers
- Self-supervised Pretraining for Partial Differential Equations [0.0]
We describe a novel approach to building a neural PDE solver that leverages recent advances in transformer-based neural network architectures.
Our model can provide solutions for different values of PDE parameters without any need for retraining the network.
arXiv Detail & Related papers (2024-07-03T16:39:32Z)
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver), capable of solving a wide range of PDEs.
Our key finding is that a PDE solution is fundamentally governed by a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- Reduced-order modeling for parameterized PDEs via implicit neural representations [4.135710717238787]
We present a new data-driven reduced-order modeling approach to efficiently solve parametrized partial differential equations (PDEs).
The proposed framework encodes the PDE and utilizes a parametrized neural ODE (PNODE) to learn latent dynamics characterized by multiple PDE parameters.
We evaluate the proposed method at a large Reynolds number and obtain speedups of up to O(10^3) with about 1% relative error to the ground truth values.
arXiv Detail & Related papers (2023-11-28T01:35:06Z)
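As a structural illustration of the PNODE idea summarized above (encode to a latent space, integrate parameter-conditioned latent dynamics, decode), here is a minimal sketch; the POD-style linear encoder and the random-weight vector field are stand-ins for the paper's implicit neural representation and trained network.

```python
import numpy as np

rng = np.random.default_rng(1)
n_grid, latent_dim, n_params = 128, 4, 2

# Linear (POD-like) basis standing in for the paper's implicit neural
# representation; encode/decode map between full fields and latent states.
basis = np.linalg.qr(rng.normal(size=(n_grid, latent_dim)))[0]
encode = lambda u: basis.T @ u
decode = lambda z: basis @ z

# Parametrized latent vector field dz/dt = f(z, mu): a tiny MLP with random
# weights, purely to show the conditioning on PDE parameters mu.
W1 = rng.normal(size=(latent_dim + n_params, 16)) * 0.3
W2 = rng.normal(size=(16, latent_dim)) * 0.3
def f(z, mu):
    h = np.tanh(np.concatenate([z, mu]) @ W1)
    return h @ W2

def rk4_step(z, mu, dt):
    k1 = f(z, mu); k2 = f(z + 0.5 * dt * k1, mu)
    k3 = f(z + 0.5 * dt * k2, mu); k4 = f(z + dt * k3, mu)
    return z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

u0 = np.sin(np.pi * np.linspace(0, 1, n_grid))   # initial field
mu = np.array([0.7, -0.2])                       # PDE parameters (illustrative)
z = encode(u0)
for _ in range(100):                             # roll out latent dynamics
    z = rk4_step(z, mu, dt=1e-2)
u_pred = decode(z)                               # reduced-order prediction
```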
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from getting ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Neural Control of Parametric Solutions for High-dimensional Evolution PDEs [6.649496716171139]
We develop a novel computational framework to approximate solution operators of evolution partial differential equations (PDEs).
We propose to approximate the solution operator of the PDE by learning the control vector field in the parameter space.
This allows for substantially reduced computational cost to solve the evolution PDE with arbitrary initial conditions.
arXiv Detail & Related papers (2023-01-31T19:26:25Z)
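The core idea above, evolving a PDE by transporting ansatz parameters along a vector field in parameter space, can be sketched for a linear-in-parameters ansatz, where the vector field follows from a least-squares projection of the PDE right-hand side. The heat equation and sine ansatz below are illustrative choices, not the paper's setup.

```python
import numpy as np

n_x, n_modes, dt, n_steps = 200, 6, 1e-4, 500
x = np.linspace(0.0, 1.0, n_x)
j = np.arange(1, n_modes + 1)
Phi = np.sin(np.outer(x, j * np.pi))       # ansatz u(x) = Phi @ theta
Phi_xx = -(j * np.pi) ** 2 * Phi           # exact second derivative of the ansatz

def theta_dot(theta):
    # Control vector field in parameter space: project the right-hand side of
    # u_t = u_xx onto the ansatz by least squares.
    rhs = Phi_xx @ theta
    v, *_ = np.linalg.lstsq(Phi, rhs, rcond=None)
    return v

theta = np.zeros(n_modes); theta[0] = 1.0  # initial condition u0 = sin(pi x)
for _ in range(n_steps):                   # explicit Euler in parameter space
    theta = theta + dt * theta_dot(theta)

u_T = Phi @ theta                          # approximate solution at t = n_steps*dt
# For this ansatz the flow is exact: theta_j(t) = theta_j(0) * exp(-(j*pi)^2 t),
# so changing the initial condition only changes theta(0), not the learned field.
```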
- Meta-PDE: Learning to Solve PDEs Quickly Without a Mesh [24.572840023107574]
Partial differential equations (PDEs) are often computationally challenging to solve.
We present a meta-learning based method which learns to rapidly solve problems from a distribution of related PDEs.
arXiv Detail & Related papers (2022-11-03T06:17:52Z)
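As a rough illustration of meta-learning for fast PDE fitting, the sketch below uses a first-order (Reptile-style) outer loop on a linear-in-parameters model for 1D Poisson tasks; the paper employs gradient-based meta-learning with full networks and mesh-free losses, so treat this purely as a structural stand-in.

```python
import numpy as np

rng = np.random.default_rng(2)
n_x, n_modes = 100, 5
x = np.linspace(0.0, 1.0, n_x)
j = np.arange(1, n_modes + 1)
Phi_xx = -(j * np.pi) ** 2 * np.sin(np.outer(x, j * np.pi))  # u''(x) of the ansatz

def residual_grad(w, f):
    # Gradient of the collocation loss mean((u_w'' - f)^2) for the linear ansatz.
    r = Phi_xx @ w - f
    return 2.0 * Phi_xx.T @ r / len(r)

base = np.array([1.0, -0.5, 0.25, 0.0, 0.1])   # mean of the task distribution
w0 = np.zeros(n_modes)                         # meta-learned initialization
lr_in, lr_meta, k_inner = 1e-5, 0.5, 20
for _ in range(500):
    c = base + 0.3 * rng.normal(size=n_modes)  # sample a task: solve u'' = f_c
    f = Phi_xx @ c
    w = w0.copy()
    for _ in range(k_inner):                   # inner adaptation on the residual
        w -= lr_in * residual_grad(w, f)
    w0 += lr_meta * (w - w0)                   # Reptile-style outer update
# After meta-training, w0 is biased toward the task family, so inner
# adaptation on a new f_c starts much closer to its solution.
```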
- Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that can partially alleviate the data demands of neural PDE solvers by improving their sample complexity.
In the context of PDEs, it turns out that we can quantitatively derive an exhaustive list of such data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z)
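As a concrete instance of symmetry-based augmentation (with the 1D heat equation u_t = u_xx and its scaling symmetry (x, t, u) -> (lam*x, lam^2*t, u) as illustrative choices), the sketch below turns one analytic solution into an augmented one and verifies the PDE residual by finite differences.

```python
import numpy as np

def heat_solution(x, t):
    # One analytic solution of u_t = u_xx (our example datum).
    return np.exp(-np.pi**2 * t) * np.sin(np.pi * x)

def scale_augment(u_fn, lam):
    # Lie point scaling symmetry of the heat equation:
    # if u solves u_t = u_xx, so does v(x, t) = u(lam*x, lam^2*t).
    return lambda x, t: u_fn(lam * x, lam**2 * t)

x = np.linspace(0.0, 1.0, 201)
t = np.linspace(0.01, 0.1, 91)
X, T = np.meshgrid(x, t, indexing="ij")

v = scale_augment(heat_solution, lam=1.7)
U_aug = v(X, T)                                    # augmented training field

# Finite-difference check: u_t - u_xx should be ~0 for the augmented field.
dt, dx = t[1] - t[0], x[1] - x[0]
u_t = np.gradient(U_aug, dt, axis=1)
u_xx = np.gradient(np.gradient(U_aug, dx, axis=0), dx, axis=0)
print("max residual:", np.abs(u_t - u_xx)[2:-2, 2:-2].max())  # small, up to FD error
```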
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
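The message-passing update at the core of this solver class can be sketched on a 1D chain graph: edge messages computed from neighboring node states and relative positions, summed per receiver, then used in a residual node update. The layer sizes and random (untrained) weights below are arbitrary; only the structure is meant to be illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes, d = 64, 16
x = np.linspace(0.0, 1.0, n_nodes)
h = rng.normal(size=(n_nodes, d))            # node states (e.g. encoded u values)

# Edges of a 1D chain graph, in both directions.
src = np.concatenate([np.arange(n_nodes - 1), np.arange(1, n_nodes)])
dst = np.concatenate([np.arange(1, n_nodes), np.arange(n_nodes - 1)])

# Two-layer MLPs with random weights (training omitted; structure only).
Wm1 = rng.normal(size=(2 * d + 1, 32)) * 0.1; Wm2 = rng.normal(size=(32, d)) * 0.1
Wu1 = rng.normal(size=(2 * d, 32)) * 0.1;     Wu2 = rng.normal(size=(32, d)) * 0.1

def mp_step(h):
    # Messages: learned function of (receiver, sender, relative position).
    feats = np.concatenate([h[dst], h[src], (x[src] - x[dst])[:, None]], axis=1)
    msgs = np.tanh(feats @ Wm1) @ Wm2
    agg = np.zeros_like(h)
    np.add.at(agg, dst, msgs)                # sum incoming messages per receiver
    # Residual node update, reminiscent of a learned finite-difference stencil.
    return h + np.tanh(np.concatenate([h, agg], axis=1) @ Wu1) @ Wu2

h = mp_step(h)
```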
- Probabilistic learning on manifolds constrained by nonlinear partial differential equations for small datasets [0.0]
A novel extension of Probabilistic Learning on Manifolds (PLoM) is presented.
It makes it possible to synthesize solutions to a wide range of nonlinear boundary value problems.
Three applications are presented.
arXiv Detail & Related papers (2020-10-27T14:34:54Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
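The analytic-ELBO claim can be illustrated in the simplest case: a fully factorized Bernoulli variational distribution (a degenerate, single-product circuit) and a target whose unnormalized log-density is a quadratic polynomial in binary variables. Both the expected log-density and the entropy are then closed-form, and a Monte Carlo estimate converges to the analytic value; the paper's selective-SPN machinery generalizes this well beyond the factorized case. The coefficients and sizes below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
a = rng.normal(size=n)                      # linear coefficients of log p~(x)
B = np.triu(rng.normal(size=(n, n)), k=1)   # pairwise coefficients (i < j)
m = rng.uniform(0.1, 0.9, size=n)           # q(x_i = 1): factorized variational dist.

def log_p_tilde(X):
    # Unnormalized polynomial log-density over binary x in {0,1}^n.
    return X @ a + np.einsum("ki,ij,kj->k", X, B, X)

# Analytic ELBO: E_q[log p~] uses independence (E[x_i x_j] = m_i m_j for i < j),
# and the entropy of a product of Bernoullis is a sum of binary entropies.
e_logp = a @ m + m @ B @ m
entropy = -(m * np.log(m) + (1 - m) * np.log(1 - m)).sum()
elbo_analytic = e_logp + entropy

# Monte Carlo check of the same quantity.
X = (rng.random((200_000, n)) < m).astype(float)
logq = (X * np.log(m) + (1 - X) * np.log(1 - m)).sum(axis=1)
elbo_mc = (log_p_tilde(X) - logq).mean()
print(elbo_analytic, elbo_mc)               # should agree to a few decimals
```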
This list is automatically generated from the titles and abstracts of the papers on this site.