GEPS: Boosting Generalization in Parametric PDE Neural Solvers through Adaptive Conditioning
- URL: http://arxiv.org/abs/2410.23889v1
- Date: Thu, 31 Oct 2024 12:51:40 GMT
- Title: GEPS: Boosting Generalization in Parametric PDE Neural Solvers through Adaptive Conditioning
- Authors: Armand Kassaï Koupaï, Jorge Mifsut Benet, Yuan Yin, Jean-Noël Vittaut, Patrick Gallinari
- Abstract summary: Data-driven approaches learn parametric PDEs by sampling a very large variety of trajectories with varying PDE parameters.
GEPS is a simple adaptation mechanism to boost GEneralization in Pde Solvers.
We demonstrate the versatility of our approach for both fully data-driven and physics-aware neural solvers.
- Score: 14.939978372699084
- Abstract: Solving parametric partial differential equations (PDEs) presents significant challenges for data-driven methods due to the sensitivity of spatio-temporal dynamics to variations in PDE parameters. Machine learning approaches often struggle to capture this variability. To address this, data-driven approaches learn parametric PDEs by sampling a very large variety of trajectories with varying PDE parameters. We first show that incorporating conditioning mechanisms for learning parametric PDEs is essential and that among them, $\textit{adaptive conditioning}$ allows stronger generalization. As existing adaptive conditioning methods do not scale well with respect to the number of parameters to adapt in the neural solver, we propose GEPS, a simple adaptation mechanism to boost GEneralization in Pde Solvers via first-order optimization and low-rank rapid adaptation of a small set of context parameters. We demonstrate the versatility of our approach for both fully data-driven and physics-aware neural solvers. Validation performed on a whole range of spatio-temporal forecasting problems demonstrates excellent performance for generalizing to unseen conditions including initial conditions, PDE coefficients, forcing terms and solution domains. $\textit{Project page}$: https://geps-project.github.io
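A minimal sketch of the adaptive-conditioning idea described in the abstract, assuming a PyTorch setting: a shared layer is modulated per PDE instance by a small context vector through a low-rank weight correction, and only that context is adapted at test time with a few first-order gradient steps. All names (`LowRankConditionedLayer`, `adapt_context`) are hypothetical illustrations, not the authors' code.

```python
import torch
import torch.nn as nn

class LowRankConditionedLayer(nn.Module):
    def __init__(self, dim: int, ctx_dim: int, rank: int = 4):
        super().__init__()
        self.shared = nn.Linear(dim, dim)            # weights shared across PDE instances
        self.U = nn.Parameter(0.01 * torch.randn(dim, rank))
        self.to_V = nn.Linear(ctx_dim, rank * dim)   # maps context to a low-rank factor

    def forward(self, x: torch.Tensor, ctx: torch.Tensor) -> torch.Tensor:
        # Effective weight: W_shared + U @ V(ctx), a rank-`rank` per-instance update.
        V = self.to_V(ctx).view(-1, x.shape[-1])     # (rank, dim)
        return self.shared(x) + x @ (self.U @ V).T

def adapt_context(model, ctx0, batch, steps: int = 10, lr: float = 1e-2):
    """Test-time adaptation: a few first-order gradient steps on the context only."""
    ctx = ctx0.clone().requires_grad_(True)
    opt = torch.optim.SGD([ctx], lr=lr)
    for _ in range(steps):
        x, y = batch
        loss = ((model(x, ctx) - y) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    return ctx.detach()
```

Under this reading, a new PDE instance costs only a handful of gradient steps on a low-dimensional `ctx` while all solver weights stay frozen, which is what keeps adaptation scalable in the number of solver parameters.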
Related papers
- Learning a Neural Solver for Parametric PDE to Enhance Physics-Informed Methods [14.791541465418263]
We propose learning a solver, i.e., solving partial differential equations (PDEs) using a physics-informed iterative algorithm trained on data.
Our method learns to condition a gradient descent algorithm that automatically adapts to each PDE instance.
We demonstrate the effectiveness of our method through empirical experiments on multiple datasets.
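A minimal sketch of what "conditioning a gradient descent algorithm" could look like in practice: a network maps the current iterate and its PDE residual to an update direction, so the iteration adapts to each PDE instance. `LearnedSolverStep` and `residual_fn` are illustrative names, not the paper's API.

```python
import torch
import torch.nn as nn

class LearnedSolverStep(nn.Module):
    def __init__(self, n_dof: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * n_dof, hidden), nn.GELU(), nn.Linear(hidden, n_dof)
        )

    def forward(self, u: torch.Tensor, residual: torch.Tensor) -> torch.Tensor:
        # The update is conditioned on the iterate *and* its residual, so the
        # learned "gradient descent" adapts to each PDE instance.
        return u - self.net(torch.cat([u, residual], dim=-1))

def solve(step: LearnedSolverStep, u0, residual_fn, n_iter: int = 20):
    """Iterate the learned step; residual_fn encodes the physics (the PDE residual)."""
    u = u0
    for _ in range(n_iter):
        u = step(u, residual_fn(u))
    return u
```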
arXiv Detail & Related papers (2024-10-09T12:28:32Z)
- Parameterized Physics-informed Neural Networks for Parameterized PDEs [24.926311700375948]
In this paper, we propose a novel extension, parameterized physics-informed neural networks (P$^2$INNs).
P$^2$INNs enable modeling the solutions of parameterized partial differential equations (PDEs) via explicitly encoding a latent representation of the PDE parameters.
We demonstrate that P$^2$INNs outperform the baselines in both accuracy and parameter efficiency on benchmark 1D and 2D parameterized PDEs.
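A hedged sketch of the general pattern: a PINN whose inputs include an explicit latent encoding of the PDE parameters mu, so one network covers the whole solution family u(x, t; mu). Names and layer sizes are illustrative, not the paper's architecture.

```python
import torch
import torch.nn as nn

class ParameterizedPINN(nn.Module):
    def __init__(self, param_dim: int, latent_dim: int = 16, hidden: int = 64):
        super().__init__()
        # Explicit encoder for the PDE parameters mu -> latent code z.
        self.param_encoder = nn.Sequential(nn.Linear(param_dim, latent_dim), nn.Tanh())
        self.body = nn.Sequential(
            nn.Linear(2 + latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, t, mu):
        z = self.param_encoder(mu)                      # latent PDE-parameter code
        return self.body(torch.cat([x, t, z], dim=-1))  # u(x, t; mu)
```

The usual PINN residual loss would then be minimized over sampled (x, t, mu) triples rather than over a single fixed equation.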
arXiv Detail & Related papers (2024-08-18T11:58:22Z)
- Self-supervised Pretraining for Partial Differential Equations [0.0]
We describe a novel approach to building a neural PDE solver leveraging recent advances in transformer-based neural network architectures.
Our model can provide solutions for different values of PDE parameters without any need for retraining the network.
arXiv Detail & Related papers (2024-07-03T16:39:32Z)
- Adaptive Preference Scaling for Reinforcement Learning with Human Feedback [103.36048042664768]
Reinforcement learning from human feedback (RLHF) is a prevalent approach to align AI systems with human values.
We propose a novel adaptive preference loss, underpinned by distributionally robust optimization (DRO).
Our method is versatile and can be readily adapted to various preference optimization frameworks.
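Although outside the PDE setting, the mechanism is easy to sketch: each preference pair picks its own scale on the reward margin, trading a scaled logistic (Bradley-Terry) loss against a penalty on the scale. The following is a crude illustration of adaptive scaling over a fixed grid of scales, not the paper's DRO formulation; all names are hypothetical.

```python
import torch
import torch.nn.functional as F

def adaptive_preference_loss(r_chosen, r_rejected, rho: float = 1.0,
                             scales=(0.5, 1.0, 2.0, 4.0)):
    """Each pair picks the scale minimizing scaled logistic loss + rho * scale."""
    margin = r_chosen - r_rejected                     # reward margin per pair
    per_scale = torch.stack(
        [s * F.softplus(-margin / s) + rho * s for s in scales])  # (n_scales, batch)
    return per_scale.min(dim=0).values.mean()          # best scale per pair
```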
arXiv Detail & Related papers (2024-06-04T20:33:22Z)
- ETHER: Efficient Finetuning of Large-Scale Models with Hyperplane Reflections [59.839926875976225]
We propose the ETHER transformation family, which performs Efficient fineTuning via HypErplane Reflections.
In particular, we introduce ETHER and its relaxation ETHER+, which match or outperform existing PEFT methods with significantly fewer parameters.
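The core operation is easy to state: the frozen pretrained weight is multiplied by a Householder reflection H = I - 2uu^T/||u||^2, so only the direction u is trained. A minimal sketch (illustrative, not the official implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReflectionFinetunedLinear(nn.Module):
    def __init__(self, pretrained: nn.Linear):
        super().__init__()
        # Pretrained weights stay frozen; only the reflection direction u trains.
        self.register_buffer("W", pretrained.weight.detach().clone())  # (out, in)
        self.bias = None if pretrained.bias is None else pretrained.bias.detach().clone()
        self.u = nn.Parameter(torch.randn(self.W.shape[0]))

    def forward(self, x):
        u = self.u / self.u.norm()
        H = torch.eye(len(u)) - 2.0 * torch.outer(u, u)  # Householder reflection
        return F.linear(x, H @ self.W, self.bias)
```

Because H is orthogonal, H @ W has the same singular values as W, so the finetuned map cannot drift arbitrarily far from the pretrained one; ETHER+ is described as a relaxation of this strict reflection for more expressivity.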
arXiv Detail & Related papers (2024-05-30T17:26:02Z)
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver) capable of solving a wide scope of PDEs.
Our key finding is that a PDE solution is fundamentally under the control of a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
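For context, a bare-bones sketch of the parameter-ODE step such solvers take: the PDE right-hand side f(u) is projected in least squares onto the network's parameter tangent space, yielding d(theta)/dt. The naive version below is exactly the iteration whose conditioning can blow up; stabilizing it is the paper's contribution. Assumes torch >= 2.0 for `torch.func` and a buffer-free model; names are illustrative.

```python
import torch
from torch.func import functional_call, jacrev  # requires torch >= 2.0

def theta_dot(model, x, f_u):
    """Least-squares solve J(theta) d_theta ~= f(u) at collocation points x."""
    params = dict(model.named_parameters())

    def u_at_x(p):
        return functional_call(model, p, (x,)).flatten()    # (N,)

    jac = jacrev(u_at_x)(params)                            # name -> (N, *param_shape)
    J = torch.cat([j.reshape(x.shape[0], -1) for j in jac.values()], dim=1)
    rhs = f_u.flatten().unsqueeze(-1)                       # (N, 1)
    return torch.linalg.lstsq(J, rhs).solution.squeeze(-1)  # (n_params,)
```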
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Learning Neural PDE Solvers with Parameter-Guided Channel Attention [17.004380150146268]
In application domains such as weather forecasting, molecular dynamics, and inverse design, ML-based surrogate models are increasingly used.
We propose a Channel Attention Embeddings (CAPE) component for neural surrogate models and a simple yet effective curriculum learning strategy.
The CAPE module can be combined with neural PDE solvers allowing them to adapt to unseen PDE parameters.
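A hedged sketch of the parameter-guided channel attention pattern: an embedding of the PDE coefficients produces per-channel gates that modulate the solver's feature maps, letting one model adapt to unseen parameter values. Module names and shapes are assumptions, not the released CAPE code.

```python
import torch
import torch.nn as nn

class ChannelAttentionEmbedding(nn.Module):
    def __init__(self, n_pde_params: int, channels: int):
        super().__init__()
        self.embed = nn.Sequential(
            nn.Linear(n_pde_params, channels), nn.GELU(),
            nn.Linear(channels, channels), nn.Sigmoid(),   # gates in (0, 1)
        )

    def forward(self, features: torch.Tensor, pde_params: torch.Tensor):
        # features: (batch, channels, x); pde_params: (batch, n_pde_params)
        gates = self.embed(pde_params).unsqueeze(-1)       # (batch, channels, 1)
        return features * gates                             # channel-wise modulation
```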
arXiv Detail & Related papers (2023-04-27T12:05:34Z)
- Neural Control of Parametric Solutions for High-dimensional Evolution PDEs [6.649496716171139]
We develop a novel computational framework to approximate solution operators of evolution partial differential equations (PDEs).
We propose to approximate the solution operator of the PDE by learning the control vector field in the parameter space.
This allows for substantially reduced computational cost to solve the evolution PDE with arbitrary initial conditions.
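Schematically, and with illustrative names: a vector field is learned in the parameter space of a solution ansatz u(x; theta), so a new initial condition only costs one fit of theta_0, after which the evolution PDE is solved by cheap ODE integration in parameter space.

```python
import torch
import torch.nn as nn

class ParameterVectorField(nn.Module):
    def __init__(self, n_params: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_params, hidden), nn.Tanh(), nn.Linear(hidden, n_params)
        )

    def forward(self, theta):
        return self.net(theta)   # learned control vector field V(theta)

def rollout(V: ParameterVectorField, theta0: torch.Tensor, dt: float, n_steps: int):
    """Forward-Euler integration of d(theta)/dt = V(theta)."""
    thetas = [theta0]
    for _ in range(n_steps):
        thetas.append(thetas[-1] + dt * V(thetas[-1]))
    return torch.stack(thetas)   # parameter trajectory; decode via the ansatz u(x; theta)
```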
arXiv Detail & Related papers (2023-01-31T19:26:25Z)
- Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that can partially alleviate the cost of generating training data by improving neural PDE solver sample complexity.
In the context of PDEs, it turns out that an exhaustive list of valid data transformations can be derived quantitatively from the equation's Lie point symmetries.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
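As a toy instance of the idea, shown only for the trivial translation symmetries shared by many constant-coefficient PDEs on periodic domains (the paper derives the full equation-specific symmetry groups): shifted copies of one trajectory are exactly valid, free training samples.

```python
import numpy as np

def augment_translations(u, rng):
    """u: solution array of shape (n_t, n_x) on a periodic spatial grid."""
    dx = rng.integers(0, u.shape[1])        # random spatial shift
    dt = rng.integers(0, u.shape[0] // 2)   # random temporal shift
    v = np.roll(u, dx, axis=1)              # x -> x + a (periodic wrap)
    return v[dt:]                           # t -> t + b (drop shifted-out steps)

rng = np.random.default_rng(0)
u = np.sin(np.linspace(0, 2 * np.pi, 64))[None, :] * np.ones((32, 1))
u_aug = augment_translations(u, rng)        # a new, exactly valid trajectory
```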
arXiv Detail & Related papers (2022-02-15T18:43:17Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
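A generic message passing layer of the kind such solvers stack, with nodes as grid cells and messages built from neighboring states and relative positions; this is a minimal MPNN sketch, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    def __init__(self, node_dim: int, hidden: int = 64):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * node_dim + 1, hidden), nn.GELU(),
                                 nn.Linear(hidden, hidden))
        self.upd = nn.Sequential(nn.Linear(node_dim + hidden, hidden), nn.GELU(),
                                 nn.Linear(hidden, node_dim))

    def forward(self, h, edge_index, rel_pos):
        # h: (n_nodes, node_dim); edge_index: (2, n_edges); rel_pos: (n_edges, 1)
        src, dst = edge_index
        m = self.msg(torch.cat([h[src], h[dst], rel_pos], dim=-1))   # per-edge messages
        agg = torch.zeros(h.shape[0], m.shape[1]).index_add_(0, dst, m)  # sum at receivers
        return h + self.upd(torch.cat([h, agg], dim=-1))             # residual node update
```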
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.