Vectorized Conditional Neural Fields: A Framework for Solving Time-dependent Parametric Partial Differential Equations
- URL: http://arxiv.org/abs/2406.03919v2
- Date: Sat, 13 Jul 2024 12:32:25 GMT
- Title: Vectorized Conditional Neural Fields: A Framework for Solving Time-dependent Parametric Partial Differential Equations
- Authors: Jan Hagnberger, Marimuthu Kalimuthu, Daniel Musekamp, Mathias Niepert
- Abstract summary: We propose Vectorized Conditional Neural Fields (VCNeFs) to represent the solution of time-dependent PDEs as neural fields.
VCNeFs compute, for a set of multiple spatio-temporal query points, their solutions in parallel and model their dependencies through attention mechanisms.
An extensive set of experiments demonstrates that VCNeFs are competitive with and often outperform existing ML-based surrogate models.
- Score: 14.052158194490715
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transformer models are increasingly used for solving Partial Differential Equations (PDEs). Several adaptations have been proposed, all of which suffer from the typical problems of Transformers, such as quadratic memory and time complexity. Furthermore, all prevalent architectures for PDE solving lack at least one of several desirable properties of an ideal surrogate model, such as (i) generalization to PDE parameters not seen during training, (ii) spatial and temporal zero-shot super-resolution, (iii) continuous temporal extrapolation, (iv) support for 1D, 2D, and 3D PDEs, and (v) efficient inference for longer temporal rollouts. To address these limitations, we propose Vectorized Conditional Neural Fields (VCNeFs), which represent the solution of time-dependent PDEs as neural fields. Contrary to prior methods, however, VCNeFs compute, for a set of multiple spatio-temporal query points, their solutions in parallel and model their dependencies through attention mechanisms. Moreover, VCNeF can condition the neural field on both the initial conditions and the parameters of the PDEs. An extensive set of experiments demonstrates that VCNeFs are competitive with and often outperform existing ML-based surrogate models.
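As a rough illustration of the idea in the abstract, the sketch below builds a toy conditional neural field: a batch of spatio-temporal query points is embedded, conditioned on an encoded initial condition and the PDE parameters, passed through self-attention so the queries can model their dependencies, and decoded to solution values in parallel. This is a minimal sketch based only on the abstract; all module names, sizes, and the conditioning scheme are assumptions, not the authors' implementation.

```python
# Hypothetical, minimal sketch of a vectorized conditional neural field.
# Layer sizes and the conditioning scheme are illustrative assumptions.
import torch
import torch.nn as nn

class VCNeFSketch(nn.Module):
    def __init__(self, coord_dim=2, ic_dim=64, pde_param_dim=1,
                 hidden=128, heads=4):
        super().__init__()
        self.query_embed = nn.Linear(coord_dim, hidden)      # (x, t) -> token
        self.cond_embed = nn.Linear(ic_dim + pde_param_dim, hidden)
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.decode = nn.Sequential(nn.Linear(hidden, hidden), nn.GELU(),
                                    nn.Linear(hidden, 1))    # scalar field u

    def forward(self, queries, ic, pde_params):
        # queries: (B, N, coord_dim) spatio-temporal points, queried all at once
        # ic: (B, ic_dim) encoded initial condition; pde_params: (B, pde_param_dim)
        tokens = self.query_embed(queries)                   # (B, N, hidden)
        cond = self.cond_embed(torch.cat([ic, pde_params], dim=-1))
        tokens = tokens + cond.unsqueeze(1)                  # condition every query
        tokens, _ = self.attn(tokens, tokens, tokens)        # model dependencies
        return self.decode(tokens)                           # (B, N, 1) solutions

u = VCNeFSketch()(torch.rand(8, 256, 2), torch.rand(8, 64), torch.rand(8, 1))
print(u.shape)  # torch.Size([8, 256, 1])
```

Note that self-attention over N query points is itself quadratic in N; this toy version makes no attempt at the efficiency properties claimed in the paper.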
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver) capable of solving a wide scope of PDEs.
Our key finding is that a PDE solution is fundamentally governed by a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- Reduced-order modeling for parameterized PDEs via implicit neural representations [4.135710717238787]
We present a new data-driven reduced-order modeling approach to efficiently solve parametrized partial differential equations (PDEs).
The proposed framework encodes PDE solutions with implicit neural representations and utilizes a parametrized neural ODE (PNODE) to learn latent dynamics characterized by multiple PDE parameters.
We evaluate the proposed method at a large Reynolds number and obtain speedups of up to O(10^3) with 1% relative error against the ground-truth values.
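For intuition, here is a hypothetical PNODE-style rollout: a latent state is advanced by a neural vector field that takes the PDE parameters as additional input, with fixed-step Euler integration standing in for a proper ODE solver. The encoder/decoder (the implicit neural representations of the actual method) is omitted, and all names and sizes are illustrative assumptions.

```python
# Hypothetical sketch of a parametrized neural ODE (PNODE) over a latent state.
import torch
import torch.nn as nn

class PNODESketch(nn.Module):
    def __init__(self, latent_dim=16, param_dim=2, hidden=64):
        super().__init__()
        # Vector field f(z, mu): latent state plus PDE parameters -> dz/dt
        self.f = nn.Sequential(nn.Linear(latent_dim + param_dim, hidden),
                               nn.Tanh(), nn.Linear(hidden, latent_dim))

    def forward(self, z0, mu, t_span=1.0, steps=100):
        # z0: (B, latent_dim) encoded initial state; mu: (B, param_dim) PDE params
        z, dt = z0, t_span / steps
        for _ in range(steps):                      # explicit Euler rollout
            z = z + dt * self.f(torch.cat([z, mu], dim=-1))
        return z                                    # latent state at t = t_span

zT = PNODESketch()(torch.zeros(4, 16), torch.tensor([[0.1, 2.0]] * 4))
print(zT.shape)  # torch.Size([4, 16])
```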
arXiv Detail & Related papers (2023-11-28T01:35:06Z)
- Time and State Dependent Neural Delay Differential Equations [0.5249805590164901]
Delayed terms are encountered in the governing equations of a large class of problems ranging from physics and engineering to medicine and economics.
We introduce Neural State-Dependent DDE, a framework that can model multiple and state- and time-dependent delays.
We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems.
arXiv Detail & Related papers (2023-06-26T09:35:56Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We therefore develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Fully probabilistic deep models for forward and inverse problems in parametric PDEs [1.9599274203282304]
We introduce a physics-driven deep latent variable model (PDDLVM) to simultaneously learn parameter-to-solution (forward) and solution-to-parameter (inverse) maps of PDEs.
The proposed framework can be easily extended to seamlessly integrate observed data to solve inverse problems and to build generative models.
We demonstrate the efficiency and robustness of our method on finite element discretized parametric PDE problems.
arXiv Detail & Related papers (2022-08-09T15:40:53Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z)
- PAGP: A physics-assisted Gaussian process framework with active learning for forward and inverse problems of partial differential equations [12.826754199680474]
We introduce three different models: continuous time, discrete time and hybrid models.
The given physical information is integrated into Gaussian process model through our designed GP loss functions.
In the last part, a novel hybrid model combining the continuous and discrete time models is presented.
arXiv Detail & Related papers (2022-04-06T05:08:01Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
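Below is a hypothetical single message-passing step on a 1D grid, illustrating how learned messages conditioned on relative positions can play the role of hand-designed stencils such as finite differences. Module names and shapes are assumptions, not the paper's architecture.

```python
# Hypothetical one-step message-passing update on a 1D grid graph.
import torch
import torch.nn as nn

class MPStep(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        # Message depends on both endpoint features and their relative position
        self.msg = nn.Sequential(nn.Linear(2 * hidden + 1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        self.upd = nn.Linear(2 * hidden, hidden)

    def forward(self, h, edges, rel_pos):
        # h: (N, hidden) node states; edges: (E, 2) index pairs; rel_pos: (E, 1)
        src, dst = edges[:, 0], edges[:, 1]
        m = self.msg(torch.cat([h[src], h[dst], rel_pos], dim=-1))  # (E, hidden)
        agg = torch.zeros_like(h).index_add_(0, dst, m)             # sum messages
        return h + self.upd(torch.cat([h, agg], dim=-1))            # residual update

# Usage on a 16-node chain with bidirectional neighbor edges (dx = 0.1)
idx = torch.arange(15)
edges = torch.cat([torch.stack([idx, idx + 1], 1), torch.stack([idx + 1, idx], 1)])
rel = torch.cat([torch.full((15, 1), 0.1), torch.full((15, 1), -0.1)])
h = MPStep()(torch.rand(16, 32), edges, rel)
print(h.shape)  # torch.Size([16, 32])
```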
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Semi-Implicit Neural Solver for Time-dependent Partial Differential Equations [4.246966726709308]
We propose a neural solver to learn an optimal iterative scheme in a data-driven fashion for any class of PDEs.
We provide theoretical guarantees for the correctness and convergence of neural solvers analogous to conventional iterative solvers.
arXiv Detail & Related papers (2021-09-03T12:03:10Z)
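To make the idea of a learned iterative scheme concrete, the sketch below wraps a classical residual iteration u <- u + P(r), with r = b - A u, where a learnable correction map P stands in for a hand-tuned preconditioner and a small linear system stands in for a discretized PDE. This is an assumption-laden toy, not the paper's solver; in particular, a randomly initialized P gives no convergence guarantee, whereas the paper derives such guarantees for its trained scheme.

```python
# Hypothetical sketch of a learned iterative scheme for A u = b.
import torch
import torch.nn as nn

class LearnedIteration(nn.Module):
    def __init__(self, n):
        super().__init__()
        # Learnable update applied to the residual (preconditioner role)
        self.P = nn.Linear(n, n, bias=False)

    def forward(self, A, b, u, iters=10):
        for _ in range(iters):
            r = b - u @ A.T          # residual of the current iterate
            u = u + self.P(r)        # learned correction step
        return u

# 1D Laplacian stencil as a stand-in discretized PDE operator
n = 8
A = torch.eye(n) * 2 - torch.diag(torch.ones(n - 1), 1) - torch.diag(torch.ones(n - 1), -1)
u = LearnedIteration(n)(A, torch.ones(1, n), torch.zeros(1, n))
print(u.shape)  # torch.Size([1, 8])
```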