Single-shot prediction of parametric partial differential equations
- URL: http://arxiv.org/abs/2505.09063v1
- Date: Wed, 14 May 2025 01:48:26 GMT
- Title: Single-shot prediction of parametric partial differential equations
- Authors: Khalid Rafiq, Wenjing Liao, Aditya G. Nair
- Abstract summary: Flexi-VAE is a data-driven framework for efficient single-shot forecasting of parametric partial differential equations (PDEs). We introduce a neural propagator that advances latent representations forward in time, aligning latent evolution with physical state reconstruction in a variational autoencoder setting. We validate Flexi-VAE on PDE benchmarks, the 1D viscous Burgers equation and the 2D advection-diffusion equation, achieving accurate forecasts across wide parametric ranges.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Flexi-VAE, a data-driven framework for efficient single-shot forecasting of nonlinear parametric partial differential equations (PDEs), eliminating the need for iterative time-stepping while maintaining high accuracy and stability. Flexi-VAE incorporates a neural propagator that advances latent representations forward in time, aligning latent evolution with physical state reconstruction in a variational autoencoder setting. We evaluate two propagation strategies, the Direct Concatenation Propagator (DCP) and the Positional Encoding Propagator (PEP), and demonstrate, through representation-theoretic analysis, that DCP offers superior long-term generalization by fostering disentangled and physically meaningful latent spaces. Geometric diagnostics, including Jacobian spectral analysis, reveal that propagated latent states reside in regions of lower decoder sensitivity and more stable local geometry than those derived via direct encoding, enhancing robustness for long-horizon predictions. We validate Flexi-VAE on canonical PDE benchmarks, the 1D viscous Burgers equation and the 2D advection-diffusion equation, achieving accurate forecasts across wide parametric ranges. The model delivers over 50x CPU and 90x GPU speedups compared to autoencoder-LSTM baselines for large temporal shifts. These results position Flexi-VAE as a scalable and interpretable surrogate modeling tool for accelerating high-fidelity simulations in computational fluid dynamics (CFD) and other parametric PDE-driven applications, with extensibility to higher-dimensional and more complex systems.
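The Direct Concatenation Propagator (DCP) described above can be sketched in a few lines: a latent vector is concatenated with the time shift and the PDE parameter, and a small network maps it to the propagated latent state in one shot. This is an illustrative numpy sketch with random (untrained) weights and assumed dimensions, not the authors' implementation; network sizes, the viscosity parameter `nu`, and the time shift `tau` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    # Random-weight MLP parameters; a real model would be trained.
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

nx, nz = 64, 8                      # state and latent dimensions (assumed)
encoder = mlp([nx, 32, nz])
propagator = mlp([nz + 2, 32, nz])  # latent + (tau, nu) via direct concatenation
decoder = mlp([nz, 32, nx])

u0 = np.sin(np.linspace(0, 2 * np.pi, nx))   # e.g. a Burgers initial condition
tau, nu = 0.5, 0.01                          # time shift and PDE parameter

z0 = forward(encoder, u0)                                     # encode
z_tau = forward(propagator, np.concatenate([z0, [tau, nu]]))  # single-shot jump
u_tau = forward(decoder, z_tau)                               # reconstruct state
print(u_tau.shape)  # (64,)
```

The single-shot character is visible in the propagator call: the state at time t + tau is produced by one forward pass conditioned on tau, with no iterative time-stepping.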
Related papers
- Merging Memory and Space: A Spatiotemporal State Space Neural Operator [2.0104149319910767]
ST-SSM is a compact architecture for learning solution operators of time-dependent partial differential equations. A theoretical connection is established between SSMs and neural operators, and a unified theorem is proved for the resulting class of architectures. Our results highlight the advantages of dimensionally factorized operator learning for efficient and general PDE modeling.
arXiv Detail & Related papers (2025-07-31T11:09:15Z) - Kernel-Adaptive PI-ELMs for Forward and Inverse Problems in PDEs with Sharp Gradients [0.0]
This paper introduces the Kernel-Adaptive Physics-Informed Extreme Learning Machine (KAPI-ELM), designed to solve both forward and inverse Partial Differential Equation (PDE) problems involving localized sharp gradients. KAPI-ELM achieves state-of-the-art accuracy in both forward and inverse settings.
arXiv Detail & Related papers (2025-07-14T13:03:53Z) - FLEX: A Backbone for Diffusion-Based Modeling of Spatio-temporal Physical Systems [51.15230303652732]
FLEX (FLow EXpert) is a backbone architecture for generative modeling of spatio-temporal physical systems. It reduces the variance of the velocity field in the diffusion model, which helps stabilize training. It achieves accurate predictions for super-resolution and forecasting tasks using as few as two reverse diffusion steps.
arXiv Detail & Related papers (2025-05-23T00:07:59Z) - Enabling Local Neural Operators to perform Equation-Free System-Level Analysis [0.0]
Neural Operators (NOs) provide a powerful framework for computations involving physical laws. We propose and implement a framework that integrates (local) NOs with advanced iterative numerical methods in the Krylov subspace. We illustrate our framework via three nonlinear PDE benchmarks.
arXiv Detail & Related papers (2025-05-05T01:17:18Z) - Geometry aware inference of steady state PDEs using Equivariant Neural Fields representations [0.0]
We introduce enf2enf, an encoder--decoder methodology for predicting steady-state Partial Differential Equations. Our method supports real-time inference and zero-shot super-resolution, enabling efficient training on low-resolution meshes.
arXiv Detail & Related papers (2025-04-24T08:30:32Z) - Sparse identification of nonlinear dynamics and Koopman operators with Shallow Recurrent Decoder Networks [3.1484174280822845]
We present a method that jointly solves the sensing and model identification problems with a simple implementation and efficient, robust performance. SINDy-SHRED uses Gated Recurrent Units to model sparse sensor measurements, along with a shallow network decoder that reconstructs the full spatio-temporal field from the latent state space. We conduct systematic experimental studies on PDE data such as turbulent flows, real-world sensor measurements of sea surface temperature, and direct video data.
arXiv Detail & Related papers (2025-01-23T02:18:13Z) - KFD-NeRF: Rethinking Dynamic NeRF with Kalman Filter [49.85369344101118]
We introduce KFD-NeRF, a novel dynamic neural radiance field integrated with an efficient and high-quality motion reconstruction framework based on Kalman filtering.
Our key idea is to model the dynamic radiance field as a dynamic system whose temporally varying states are estimated based on two sources of knowledge: observations and predictions.
Our KFD-NeRF demonstrates similar or even superior reconstruction within comparable computational time, and achieves state-of-the-art view synthesis performance with thorough training.
arXiv Detail & Related papers (2024-07-18T05:48:24Z) - Adapting to Length Shift: FlexiLength Network for Trajectory Prediction [53.637837706712794]
Trajectory prediction plays an important role in various applications, including autonomous driving, robotics, and scene understanding.
Existing approaches mainly focus on developing compact neural networks to increase prediction precision on public datasets, typically employing a standardized input duration.
We introduce a general and effective framework, the FlexiLength Network (FLN), to enhance the robustness of existing trajectory prediction models against varying observation periods.
arXiv Detail & Related papers (2024-03-31T17:18:57Z) - Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z) - gLaSDI: Parametric Physics-informed Greedy Latent Space Dynamics Identification [0.5249805590164902]
A physics-informed greedy Latent Space Dynamics Identification (gLaSDI) method is proposed for accurate, efficient, and robust data-driven reduced-order modeling.
An interactive training algorithm is adopted for the autoencoder and the local dynamics identification (DI) models, which enables identification of simple latent-space dynamics.
The effectiveness of the proposed framework is demonstrated by modeling various nonlinear dynamical problems.
arXiv Detail & Related papers (2022-04-26T00:15:46Z) - Long-time integration of parametric evolution equations with physics-informed DeepONets [0.0]
We introduce an effective framework for learning infinite-dimensional operators that map random initial conditions to associated PDE solutions within a short time interval.
Global long-time predictions across a range of initial conditions can then be obtained by iteratively evaluating the trained model.
This introduces a new approach to temporal domain decomposition that is shown to be effective in performing accurate long-time simulations.
arXiv Detail & Related papers (2021-06-09T20:46:17Z)
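The temporal-domain-decomposition idea in the last entry, composing a trained short-time operator with itself to reach long horizons, reduces to a simple rollout loop. The sketch below stands in for the trained DeepONet with an exact short-time solver of the toy ODE u' = -u; the function name, step size, and dimensions are hypothetical, chosen only to make the composition pattern concrete.

```python
import numpy as np

def short_time_operator(u, dt=0.1):
    # Stand-in for a trained short-time model: exact decay step for u' = -u,
    # used only to illustrate the rollout pattern.
    return u * np.exp(-dt)

def rollout(u0, n_steps):
    # Long-time prediction by iteratively composing the short-time operator.
    u, traj = u0, [u0]
    for _ in range(n_steps):
        u = short_time_operator(u)
        traj.append(u)
    return np.array(traj)

traj = rollout(np.ones(16), n_steps=10)
print(traj.shape)  # (11, 16)
```

Each iteration feeds the previous prediction back in as a new "initial condition", which is exactly what makes the short-time interval the unit of temporal domain decomposition.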
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.