Space-Time Continuous PDE Forecasting using Equivariant Neural Fields
- URL: http://arxiv.org/abs/2406.06660v1
- Date: Mon, 10 Jun 2024 11:49:11 GMT
- Title: Space-Time Continuous PDE Forecasting using Equivariant Neural Fields
- Authors: David M. Knigge, David R. Wessels, Riccardo Valperga, Samuele Papa, Jan-Jakob Sonke, Efstratios Gavves, Erik J. Bekkers,
- Abstract summary: Conditional Neural Fields (NeFs) have emerged as a powerful modelling paradigm for PDEs.
We propose a space-time continuous NeF-based solving framework that respects known symmetries of the PDE.
We show that modelling solutions as flows of pointclouds over the group of interest $G$ improves generalization and data-efficiency.
- Score: 28.24612886348871
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Recently, Conditional Neural Fields (NeFs) have emerged as a powerful modelling paradigm for PDEs, by learning solutions as flows in the latent space of the Conditional NeF. Although benefiting from favourable properties of NeFs such as grid-agnosticity and space-time-continuous dynamics modelling, this approach limits the ability to impose known constraints of the PDE on the solutions -- e.g. symmetries or boundary conditions -- in favour of modelling flexibility. Instead, we propose a space-time continuous NeF-based solving framework that - by preserving geometric information in the latent space - respects known symmetries of the PDE. We show that modelling solutions as flows of pointclouds over the group of interest $G$ improves generalization and data-efficiency. We validate that our framework readily generalizes to unseen spatial and temporal locations, as well as geometric transformations of the initial conditions - where other NeF-based PDE forecasting methods fail - and improves over baselines in a number of challenging geometries.
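To make the abstract's mechanism concrete, below is a minimal PyTorch sketch of the idea as we read it; all names (`LatentFlow`, `EquivariantDecoder`, the Euler step) are our own illustrative choices, not the authors' code. The latent is a point cloud of poses and context vectors, the latent dynamics use only relative positions, and the decoder conditions on invariant attributes, so translating the initial condition translates the whole forecast.

```python
# Hypothetical sketch of an equivariant latent point-cloud flow: the latent
# is z = {(p_i, c_i)}, dynamics and decoding depend only on relative
# positions, giving translation equivariance by construction.
import torch
import torch.nn as nn

class LatentFlow(nn.Module):
    """Evolves the latent point cloud using pairwise relative positions only."""
    def __init__(self, dim=2, c_dim=32, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 2 * c_dim, hidden), nn.SiLU(),
            nn.Linear(hidden, dim + c_dim))

    def step(self, p, c, dt=0.1):
        rel = p[:, None, :] - p[None, :, :]                  # (N, N, dim), translation-invariant
        pair = torch.cat([rel,
                          c[:, None, :].expand(-1, p.size(0), -1),
                          c[None, :, :].expand(p.size(0), -1, -1)], dim=-1)
        msg = self.net(pair).mean(dim=1)                     # aggregate pairwise messages
        dp, dc = msg[:, :p.size(1)], msg[:, p.size(1):]
        return p + dt * dp, c + dt * dc                      # Euler step of the latent flow

class EquivariantDecoder(nn.Module):
    """Conditional NeF: field value at x from invariant attributes x - p_i."""
    def __init__(self, dim=2, c_dim=32, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + c_dim, hidden), nn.SiLU(), nn.Linear(hidden, 1))

    def forward(self, x, p, c):
        rel = x[:, None, :] - p[None, :, :]                  # (Q, N, dim)
        feat = torch.cat([rel, c[None, :, :].expand(x.size(0), -1, -1)], -1)
        return self.net(feat).mean(dim=1)                    # (Q, 1)

# usage: roll out in latent space, then decode at arbitrary query locations
p, c = torch.randn(8, 2), torch.randn(8, 32)                 # latent point cloud
flow, dec = LatentFlow(), EquivariantDecoder()
for _ in range(5):
    p, c = flow.step(p, c)
u = dec(torch.rand(100, 2), p, c)                            # space-continuous queries
```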
Related papers
- Riemannian Langevin Dynamics: Strong Convergence of Geometric Euler-Maruyama Scheme [51.56484100374058]
Low-dimensional structure in real-world data plays an important role in the success of generative models. We prove convergence of numerical schemes for manifold-valued differential equations.
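For intuition, here is an illustrative geometric Euler-Maruyama step for Langevin dynamics on the unit sphere (a sketch under our own assumptions, not the paper's scheme verbatim): drift and noise are projected to the tangent space and mapped back to the manifold with the exponential map.

```python
# Hypothetical geometric Euler-Maruyama on S^2.
import numpy as np

def exp_map(x, v):
    """Exponential map on the sphere: follow the geodesic from x with velocity v."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * v / n

def langevin_step(x, grad_log_p, dt, rng):
    """One geometric Euler-Maruyama step of Riemannian Langevin dynamics."""
    proj = lambda u: u - np.dot(u, x) * x        # project onto tangent space at x
    drift = proj(grad_log_p(x))
    noise = proj(rng.standard_normal(x.shape))
    return exp_map(x, dt * drift + np.sqrt(2 * dt) * noise)

# usage: sample from a von Mises-Fisher-like density p(x) ∝ exp(mu·x)
rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])
x = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    x = langevin_step(x, lambda y: mu, 1e-2, rng)
print(x, np.linalg.norm(x))                      # iterates stay on the sphere
```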
arXiv Detail & Related papers (2026-03-04T01:29:35Z)
- Structure-Preserving Learning Improves Geometry Generalization in Neural PDEs [7.60216127875876]
We introduce General-Geometry Neural Whitney Forms (Geo-NeW): a data-driven finite element method. We demonstrate state-of-the-art performance on several steady-state PDE benchmarks, and provide a significant improvement over conventional baselines on out-of-distribution geometries.
arXiv Detail & Related papers (2026-02-02T20:45:07Z)
- TENG++: Time-Evolving Natural Gradient for Solving PDEs With Deep Neural Nets under General Boundary Conditions [0.5908471365011942]
Partial Differential Equations (PDEs) are central to modeling complex systems across physical, biological, and engineering domains. Traditional numerical methods often struggle with high-dimensional or complex problems. PINNs have emerged as an efficient alternative by embedding physics-based constraints into deep learning frameworks.
arXiv Detail & Related papers (2025-12-13T02:32:45Z)
- Physics-Constrained Fine-Tuning of Flow-Matching Models for Generation and Inverse Problems [3.3811247908085855]
We present a framework for fine-tuning flow-matching generative models to enforce physical constraints and solve inverse problems in scientific systems. Our approach bridges generative modelling and scientific inference, opening new avenues for simulation-augmented discovery and data-efficient modelling of physical systems.
arXiv Detail & Related papers (2025-08-05T09:32:04Z)
- Geometry aware inference of steady state PDEs using Equivariant Neural Fields representations [0.0]
We introduce enf2enf, an encoder-decoder methodology for predicting solutions of steady-state Partial Differential Equations.
Our method supports real-time inference and zero-shot super-resolution, enabling efficient training on low-resolution meshes.
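The zero-shot super-resolution property follows from the field parameterization itself; a minimal sketch (hypothetical decoder, not the enf2enf code) shows why: the decoder maps (coordinate, latent) to a value, so after training on a coarse mesh it can simply be queried on a finer one.

```python
# Illustrative resolution-free field decoder.
import torch
import torch.nn as nn

decoder = nn.Sequential(nn.Linear(2 + 16, 64), nn.SiLU(), nn.Linear(64, 1))

def query(z, coords):
    """Evaluate the field represented by latent z at arbitrary coordinates."""
    zc = z.expand(coords.size(0), -1)
    return decoder(torch.cat([coords, zc], dim=-1))

z = torch.randn(1, 16)                               # latent from some encoder
coarse = torch.rand(32 * 32, 2)                      # training-resolution mesh
fine = torch.stack(torch.meshgrid(
    torch.linspace(0, 1, 256), torch.linspace(0, 1, 256),
    indexing="ij"), dim=-1).reshape(-1, 2)           # unseen finer grid
u_coarse, u_fine = query(z, coarse), query(z, fine)  # same decoder, any resolution
```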
arXiv Detail & Related papers (2025-04-24T08:30:32Z)
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs).
We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations.
We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z)
- Vectorized Conditional Neural Fields: A Framework for Solving Time-dependent Parametric Partial Differential Equations [14.052158194490715]
We propose Vectorized Conditional Neural Fields (VCNeFs) to represent the solution of time-dependent PDEs as neural fields.
VCNeFs compute the solutions for a set of multiple spatio-temporal query points in parallel and model their dependencies.
An extensive set of experiments demonstrates that VCNeFs are competitive with and often outperform existing ML-based surrogate models.
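A minimal sketch of the vectorized-query idea (a hypothetical module under our assumptions, not the VCNeF architecture): a batch of space-time query points (x, t) is evaluated in one parallel forward pass, conditioned on a latent of the initial condition.

```python
# Illustrative conditional neural field with batched space-time queries.
import torch
import torch.nn as nn

class ConditionalField(nn.Module):
    def __init__(self, z_dim=32, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + 1 + z_dim, hidden), nn.SiLU(), nn.Linear(hidden, 1))

    def forward(self, x, t, z):
        """x: (Q, 1) positions, t: (Q, 1) times, z: (z_dim,) condition."""
        zq = z.expand(x.size(0), -1)
        return self.net(torch.cat([x, t, zq], dim=-1))   # all queries at once

field = ConditionalField()
z = torch.randn(32)                                      # latent of the initial condition
xt = torch.rand(1024, 2)                                 # 1024 space-time query points
u = field(xt[:, :1], xt[:, 1:], z)                       # one vectorized pass
```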
arXiv Detail & Related papers (2024-06-06T10:02:06Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
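The weight-tied equilibrium idea in miniature (an illustrative layer, not the FNO-DEQ code): a steady state is the fixed point z* = f(z*, a) of one shared layer, found by iteration rather than by stacking depth.

```python
# Hypothetical deep-equilibrium solve by fixed-point iteration.
import torch
import torch.nn as nn

class TiedLayer(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.lin_z = nn.Linear(dim, dim)
        self.lin_a = nn.Linear(dim, dim)

    def forward(self, z, a):
        return torch.tanh(self.lin_z(z) + self.lin_a(a))

def solve_equilibrium(f, a, iters=50, tol=1e-5):
    """Iterate z <- f(z, a) with the same weights every step until convergence."""
    z = torch.zeros_like(a)
    for _ in range(iters):
        z_next = f(z, a)
        if (z_next - z).norm() < tol:
            break
        z = z_next
    return z

f = TiedLayer()
a = torch.randn(8, 64)              # encoded PDE parameters / forcing
z_star = solve_equilibrium(f, a)    # the equilibrium plays the role of the solution
```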
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly captures both the governing PDE and the material model.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which utilizes a network architecture that strictly guarantees standard constitutive priors.
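One way such a prior can be guaranteed architecturally (an illustrative construction, not the NCLaw model): if a learned energy density depends on the deformation gradient F only through C = FᵀF, then objectivity W(QF) = W(F) holds for every rotation Q by construction.

```python
# Hypothetical constitutive network with objectivity built in.
import torch
import torch.nn as nn

energy_net = nn.Sequential(nn.Linear(3, 64), nn.SiLU(), nn.Linear(64, 1))

def energy(F):
    C = F.transpose(-1, -2) @ F                      # right Cauchy-Green tensor
    I1 = C.diagonal(dim1=-2, dim2=-1).sum(-1)        # invariants of C: tr(C)
    I2 = (C @ C).diagonal(dim1=-2, dim2=-1).sum(-1)  # tr(C^2)
    I3 = torch.linalg.det(C)                         # det(C)
    return energy_net(torch.stack([I1, I2, I3], dim=-1))

F = torch.randn(4, 3, 3, requires_grad=True)         # batch of deformation gradients
W = energy(F).sum()
P, = torch.autograd.grad(W, F)                       # first Piola-Kirchhoff stress
```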
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Continuous PDE Dynamics Forecasting with Implicit Neural Representations [24.460010868042758]
We introduce a new data-driven approach that models the flow of PDE solutions with continuous-time dynamics of spatially continuous functions.
This is achieved by embedding spatial observations independently of their discretization via Implicit Neural Representations.
The model extrapolates at arbitrary spatial and temporal locations, can learn from sparse grids or irregular data at test time, and generalizes to new grids or resolutions.
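The discretization-free embedding admits a short sketch (illustrative, not the paper's code): a latent z is fit by gradient descent to whatever irregular observations are available, so encoding never assumes a fixed grid.

```python
# Hypothetical auto-decoding encoder for an INR.
import torch
import torch.nn as nn

inr = nn.Sequential(nn.Linear(2 + 16, 64), nn.SiLU(), nn.Linear(64, 1))

def encode(coords, values, steps=100, lr=1e-2):
    """Optimize z so the INR matches sparse observations at any sensor layout."""
    z = torch.zeros(1, 16, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        pred = inr(torch.cat([coords, z.expand(coords.size(0), -1)], dim=-1))
        loss = ((pred - values) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    return z.detach()

coords = torch.rand(50, 2)                 # irregular observation locations
values = torch.sin(coords.sum(-1, keepdim=True))
z = encode(coords, values)                 # latent state; dynamics then evolve z in time
```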
arXiv Detail & Related papers (2022-09-29T15:17:50Z)
- Fourier Neural Operator with Learned Deformations for PDEs on General Geometries [75.91055304134258]
We propose a new framework, viz., geo-FNO, to solve PDEs on arbitrary geometries.
Geo-FNO learns to deform the input (physical) domain, which may be irregular, into a latent space with a uniform grid.
We consider a variety of PDEs such as the Elasticity, Plasticity, Euler's, and Navier-Stokes equations, and both forward modeling and inverse design problems.
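A sketch of the geo-FNO idea (hypothetical networks, not the released code): a learned map sends irregular physical coordinates to a uniform latent grid, where an FFT-based spectral convolution is valid.

```python
# Illustrative deform-then-FFT pipeline.
import torch
import torch.nn as nn

deform = nn.Sequential(nn.Linear(2, 64), nn.SiLU(), nn.Linear(64, 2))

def spectral_conv(u, weights):
    """Multiply the lowest Fourier modes of u on the uniform latent grid."""
    U = torch.fft.rfft2(u)
    k = weights.shape[0]
    U[..., :k, :k] = U[..., :k, :k] * weights
    return torch.fft.irfft2(U, s=u.shape[-2:])

u = torch.randn(1, 64, 64)                       # field resampled onto the latent grid
w = torch.randn(12, 12, dtype=torch.cfloat)      # learnable low-mode filter
latent_xy = deform(torch.rand(500, 2))           # irregular mesh -> latent coordinates
v = spectral_conv(u, w)                          # FFT is well-defined on the uniform grid
```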
arXiv Detail & Related papers (2022-07-11T21:55:47Z)
- Mitigating Learning Complexity in Physics and Equality Constrained Artificial Neural Networks [0.9137554315375919]
Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs).
In PINNs, the residual form of the PDE of interest and its boundary conditions are lumped into a composite objective function as soft penalties.
Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach when applied to different kinds of PDEs.
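The soft-penalty formulation being criticized is easy to exhibit (an illustrative sketch for the 1D heat equation; the weights and network are our own choices): PDE residual and boundary terms are lumped into one weighted objective, and the fixed penalty weights are precisely what makes training brittle.

```python
# Hypothetical composite PINN objective with soft penalties.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))

def pinn_loss(x_int, x_bc, u_bc, lam_pde=1.0, lam_bc=10.0):
    x_int = x_int.requires_grad_(True)               # columns: (x, t)
    u = net(x_int)
    grads = torch.autograd.grad(u.sum(), x_int, create_graph=True)[0]
    u_x, u_t = grads[:, :1], grads[:, 1:]
    u_xx = torch.autograd.grad(u_x.sum(), x_int, create_graph=True)[0][:, :1]
    residual = u_t - 0.1 * u_xx                      # heat equation u_t = 0.1 u_xx
    return (lam_pde * (residual ** 2).mean()         # soft PDE penalty
            + lam_bc * ((net(x_bc) - u_bc) ** 2).mean())  # soft BC penalty

loss = pinn_loss(torch.rand(256, 2), torch.rand(32, 2), torch.zeros(32, 1))
loss.backward()
```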
arXiv Detail & Related papers (2022-06-19T04:12:01Z)
- Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
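The source of the speedup admits a compact sketch (hypothetical dimensions, not the LE-PDE code): compress the state once, roll the dynamics out entirely in the small latent space, and decode only the frames that are actually needed.

```python
# Illustrative latent-space rollout; here D is 128x smaller than N,
# mirroring the dimension reduction quoted above.
import torch
import torch.nn as nn

N, D = 64 * 64, 32                     # full state size vs. latent size
enc = nn.Sequential(nn.Linear(N, 256), nn.SiLU(), nn.Linear(256, D))
dec = nn.Sequential(nn.Linear(D, 256), nn.SiLU(), nn.Linear(256, N))
step = nn.Sequential(nn.Linear(D, D), nn.SiLU(), nn.Linear(D, D))

u0 = torch.randn(1, N)                 # initial condition, flattened
z = enc(u0)
for _ in range(100):                   # cheap: each update touches D dims, not N
    z = step(z)
u_final = dec(z)                       # decode only at the end
```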
arXiv Detail & Related papers (2022-06-15T17:31:24Z)
- Counting Phases and Faces Using Bayesian Thermodynamic Integration [77.34726150561087]
We introduce a new approach to reconstructing thermodynamic functions and phase boundaries in two-parametric statistical mechanics systems.
We use the proposed approach to accurately reconstruct the partition functions and phase diagrams of the Ising model and the exactly solvable non-equilibrium TASEP.
arXiv Detail & Related papers (2022-05-18T17:11:23Z)