Continuous PDE Dynamics Forecasting with Implicit Neural Representations
- URL: http://arxiv.org/abs/2209.14855v1
- Date: Thu, 29 Sep 2022 15:17:50 GMT
- Title: Continuous PDE Dynamics Forecasting with Implicit Neural Representations
- Authors: Yuan Yin, Matthieu Kirchmeyer, Jean-Yves Franceschi, Alain
Rakotomamonjy, Patrick Gallinari
- Abstract summary: We introduce a new data-driven approach that models a PDE's flow with continuous-time dynamics of spatially continuous functions.
This is achieved by embedding spatial observations independently of their discretization via Implicit Neural Representations.
It extrapolates at arbitrary spatial and temporal locations; it can learn from sparse or irregular grids; at test time, it generalizes to new grids or resolutions.
- Score: 24.460010868042758
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Effective data-driven PDE forecasting methods often rely on fixed
spatial and/or temporal discretizations. This raises limitations in real-world
applications like weather prediction where flexible extrapolation at arbitrary
spatiotemporal locations is required. We address this problem by introducing a
new data-driven approach, DINo, that models a PDE's flow with continuous-time
dynamics of spatially continuous functions. This is achieved by embedding
spatial observations independently of their discretization via Implicit Neural
Representations in a small latent space temporally driven by a learned ODE.
This separate and flexible treatment of time and space makes DINo the first
data-driven model to combine the following advantages. It extrapolates at
arbitrary spatial and temporal locations; it can learn from sparse irregular
grids or manifolds; at test time, it generalizes to new grids or resolutions.
DINo outperforms alternative neural PDE forecasters in a variety of challenging
generalization scenarios on representative PDE systems.
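The architecture described in the abstract can be illustrated with a minimal, untrained sketch (not the authors' implementation): an implicit neural representation decodes the solution at arbitrary spatial query points conditioned on a small latent code, and that latent code is advanced in time by an ODE. All weights below are random stand-ins, and names like `inr_decode` and `latent_ode_step` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM, HIDDEN = 8, 16

# Random INR weights mapping (x, z) -> u(x); a real model would train these.
W1 = rng.normal(size=(HIDDEN, 1 + LATENT_DIM))
b1 = rng.normal(size=HIDDEN)
W2 = rng.normal(size=(1, HIDDEN))

# Linear stand-in for the learned latent dynamics dz/dt = A z.
A = -0.1 * np.eye(LATENT_DIM)

def inr_decode(x, z):
    """Evaluate the INR at arbitrary spatial location(s) x, conditioned on latent z."""
    x = np.atleast_1d(x)
    inp = np.concatenate([x[:, None], np.tile(z, (len(x), 1))], axis=1)
    h = np.tanh(inp @ W1.T + b1)
    return (h @ W2.T).ravel()

def latent_ode_step(z, dt):
    """One explicit Euler step of the latent ODE dz/dt = A z."""
    return z + dt * (A @ z)

# Roll the latent forward in time, then decode on a grid chosen at test time.
z = rng.normal(size=LATENT_DIM)
for _ in range(10):                        # 10 Euler steps of size 0.1
    z = latent_ode_step(z, 0.1)

query_grid = np.linspace(0.0, 1.0, 7)      # arbitrary resolution, never seen in "training"
u = inr_decode(query_grid, z)
print(u.shape)                             # one decoded value per queried location
```

Because the decoder takes spatial coordinates as input rather than a fixed grid, changing `query_grid` to any other set of points requires no retraining, which is the property the abstract refers to as generalizing to new grids or resolutions.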
Related papers
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Implicit Neural Spatial Representations for Time-dependent PDEs [29.404161110513616]
Implicit Neural Spatial Representation (INSR) has emerged as an effective representation of spatially-dependent vector fields.
This work explores solving time-dependent PDEs with INSR.
arXiv Detail & Related papers (2022-09-30T22:46:40Z) - STONet: A Neural-Operator-Driven Spatio-temporal Network [38.5696882090282]
Graph-based spatio-temporal neural networks are effective at modeling spatial dependencies among discrete points sampled irregularly.
We propose a spatio-temporal framework based on neural operators for PDEs, which learns the mechanisms governing the dynamics of spatially-continuous physical quantities.
Experiments show our model's performance on forecasting spatially-continuous physical quantities, its generalization to unseen spatial points, and its ability to handle temporally-irregular data.
arXiv Detail & Related papers (2022-04-18T17:20:12Z) - Unraveled Multilevel Transformation Networks for Predicting
Sparsely-Observed Spatiotemporal Dynamics [12.627823168264209]
We propose a model that learns to predict unknown dynamics using data from sparsely-distributed data sites.
We demonstrate the advantage of our approach using both synthetic and real-world climate data.
arXiv Detail & Related papers (2022-03-16T14:44:05Z) - Averaging Spatio-temporal Signals using Optimal Transport and Soft
Alignments [110.79706180350507]
We show that our proposed loss can be used to define spatio-temporal barycenters as Fréchet means.
Experiments on handwritten letters and brain imaging data confirm our theoretical findings.
arXiv Detail & Related papers (2022-03-11T09:46:22Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
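The idea of unifying spatial and temporal message passing in one ODE can be sketched with a toy example (an illustrative stand-in, not MTGODE itself): node features evolve under graph diffusion, so each integration step in time is simultaneously a message-passing step over the graph. The three-node path graph below is hypothetical data.

```python
import numpy as np

# Toy adjacency for a 3-node path graph; a real model would infer this structure.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
deg = np.diag(adj.sum(axis=1))
laplacian = deg - adj                      # combinatorial graph Laplacian

def ode_rhs(h):
    """dh/dt = -L h: each node relaxes toward its neighbors (graph diffusion)."""
    return -laplacian @ h

h = np.array([1.0, 0.0, -1.0])             # initial node features
for _ in range(200):                        # explicit Euler integration to t = 2
    h = h + 0.01 * ode_rhs(h)

# Diffusion pulls the nodes toward one another while the Laplacian dynamics
# conserve the graph-wise mean (0 here).
print(np.round(h, 3), round(h.mean(), 6))
```

Replacing the fixed Laplacian with a learned, time-evolving graph operator is what turns this toy diffusion into the kind of neural ODE the summary describes.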
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
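A distribution over ODEs can be made concrete with a hedged toy example (not the paper's model): the dynamics dx/dt = -k x have their rate k drawn from a Gaussian standing in for a learned posterior, and integrating many samples yields a predictive mean and spread.

```python
import numpy as np

rng = np.random.default_rng(1)

def integrate(k, x0=1.0, dt=0.01, steps=100):
    """Explicit Euler for dx/dt = -k x over t in [0, 1]."""
    x = x0
    for _ in range(steps):
        x = x + dt * (-k * x)
    return x

# A Gaussian over the rate k plays the role of a distribution over ODEs.
ks = rng.normal(loc=1.0, scale=0.2, size=500)
xs = np.array([integrate(k) for k in ks])

# The predictive mean should sit near exp(-1) ~ 0.37 for k centered at 1,
# and the spread reflects uncertainty about the dynamics themselves.
print(round(xs.mean(), 2), round(xs.std(), 3))
```

In the actual model the distribution is over neural-network ODE parameters conditioned on observed data points, but the mechanism of propagating parameter uncertainty through integration is the same.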
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
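The "observed dynamics modeled as a linear system" idea can be sketched in a minimal unforced variant (an assumption-laden illustration, not the paper's method): delay-embed a near-periodic signal, fit a linear one-step map by least squares (DMD-style), and check that it reproduces the dynamics. The sinusoidal "load" series is synthetic, standing in for real grid data.

```python
import numpy as np

t = np.arange(0, 200)
load = np.sin(2 * np.pi * t / 24)          # synthetic near-periodic daily signal

# Delay-embed the scalar series so linear dynamics can capture the oscillation.
DIM = 4
X = np.stack([load[i:i + DIM] for i in range(len(load) - DIM)])

# Fit the linear one-step map X[t+1] ~ X[t] @ A by least squares.
A, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)

# A pure sinusoid satisfies an exact 2-term linear recurrence, so the fitted
# linear map reproduces the one-step dynamics to numerical precision.
pred = X[:-1] @ A
err = np.abs(pred - X[1:]).max()
print(err < 1e-6)
```

The intrinsic linearity of the fitted map is what gives this family of methods the interpretability and parsimony the summary mentions; the paper additionally handles forcing and ensembles, which this sketch omits.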
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Learning continuous-time PDEs from sparse data with graph neural
networks [10.259254824702555]
We propose a continuous-time differential model for dynamical systems whose governing equations are parameterized by message passing graph neural networks.
We demonstrate the model's ability to work with unstructured grids, arbitrary time steps, and noisy observations.
We compare our method with existing approaches on several well-known physical systems that involve first and higher-order PDEs with state-of-the-art predictive performance.
arXiv Detail & Related papers (2020-06-16T07:15:40Z) - A Spatial-Temporal Attentive Network with Spatial Continuity for
Trajectory Prediction [74.00750936752418]
We propose a novel model named spatial-temporal attentive network with spatial continuity (STAN-SC)
First, spatial-temporal attention mechanism is presented to explore the most useful and important information.
Second, we conduct a joint feature sequence based on the sequence and instant state information to make the generative trajectories keep spatial continuity.
arXiv Detail & Related papers (2020-03-13T04:35:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.