TaylorPDENet: Learning PDEs from non-grid Data
- URL: http://arxiv.org/abs/2306.14511v1
- Date: Mon, 26 Jun 2023 08:40:24 GMT
- Title: TaylorPDENet: Learning PDEs from non-grid Data
- Authors: Paul Heinisch, Andrzej Dulny, Anna Krause, Andreas Hotho
- Abstract summary: TaylorPDENet is a novel machine learning method designed to overcome the challenge of learning dynamics from observations at arbitrary, non-grid locations.
Our algorithm uses the multidimensional Taylor expansion of the dynamical system at each observation point to estimate the spatial derivatives needed to perform predictions.
We evaluate our model on a variety of advection-diffusion equations with different parameters and show that it performs similarly to equivalent approaches on grid-structured data.
- Score: 2.0550543745258283
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modeling data obtained from dynamical systems has gained attention in recent
years as a challenging task for machine learning models. Previous approaches
assume the measurements to be distributed on a grid. However, for real-world
applications like weather prediction, the observations are taken from arbitrary
locations within the spatial domain. In this paper, we propose TaylorPDENet - a
novel machine learning method that is designed to overcome this challenge. Our
algorithm uses the multidimensional Taylor expansion of a dynamical system at
each observation point to estimate the spatial derivatives needed to perform
predictions. TaylorPDENet is able to accomplish two objectives simultaneously:
accurately forecast the evolution of a complex dynamical system and explicitly
reconstruct the underlying differential equation describing the system. We
evaluate our model on a variety of advection-diffusion equations with different
parameters and show that it performs similarly to equivalent approaches on
grid-structured data while being able to process unstructured data as well.
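The abstract does not spell out the numerical details, but the core mechanism it describes (fitting a multidimensional Taylor expansion around each scattered observation point to recover spatial derivatives) can be sketched as follows. This is a minimal illustration under our own assumptions; the function name, the neighbour count k, and the expansion order are hypothetical choices, not taken from the paper.

import numpy as np
from math import factorial

def estimate_derivatives(points, u, center, k=12, order=2):
    """Least-squares estimate of the spatial derivatives of u at one scattered
    observation point, using a 2D Taylor expansion over its k nearest
    neighbours.  Returns a dict mapping the multi-index (i, j) to an estimate
    of d^(i+j) u / dx^i dy^j at points[center]."""
    offsets = points - points[center]                       # (N, 2) displacements
    nbrs = np.argsort(np.linalg.norm(offsets, axis=1))[1:k + 1]  # skip the point itself
    dx, du = offsets[nbrs], u[nbrs] - u[center]
    # multi-indices (i, j) with 1 <= i + j <= order
    alphas = [(i, j) for s in range(1, order + 1)
              for i in range(s + 1) for j in range(s + 1) if i + j == s]
    # Taylor design matrix: one column per derivative, one row per neighbour
    A = np.stack([dx[:, 0] ** i * dx[:, 1] ** j / (factorial(i) * factorial(j))
                  for i, j in alphas], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, du, rcond=None)
    return dict(zip(alphas, coeffs))

Given such derivative estimates at every observation point, the coefficients of an advection-diffusion equation of the form u_t = a u_x + b u_y + c u_xx + d u_yy could then be fit by regressing observed time differences onto the estimated derivatives, which is one way to pursue the two stated objectives (forecasting and reconstructing the equation) together; whether the paper uses exactly this procedure is not stated in the abstract.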
Related papers
- Geometric Operator Learning with Optimal Transport [77.16909146519227]
We propose integrating optimal transport (OT) into operator learning for partial differential equations (PDEs) on complex geometries.
For 3D simulations focused on surfaces, our OT-based neural operator embeds the surface geometry into a 2D parameterized latent space.
Experiments with Reynolds-averaged Navier-Stokes equations (RANS) on the ShapeNet-Car and DrivAerNet-Car datasets show that our method achieves better accuracy and also reduces computational expenses.
arXiv Detail & Related papers (2025-07-26T21:28:25Z) - HyperSINDy: Deep Generative Modeling of Nonlinear Stochastic Governing
Equations [5.279268784803583]
We introduce HyperSINDy, a framework for modeling dynamics via a deep generative model of sparse governing equations from data.
Once trained, HyperSINDy generates dynamics via a differential equation whose coefficients are driven by white noise.
In experiments, HyperSINDy recovers ground truth governing equations, with learned stochasticity scaling to match that of the data.
arXiv Detail & Related papers (2023-10-07T14:41:59Z) - Score-based Data Assimilation [7.215767098253208]
We introduce score-based data assimilation for trajectory inference.
We learn a score-based generative model of state trajectories, based on the key insight that the score of an arbitrarily long trajectory can be decomposed into a series of scores over short segments (a minimal illustration of this factorization is sketched after this list).
arXiv Detail & Related papers (2023-06-18T14:22:03Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Modeling Systems with Machine Learning based Differential Equations [0.0]
We propose the design of time-continuous models of dynamical systems as solutions of differential equations.
Our results suggest that this approach can be a useful technique in the case of synthetic or experimental data.
arXiv Detail & Related papers (2021-09-09T19:10:46Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z) - A Deep Learning Approach for Predicting Spatiotemporal Dynamics From
Sparsely Observed Data [10.217447098102165]
We consider the problem of learning prediction models for physical processes driven by unknown partial differential equations (PDEs).
We propose a deep learning framework that learns the underlying dynamics and predicts its evolution using sparsely distributed data sites.
arXiv Detail & Related papers (2020-11-30T16:38:00Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Learning continuous-time PDEs from sparse data with graph neural
networks [10.259254824702555]
We propose a continuous-time differential model for dynamical systems whose governing equations are parameterized by message passing graph neural networks.
We demonstrate the model's ability to work with unstructured grids, arbitrary time steps, and noisy observations.
We compare our method with existing approaches on several well-known physical systems involving first- and higher-order PDEs, achieving state-of-the-art predictive performance.
arXiv Detail & Related papers (2020-06-16T07:15:40Z)
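As referenced in the Score-based Data Assimilation entry above, the segment-wise score decomposition can be illustrated with a short derivation. This is our own minimal sketch, assuming a first-order Markov trajectory; the paper's exact blanket construction may differ in detail.

\log p(x_{1:L}) = \log p(x_1) + \sum_{i=1}^{L-1} \log p(x_{i+1} \mid x_i)
\;\;\Longrightarrow\;\;
\nabla_{x_i} \log p(x_{1:L}) = \nabla_{x_i}\!\left[\log p(x_i \mid x_{i-1}) + \log p(x_{i+1} \mid x_i)\right] \quad (1 < i < L),

so a score model trained only on short trajectory segments suffices to assemble the score of an arbitrarily long trajectory.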
This list is automatically generated from the titles and abstracts of the papers on this site.