A Spectral Approach for Learning Spatiotemporal Neural Differential
Equations
- URL: http://arxiv.org/abs/2309.16131v1
- Date: Thu, 28 Sep 2023 03:22:49 GMT
- Title: A Spectral Approach for Learning Spatiotemporal Neural Differential
Equations
- Authors: Mingtao Xia, Xiangting Li, Qijing Shen, Tom Chou
- Abstract summary: We propose a neural-ODE based method that uses spectral expansions in space to learn spatiotemporal differential equations, including those defined on unbounded spatial domains.
By developing a spectral framework for learning both PDEs and integro-differential equations, we extend machine learning methods to unbounded DEs and a larger class of problems.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Rapidly developing machine learning methods have stimulated research interest
in computationally reconstructing differential equations (DEs) from
observational data, which may provide additional insight into underlying
causative mechanisms. In this paper, we propose a novel neural-ODE based method
that uses spectral expansions in space to learn spatiotemporal DEs. The major
advantage of our spectral neural DE learning approach is that it does not rely
on spatial discretization, thus allowing the target spatiotemporal equations to
contain long range, nonlocal spatial interactions that act on unbounded spatial
domains. Our spectral approach is shown to be as accurate as some of the latest
machine learning approaches for learning PDEs operating on bounded domains. By
developing a spectral framework for learning both PDEs and integro-differential
equations, we extend machine learning methods to apply to unbounded DEs and a
larger class of problems.
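To make the coefficient-space formulation concrete, the following is a minimal sketch (not the authors' implementation) of the idea behind spectral neural-DE learning: the field is expanded as u(x,t) ≈ Σ_k c_k(t) φ_k(x) in a fixed spectral basis, and a small network is trained to predict dc/dt from c so that rolled-out coefficient trajectories match the coefficients projected from observed snapshots. The names (SpectralRHS, rollout, train_on_coefficients), the fixed-step RK4 integrator, and the uniform observation spacing are illustrative assumptions.

```python
# Minimal sketch of learning spatiotemporal dynamics in spectral-coefficient space.
# Not the paper's implementation; names and hyperparameters are illustrative.
import torch
import torch.nn as nn

class SpectralRHS(nn.Module):
    """Network F_theta approximating dc/dt for the vector c(t) of spectral coefficients."""
    def __init__(self, n_modes: int, width: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_modes, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, n_modes),
        )

    def forward(self, c: torch.Tensor) -> torch.Tensor:
        return self.net(c)

def rk4_step(f, c, dt):
    # One classical Runge-Kutta step; any differentiable ODE integrator would do.
    k1 = f(c)
    k2 = f(c + 0.5 * dt * k1)
    k3 = f(c + 0.5 * dt * k2)
    k4 = f(c + dt * k3)
    return c + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def rollout(f, c0, dt, n_steps):
    # Integrate the learned coefficient ODE forward from the initial coefficients.
    traj = [c0]
    for _ in range(n_steps):
        traj.append(rk4_step(f, traj[-1], dt))
    return torch.stack(traj)  # shape: (n_steps + 1, n_modes)

def train_on_coefficients(c_obs, dt, epochs=2000, lr=1e-3):
    # c_obs: (n_times, n_modes) coefficients obtained by projecting observed
    # snapshots u(x, t_j) onto the chosen spectral basis (e.g. Hermite functions
    # for unbounded domains).
    model = SpectralRHS(c_obs.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        pred = rollout(model, c_obs[0], dt, c_obs.shape[0] - 1)
        loss = ((pred - c_obs) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```

Because the state is a finite vector of basis coefficients rather than values on a spatial mesh, this formulation avoids spatial discretization and can accommodate nonlocal terms acting on unbounded domains.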
Related papers
- Neural Spectral Methods: Self-supervised learning in the spectral domain [0.0]
We present Neural Spectral Methods, a technique to solve parametric Partial Differential Equations (PDEs).
Our method uses orthogonal bases to learn PDE solutions as mappings between spectral coefficients.
Our experimental results demonstrate that our method significantly outperforms previous machine learning approaches in terms of speed and accuracy.
arXiv Detail & Related papers (2023-12-08T18:20:43Z) - Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
arXiv Detail & Related papers (2023-08-02T12:04:28Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistently state-of-the-art performance and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the time of supervision and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Unsupervised Legendre-Galerkin Neural Network for Stiff Partial
Differential Equations [9.659504024299896]
We propose an unsupervised machine learning algorithm based on the Legendre-Galerkin neural network to find an accurate approximation to the solution of different types of PDEs.
The proposed neural network is applied to general 1D and 2D PDEs as well as singularly perturbed PDEs that exhibit boundary layer behavior.
arXiv Detail & Related papers (2022-07-21T00:47:47Z) - Neural Laplace: Learning diverse classes of differential equations in
the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - Investigation of Nonlinear Model Order Reduction of the Quasigeostrophic
Equations through a Physics-Informed Convolutional Autoencoder [0.0]
Reduced order modeling (ROM) approximates complex physics-based models of real-world processes by inexpensive surrogates.
In this paper we explore the construction of ROM using autoencoders (AE) that perform nonlinear projections of the system dynamics onto a low dimensional manifold.
Our investigation using the quasi-geostrophic equations reveals that while the PI cost function helps with spatial reconstruction, spatial features are less powerful than spectral features.
arXiv Detail & Related papers (2021-08-27T15:20:01Z) - Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z) - PDE-Driven Spatiotemporal Disentanglement [0.0]
A recent line of work in the machine learning community addresses the problem of predicting high-dimensional phenomena by leveraging tools from differential equation theory.
We propose a novel and general paradigm for this task based on a method for partial differential equations: separation of variables.
We experimentally demonstrate the performance and broad applicability of our method against prior state-of-the-art models on physical and synthetic video datasets.
arXiv Detail & Related papers (2020-08-04T06:10:30Z) - Learning continuous-time PDEs from sparse data with graph neural
networks [10.259254824702555]
We propose a continuous-time differential model for dynamical systems whose governing equations are parameterized by message passing graph neural networks.
We demonstrate the model's ability to work with unstructured grids, arbitrary time steps, and noisy observations.
We compare our method with existing approaches on several well-known physical systems involving first and higher-order PDEs, achieving state-of-the-art predictive performance.
arXiv Detail & Related papers (2020-06-16T07:15:40Z)