Operator Learning for Continuous Spatial-Temporal Model with
Gradient-Based and Derivative-Free Optimization Methods
- URL: http://arxiv.org/abs/2311.11798v2
- Date: Fri, 12 Jan 2024 23:49:45 GMT
- Title: Operator Learning for Continuous Spatial-Temporal Model with
Gradient-Based and Derivative-Free Optimization Methods
- Authors: Chuanqi Chen, Jin-Long Wu
- Abstract summary: We build on the recent progress of operator learning and present a data-driven modeling framework that is continuous in both space and time.
A key feature of the proposed model is the resolution-invariance with respect to both spatial and temporal discretizations.
We show that the proposed model can better predict long-term statistics via the hybrid optimization scheme.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Partial differential equations are often used in the spatial-temporal
modeling of complex dynamical systems in many engineering applications. In this
work, we build on the recent progress of operator learning and present a
data-driven modeling framework that is continuous in both space and time. A key
feature of the proposed model is the resolution-invariance with respect to both
spatial and temporal discretizations, without demanding abundant training data
in different temporal resolutions. To improve the long-term performance of the
calibrated model, we further propose a hybrid optimization scheme that
leverages both gradient-based and derivative-free optimization methods and
efficiently trains on both short-term time series and long-term statistics. We
investigate the performance of the spatial-temporal continuous learning
framework with three numerical examples, including the viscous Burgers'
equation, the Navier-Stokes equations, and the Kuramoto-Sivashinsky equation.
The results confirm the resolution-invariance of the proposed modeling
framework and also demonstrate stable long-term simulations with only
short-term time series data. In addition, we show that the proposed model can
better predict long-term statistics via the hybrid optimization scheme with a
combined use of short-term and long-term data.
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Tensor-Valued Time and Inference Path Optimization in Differential Equation-Based Generative Modeling [16.874769609089764]
This work introduces, for the first time, a tensor-valued time that expands the conventional scalar-valued time into multiple dimensions.
We also propose a novel path optimization problem designed to adaptively determine multidimensional inference trajectories.
arXiv Detail & Related papers (2024-04-22T13:20:01Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Learning Space-Time Continuous Neural PDEs from Partially Observed
States [13.01244901400942]
We introduce a grid-independent model learning partial differential equations (PDEs) from noisy and partial observations on irregular grids.
We propose a space-time continuous latent neural PDE model with an efficient probabilistic framework and a novel encoder design for improved data efficiency and grid independence.
arXiv Detail & Related papers (2023-07-09T06:53:59Z) - Learning PDE Solution Operator for Continuous Modeling of Time-Series [1.39661494747879]
This work presents a partial differential equation (PDE) based framework which improves the dynamics modeling capability.
We propose a neural operator that can handle time continuously without requiring iterative operations or specific grids of temporal discretization.
Our framework opens up a new way for a continuous representation of neural networks that can be readily adopted for real-world applications.
arXiv Detail & Related papers (2023-02-02T03:47:52Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points, and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Time Series Forecasting Using Manifold Learning [6.316185724124034]
We present a three-tier numerical framework based on manifold learning for the forecasting of high-dimensional time series.
At the first step, we embed the time series into a reduced low-dimensional space using a nonlinear manifold learning algorithm.
At the second step, we construct reduced-order regression models on the manifold to forecast the embedded dynamics.
At the final step, we lift the embedded time series back to the original high-dimensional space.
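The three steps above can be sketched on synthetic data. This is a minimal stand-in, not the paper's method: PCA (via SVD) replaces the nonlinear manifold learning algorithm, and a one-step linear map replaces the reduced-order regression model; the oscillator data and all dimensions are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# A high-dimensional series driven by one 2-D oscillator plus small noise.
t = np.linspace(0.0, 20.0, 400)
latent = np.stack([np.sin(t), np.cos(t)], axis=1)         # (400, 2)
mix = rng.standard_normal((2, 50))
X = latent @ mix + 0.01 * rng.standard_normal((400, 50))  # (400, 50)

# Step 1: embed into a reduced low-dimensional space (PCA as a linear
# stand-in for a nonlinear manifold learning algorithm).
U, S, Vt = np.linalg.svd(X, full_matrices=False)
Z = X @ Vt[:2].T                                          # embedded series

# Step 2: reduced-order regression on the manifold:
# fit a one-step linear map Z[k+1] ~= Z[k] @ A.
A, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)

# Step 3: forecast the embedded dynamics, then lift the predictions
# back to the original high-dimensional space.
steps = 20
z, preds = Z[-1], []
for _ in range(steps):
    z = z @ A
    preds.append(z)
X_pred = np.array(preds) @ Vt[:2]                         # (20, 50)

# Noise-free continuation of the true signal, for comparison.
dt = t[1] - t[0]
t_future = t[-1] + dt * np.arange(1, steps + 1)
X_true = np.stack([np.sin(t_future), np.cos(t_future)], axis=1) @ mix
```

Because projection and lifting share the same basis `Vt[:2]`, forecasting happens entirely in the cheap 2-D space while predictions are still delivered in the original 50-D space.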
arXiv Detail & Related papers (2021-10-07T17:09:59Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Hierarchical Deep Learning of Multiscale Differential Equation
Time-Steppers [5.6385744392820465]
We develop a hierarchy of deep neural network time-steppers to approximate the flow map of the dynamical system over a disparate range of time-scales.
The resulting model is purely data-driven and leverages features of the multiscale dynamics.
We benchmark our algorithm against state-of-the-art methods, such as LSTM, reservoir computing, and clockwork RNN.
arXiv Detail & Related papers (2020-08-22T07:16:53Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.