Learning Semilinear Neural Operators: A Unified Recursive Framework For Prediction And Data Assimilation
- URL: http://arxiv.org/abs/2402.15656v2
- Date: Fri, 15 Mar 2024 06:47:48 GMT
- Title: Learning Semilinear Neural Operators: A Unified Recursive Framework For Prediction And Data Assimilation
- Authors: Ashutosh Singh, Ricardo Augusto Borsoi, Deniz Erdogmus, Tales Imbiriba
- Abstract summary: We propose a learning-based state-space approach to compute solution operators to infinite-dimensional semilinear PDEs.
We develop a flexible method that allows for both prediction and data assimilation by combining prediction and correction operations.
We show through experiments on Kuramoto-Sivashinsky, Navier-Stokes and Korteweg-de Vries equations that the proposed model is robust to noise and can leverage arbitrary amounts of measurements to correct its prediction over a long time horizon with little computational overhead.
- Score: 21.206744437644982
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in the theory of Neural Operators (NOs) have enabled fast and accurate computation of the solutions to complex systems described by partial differential equations (PDEs). Despite their great success, current NO-based solutions face important challenges when dealing with spatio-temporal PDEs over long time scales. Specifically, the current theory of NOs does not present a systematic framework to perform data assimilation and efficiently correct the evolution of PDE solutions over time based on sparsely sampled noisy measurements. In this paper, we propose a learning-based state-space approach to compute the solution operators to infinite-dimensional semilinear PDEs. Exploiting the structure of semilinear PDEs and the theory of nonlinear observers in function spaces, we develop a flexible recursive method that allows for both prediction and data assimilation by combining prediction and correction operations. The proposed framework is capable of producing fast and accurate predictions over long time horizons, dealing with irregularly sampled noisy measurements to correct the solution, and benefits from the decoupling between the spatial and temporal dynamics of this class of PDEs. We show through experiments on the Kuramoto-Sivashinsky, Navier-Stokes and Korteweg-de Vries equations that the proposed model is robust to noise and can leverage arbitrary amounts of measurements to correct its prediction over a long time horizon with little computational overhead.
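To make the recursion concrete, the sketch below alternates a learned one-step prediction with a learned correction driven by the innovation between sparse, noisy measurements and the current estimate. It is a minimal illustration of the predict-correct pattern described in the abstract, not the authors' implementation: the convolutional stand-ins, tensor shapes, and the measurement mask are all assumptions.

```python
import torch
import torch.nn as nn

class RecursivePredictCorrect(nn.Module):
    """Minimal predict-correct rollout in the spirit of the paper.

    The state is a discretized 1D field u of shape (batch, 1, nx). `predictor`
    is a stand-in for the learned one-step evolution operator; `corrector` is a
    stand-in for the learned observer-style gain applied to the innovation
    y - u on the observed points. Both are illustrative, not the paper's
    architecture.
    """

    def __init__(self, width=64):
        super().__init__()
        self.predictor = nn.Sequential(
            nn.Conv1d(1, width, 5, padding=2), nn.GELU(),
            nn.Conv1d(width, 1, 5, padding=2),
        )
        self.corrector = nn.Sequential(
            nn.Conv1d(1, width, 5, padding=2), nn.GELU(),
            nn.Conv1d(width, 1, 5, padding=2),
        )

    def forward(self, u0, n_steps, measurements=None):
        """measurements: dict {step: (mask, y)} with mask = 1 on observed points."""
        u, trajectory = u0, [u0]
        for t in range(1, n_steps + 1):
            u = u + self.predictor(u)                   # prediction step
            if measurements and t in measurements:      # correction only when data arrive
                mask, y = measurements[t]
                u = u + self.corrector(mask * (y - u))  # innovation-driven correction
            trajectory.append(u)
        return torch.stack(trajectory, dim=1)           # (batch, n_steps + 1, 1, nx)

# Roll out 50 steps with a single sparse, noisy observation at step 25.
u0 = torch.randn(2, 1, 128)
mask = (torch.rand(2, 1, 128) < 0.2).float()
y = mask * torch.randn(2, 1, 128)
traj = RecursivePredictCorrect()(u0, n_steps=50, measurements={25: (mask, y)})
print(traj.shape)  # torch.Size([2, 51, 1, 128])
```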
Related papers
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
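For background on the probabilistic representation used above (macroscopic fields as ensembles of random particles), here is a minimal Feynman-Kac Monte Carlo estimate for a pure diffusion equation. It illustrates the particle representation only and is not the paper's solver or training procedure; the diffusivity, particle count, and test problem are assumptions.

```python
import numpy as np

def heat_mc(u0, x, t, diffusivity=0.1, n_particles=20_000, seed=0):
    """Monte Carlo estimate of u(x, t) for u_t = D * u_xx via the Feynman-Kac
    representation u(x, t) = E[u0(x + W_{2 D t})], with W a standard Brownian
    motion. Only illustrates the particle view of a macroscopic field."""
    rng = np.random.default_rng(seed)
    jumps = rng.normal(scale=np.sqrt(2.0 * diffusivity * t),
                       size=(n_particles, x.size))
    return u0(x[None, :] + jumps).mean(axis=0)

x = np.linspace(-np.pi, np.pi, 101)
u_mc = heat_mc(np.cos, x, t=0.5)              # initial condition u(x, 0) = cos(x)
u_exact = np.exp(-0.1 * 0.5) * np.cos(x)      # exact solution exp(-D t) cos(x)
print(float(np.max(np.abs(u_mc - u_exact))))  # small Monte Carlo error
```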
- Self-Consistent Velocity Matching of Probability Flows [22.2542921090435]
We present a discretization-free, scalable framework for solving a class of partial differential equations (PDEs).
The main observation is that the time-varying velocity field of the PDE solution needs to be self-consistent.
We use an iterative formulation with a biased gradient estimator that bypasses significant computational obstacles while delivering strong empirical performance.
arXiv Detail & Related papers (2023-01-31T16:17:18Z)
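The self-consistency observation above can be stated compactly. The notation below is ours, chosen for illustration, and may differ from the paper's:

```latex
% Self-consistency of the probability flow: the learned velocity field v_\theta
% must transport the very density p_t that it induces (notation assumed here).
\partial_t p_t(x) + \nabla \cdot \bigl( p_t(x)\, v_\theta(x, t) \bigr) = 0,
\qquad p_t = \mathrm{Law}(X_t), \quad \dot{X}_t = v_\theta(X_t, t), \quad X_0 \sim p_0 .
```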
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM), an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistently state-of-the-art results, with an average relative gain of 11.5% across seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
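As a schematic analogue of solving in a compact spectral latent space (not LSM's actual neural spectral block; the real-FFT basis, truncation, and sizes are assumptions), a latent field can be mixed mode-by-mode in a truncated basis:

```python
import torch
import torch.nn as nn

class NeuralSpectralBlock1d(nn.Module):
    """Toy spectral block: act on the low modes of a latent 1D field."""

    def __init__(self, channels=32, n_modes=16):
        super().__init__()
        self.n_modes = n_modes
        # One learnable complex weight per channel and retained mode.
        self.weight = nn.Parameter(torch.randn(channels, n_modes, dtype=torch.cfloat) * 0.02)

    def forward(self, z):                                    # z: (batch, channels, nx)
        z_hat = torch.fft.rfft(z, dim=-1)                    # to spectral space
        z_hat = z_hat[..., : self.n_modes] * self.weight     # weight the low modes only
        return torch.fft.irfft(z_hat, n=z.shape[-1], dim=-1) # back to physical space

z = torch.randn(4, 32, 128)
print(NeuralSpectralBlock1d()(z).shape)  # torch.Size([4, 32, 128])
```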
- MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolution to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
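To illustrate the "query the solution at an arbitrary spatial position" idea, a conditioning code plus a coordinate query can be decoded into a field value. The plain MLP and all sizes below are stand-ins, not MAgNet's mesh-agnostic architecture:

```python
import torch
import torch.nn as nn

class QueryDecoder(nn.Module):
    """Map (conditioning code, spatial query) -> field value at that position."""

    def __init__(self, code_dim=64, hidden=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(code_dim + 1, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, code, x_query):  # code: (batch, code_dim); x_query: (batch, nq, 1)
        code = code[:, None, :].expand(-1, x_query.shape[1], -1)
        return self.mlp(torch.cat([code, x_query], dim=-1))  # (batch, nq, 1)

code = torch.randn(2, 64)                  # e.g., produced by an encoder over mesh nodes
x_query = torch.rand(2, 200, 1)            # arbitrary off-grid positions
print(QueryDecoder()(code, x_query).shape) # torch.Size([2, 200, 1])
```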
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to inference for temporal point processes using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach leads to improved estimation of pattern latency compared to the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
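For reference, the intensity induced by a finite-support parametric kernel on a regular time grid can be evaluated in a few lines. The triangular kernel and parameter names below are illustrative; this is not FaDIn's estimator or its fast inference scheme.

```python
import numpy as np

def discretized_intensity(event_times, t_grid, mu, kernel_vals, dt):
    """Hawkes intensity on a regular grid, with a finite-support kernel given by
    its values on that grid (the discretization that enables fast inference)."""
    lam = np.full(t_grid.size, mu, dtype=float)
    for t_i in event_times:
        start = int(round((t_i - t_grid[0]) / dt)) + 1   # kernel starts just after the event
        stop = min(start + kernel_vals.size, t_grid.size)
        if start < t_grid.size:
            lam[start:stop] += kernel_vals[: stop - start]
    return lam

dt = 0.01
t_grid = np.arange(0.0, 10.0, dt)
support = np.arange(0.0, 1.0, dt)               # kernel supported on [0, 1)
kernel_vals = np.maximum(0.0, 1.0 - support)    # toy triangular excitation kernel
lam = discretized_intensity([1.0, 1.2, 5.0], t_grid, mu=0.2, kernel_vals=kernel_vals, dt=dt)
print(lam[round(1.5 / dt)])                     # intensity shortly after the first two events
```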
- Continuous PDE Dynamics Forecasting with Implicit Neural Representations [24.460010868042758]
We introduce a new data-driven approach that models PDE flows as continuous-time dynamics of spatially continuous functions.
This is achieved by embedding spatial observations independently of their discretization via Implicit Neural Representations.
It extrapolates at arbitrary spatial and temporal locations; it can learn from sparse grids or irregular data at test time; and it generalizes to new grids or resolutions.
arXiv Detail & Related papers (2022-09-29T15:17:50Z)
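A minimal example of embedding observations independently of their discretization: fit a small coordinate MLP (an implicit neural representation) to irregularly sampled points, then query it at arbitrary locations and resolutions. This shows the generic INR mechanism only, not the paper's model, which also learns continuous-time dynamics.

```python
import torch
import torch.nn as nn

# Irregularly sampled, noisy observations of a 1D field (toy data for the sketch).
torch.manual_seed(0)
x_obs = torch.rand(64, 1) * 2 * torch.pi
y_obs = torch.sin(x_obs) + 0.05 * torch.randn_like(x_obs)

# Coordinate MLP: position -> field value (an implicit neural representation).
inr = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(inr.parameters(), lr=1e-2)
for _ in range(500):                           # fit to the scattered points only
    opt.zero_grad()
    loss = ((inr(x_obs) - y_obs) ** 2).mean()
    loss.backward()
    opt.step()

# The fitted representation can be queried at arbitrary locations and resolutions.
x_query = torch.linspace(0, 2 * torch.pi, 1000).unsqueeze(-1)
print(inr(x_query).shape)  # torch.Size([1000, 1])
```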
- Deep Convolutional Architectures for Extrapolative Forecast in Time-dependent Flow Problems [0.0]
Deep learning techniques are employed to model the system dynamics for advection-dominated problems.
These models take as input a sequence of high-fidelity vector solutions for consecutive time-steps obtained from the PDEs.
Non-intrusive reduced-order modelling techniques such as deep auto-encoder networks are utilized to compress the high-fidelity snapshots.
arXiv Detail & Related papers (2022-09-18T03:45:56Z)
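The compress-then-extrapolate pipeline described above can be sketched as follows. The autoencoder, the LSTM sequence model, and all sizes are illustrative stand-ins, not the non-intrusive reduced-order models used in the paper.

```python
import torch
import torch.nn as nn

# Stand-in autoencoder: compress high-fidelity snapshots (length-256 fields) to 8 latents.
encoder = nn.Sequential(nn.Linear(256, 64), nn.GELU(), nn.Linear(64, 8))
decoder = nn.Sequential(nn.Linear(8, 64), nn.GELU(), nn.Linear(64, 256))

# Stand-in sequence model: from a window of past latents, predict the next latent.
lstm = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)
head = nn.Linear(32, 8)

snapshots = torch.randn(1, 10, 256)       # 10 consecutive high-fidelity snapshots
window = encoder(snapshots)               # (1, 10, 8) latent trajectory

# Extrapolate 5 steps beyond the input window, entirely in the compressed space.
for _ in range(5):
    out, _ = lstm(window)
    z_next = head(out[:, -1:, :])         # next latent state
    window = torch.cat([window, z_next], dim=1)

forecast = decoder(window[:, 10:, :])     # decode only the extrapolated steps
print(forecast.shape)                     # torch.Size([1, 5, 256])
```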
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points, and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
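To illustrate the hyper-network ingredient only (the combination with a Fourier Neural Operator is not reproduced here, and all names and sizes are assumptions), a small network can generate the weights of a solver network conditioned on the query time:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

IN, HID = 64, 32                     # solver input size and hidden width (assumed)

# Hyper-network: maps the query time t to the weights of a tiny solver MLP.
n_params = IN * HID + HID + HID * IN + IN
hyper = nn.Sequential(nn.Linear(1, 128), nn.GELU(), nn.Linear(128, n_params))

def solve_at_time(u0, t):
    """Apply the solver MLP whose weights are generated for the requested time t."""
    p = hyper(t.view(1, 1)).squeeze(0)
    w1, b1, w2, b2 = torch.split(p, [IN * HID, HID, HID * IN, IN])
    h = F.gelu(F.linear(u0, w1.view(HID, IN), b1))
    return F.linear(h, w2.view(IN, HID), b2)   # predicted state at time t

u0 = torch.randn(4, IN)                            # batch of initial conditions
print(solve_at_time(u0, torch.tensor(0.3)).shape)  # torch.Size([4, 64])
```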
- Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z)
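The encode / evolve-in-latent-space / decode pattern behind LE-PDE can be sketched briefly; the modules, dimensions, and residual update below are assumptions, not the paper's architecture or learning objectives.

```python
import torch
import torch.nn as nn

# Stand-in modules: compress the full state, step the dynamics in latent space, decode.
encode = nn.Sequential(nn.Linear(1024, 128), nn.GELU(), nn.Linear(128, 16))
evolve = nn.Sequential(nn.Linear(16, 64), nn.GELU(), nn.Linear(64, 16))
decode = nn.Sequential(nn.Linear(16, 128), nn.GELU(), nn.Linear(128, 1024))

u0 = torch.randn(8, 1024)    # batch of discretized initial states
z = encode(u0)               # 1024 -> 16: the dimension-reduction idea behind the reported savings
for _ in range(100):         # cheap long rollout entirely in latent space
    z = z + evolve(z)        # residual latent update
u_T = decode(z)              # decode only when the full field is needed
print(u_T.shape)             # torch.Size([8, 1024])
```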
- Long-time integration of parametric evolution equations with physics-informed DeepONets [0.0]
We introduce an effective framework for learning infinite-dimensional operators that map random initial conditions to associated PDE solutions within a short time interval.
Global long-time predictions across a range of initial conditions can be then obtained by iteratively evaluating the trained model.
This introduces a new approach to temporal domain decomposition that is shown to be effective in performing accurate long-time simulations.
arXiv Detail & Related papers (2021-06-09T20:46:17Z)
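The long-time strategy described above (learn a short-horizon solution operator, then compose it with itself) reduces to a simple rollout loop. The operator below is an untrained stand-in rather than a physics-informed DeepONet, included only to show the composition.

```python
import torch
import torch.nn as nn

# Stand-in for a trained short-horizon operator G: u(t) -> u(t + dt) on a 128-point grid.
short_horizon_op = nn.Sequential(nn.Linear(128, 256), nn.GELU(), nn.Linear(256, 128))

def long_time_rollout(u0, n_intervals):
    """Compose the short-horizon operator with itself: the end state of each interval
    becomes the initial condition of the next (temporal domain decomposition)."""
    states = [u0]
    for _ in range(n_intervals):
        states.append(short_horizon_op(states[-1]))
    return torch.stack(states, dim=1)

u0 = torch.randn(16, 128)                      # batch of random initial conditions
traj = long_time_rollout(u0, n_intervals=200)  # long horizon from short-time training
print(traj.shape)                              # torch.Size([16, 201, 128])
```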
- STENCIL-NET: Data-driven solution-adaptive discretization of partial differential equations [2.362412515574206]
We present STENCIL-NET, an artificial neural network architecture for data-driven learning of problem- and resolution-specific local discretizations of nonlinear PDEs.
Knowing the actual PDE is not necessary, as solution data is sufficient to train the network to learn the discrete operators.
A once-trained STENCIL-NET model can be used to predict solutions of the PDE on larger domains and for longer times than it was trained for.
arXiv Detail & Related papers (2021-01-15T15:43:41Z)
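The core mechanism, a learned local discretization (stencil) that maps a solution neighborhood to its time derivative followed by an explicit time step, can be sketched as follows; the convolutional net, stencil width, and Euler integrator are assumptions rather than STENCIL-NET's exact design.

```python
import torch
import torch.nn as nn

class LearnedStencil(nn.Module):
    """Map each local neighborhood of u (stencil of width 7) to du/dt at that point."""

    def __init__(self, width=32, stencil=7):
        super().__init__()
        pad = stencil // 2
        self.net = nn.Sequential(
            nn.Conv1d(1, width, stencil, padding=pad, padding_mode="circular"), nn.GELU(),
            nn.Conv1d(width, 1, 1),    # pointwise layer: combine features into du/dt
        )

    def forward(self, u):              # u: (batch, 1, nx)
        return self.net(u)

# Once trained on solution data, the same stencil can be rolled out on larger domains
# and for longer times; here we only show the (untrained) rollout mechanics.
stencil = LearnedStencil()
u = torch.randn(1, 1, 256)
dt = 1e-3
for _ in range(100):                   # explicit Euler steps with the learned right-hand side
    u = u + dt * stencil(u)
print(u.shape)                         # torch.Size([1, 1, 256])
```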