Waveformer for modelling dynamical systems
- URL: http://arxiv.org/abs/2310.04990v1
- Date: Sun, 8 Oct 2023 03:34:59 GMT
- Title: Waveformer for modelling dynamical systems
- Authors: N Navaneeth and Souvik Chakraborty
- Abstract summary: We propose the "Waveformer", a novel operator-learning approach for learning solutions of dynamical systems.
The proposed Waveformer exploits the wavelet transform to capture the spatial multi-scale behavior of the solution field, and transformers to capture the long-horizon dynamics.
We show that the proposed Waveformer can learn the solution operator with high accuracy, outperforming existing state-of-the-art operator learning algorithms by up to an order of magnitude.
- Score: 1.0878040851638
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural operators have gained recognition as potent tools for learning
solutions of a family of partial differential equations. State-of-the-art
neural operators excel at approximating the functional relationship between
input functions and the solution space, potentially reducing computational
costs and enabling real-time applications. However, they often fall short when
tackling time-dependent problems, particularly in delivering accurate long-term
predictions. In this work, we propose the "Waveformer", a novel operator-learning
approach for learning solutions of dynamical systems. The proposed Waveformer
exploits the wavelet transform to capture the spatial multi-scale behavior of
the solution field, and transformers to capture the long-horizon dynamics. We
present four numerical examples involving the Burgers' equation, the
Kuramoto-Sivashinsky (KS) equation, the Allen-Cahn equation, and the
Navier-Stokes equations to illustrate the efficacy of the proposed approach.
The results show that the Waveformer can learn the solution operator with high
accuracy, outperforming existing state-of-the-art operator learning algorithms
by up to an order of magnitude, with its advantage particularly visible in the
extrapolation region.
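To make the core idea concrete, the sketch below shows a one-level Haar discrete wavelet transform, the simplest instance of the spatial multi-scale decomposition the abstract describes: the approximation coefficients carry the smooth, large-scale trend of the solution field, while the detail coefficients isolate localized fine-scale features. This is a minimal illustration of the wavelet half of the architecture only, not the authors' implementation; the function names are illustrative, and the actual Waveformer would feed such coefficients to a transformer for time stepping.

```python
import numpy as np

def haar_dwt(u):
    """One level of the Haar discrete wavelet transform.

    Splits a 1D field (of even length) into coarse approximation
    and fine detail coefficients -- a spatial multi-scale
    representation of the solution field.
    """
    u = np.asarray(u, dtype=float)
    even, odd = u[0::2], u[1::2]
    approx = (even + odd) / np.sqrt(2)   # low-frequency, large-scale content
    detail = (even - odd) / np.sqrt(2)   # high-frequency, localized content
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one Haar level, reconstructing the original field exactly."""
    even = (approx + detail) / np.sqrt(2)
    odd = (approx - detail) / np.sqrt(2)
    u = np.empty(even.size + odd.size)
    u[0::2], u[1::2] = even, odd
    return u

# A smooth field plus one localized spike: the spike shows up in the
# detail coefficients, while the smooth trend lives in the approximation.
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.sin(x)
u[20] += 1.0
approx, detail = haar_dwt(u)
assert np.allclose(haar_idwt(approx, detail), u)  # perfect reconstruction
```

Because the transform is invertible, a model can predict dynamics in wavelet-coefficient space and map back to the physical solution field without loss; practical wavelet-based operators typically apply several such levels recursively with smoother wavelet families.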
Related papers
- An efficient wavelet-based physics-informed neural networks for singularly perturbed problems [0.0]
Physics-informed neural networks (PINNs) are a class of deep learning models that incorporate physical laws, expressed as differential equations, into the training process.
We present an efficient wavelet-based PINNs model to solve singularly perturbed differential equations.
The architecture allows the training process to search for a solution within wavelet space, making the process faster and more accurate.
arXiv Detail & Related papers (2024-09-18T10:01:37Z)
- Linearization Turns Neural Operators into Function-Valued Gaussian Processes [23.85470417458593]
We introduce a new framework for approximate Bayesian uncertainty quantification in neural operators.
Our approach can be interpreted as a probabilistic analogue of the concept of currying from functional programming.
We showcase the efficacy of our approach through applications to different types of partial differential equations.
arXiv Detail & Related papers (2024-06-07T16:43:54Z)
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
arXiv Detail & Related papers (2024-06-05T17:59:22Z)
- Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators provide a principled AI framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Physics informed WNO [0.0]
We propose a physics-informed Wavelet Neural Operator (WNO) for learning the solution operators of families of parametric partial differential equations (PDEs) without labeled training data.
The efficacy of the framework is validated and illustrated with four nonlinear systems relevant to various fields of engineering and science.
arXiv Detail & Related papers (2023-02-12T14:31:50Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Wavelet neural operator: a neural operator for parametric partial differential equations [0.0]
We introduce a novel operator learning algorithm referred to as the Wavelet Neural Operator (WNO).
WNO harnesses the superiority of wavelets in time-frequency localization of functions and enables accurate tracking of patterns in the spatial domain.
The proposed approach is used to build a digital twin capable of predicting Earth's air temperature based on available historical data.
arXiv Detail & Related papers (2022-05-04T17:13:59Z)
- Neural Galerkin Schemes with Active Learning for High-Dimensional Evolution Equations [44.89798007370551]
This work proposes Neural Galerkin schemes based on deep learning that generate training data with active learning for numerically solving high-dimensional partial differential equations.
Neural Galerkin schemes build on the Dirac-Frenkel variational principle to train networks by minimizing the residual sequentially over time.
Our finding is that the active gathering of training data in the proposed Neural Galerkin schemes is key to numerically realizing the expressive power of networks in high dimensions.
arXiv Detail & Related papers (2022-03-02T19:09:52Z)
- Seismic wave propagation and inversion with Neural Operators [7.296366040398878]
We develop a prototype framework for learning general solutions using a recently developed machine learning paradigm called Neural Operator.
A trained Neural Operator can compute a solution in negligible time for any velocity structure or source location.
We illustrate the method with the 2D acoustic wave equation and demonstrate the method's applicability to seismic tomography.
arXiv Detail & Related papers (2021-08-11T19:17:39Z)
- Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions and learns the mapping from trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can closely approximate the ground-truth derivatives by properly tuning the complexity of the function library.
arXiv Detail & Related papers (2021-06-08T08:04:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.