Local Convolution Enhanced Global Fourier Neural Operator For Multiscale
Dynamic Spaces Prediction
- URL: http://arxiv.org/abs/2311.12902v1
- Date: Tue, 21 Nov 2023 11:04:13 GMT
- Authors: Xuanle Zhao, Yue Sun, Tielin Zhang, Bo Xu
- Abstract summary: We propose a novel hierarchical neural operator that integrates improved Fourier layers with attention mechanisms.
We achieve superior performance on existing PDE benchmarks, especially on equations characterized by rapid coefficient variations.
- Score: 19.56790304454538
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural operators extend the capabilities of traditional neural networks by
allowing them to handle mappings between function spaces for the purpose of
solving partial differential equations (PDEs). One of the most notable methods
is the Fourier Neural Operator (FNO), which is inspired by the Green's function
method and approximates the operator kernel directly in the frequency domain. In
this work, we focus on predicting multiscale dynamic spaces, which is
equivalent to solving multiscale PDEs. Multiscale PDEs are characterized by
rapid coefficient changes and solution space oscillations, which are crucial
for modeling atmospheric convection and ocean circulation. To solve this
problem, models should have the ability to capture rapid changes and process
them at various scales. However, the FNO only approximates kernels in the
low-frequency domain, which is insufficient when solving multiscale PDEs. To
address this challenge, we propose a novel hierarchical neural operator that
integrates improved Fourier layers with attention mechanisms, aiming to capture
all details and handle them at various scales. These mechanisms complement each
other in the frequency domain and encourage the model to solve multiscale
problems. We perform experiments on dynamic spaces governed by forward and
reverse problems of multiscale elliptic equations, Navier-Stokes equations and
some other physical scenarios, and achieve superior performance on existing PDE
benchmarks, especially for equations characterized by rapid coefficient variations.
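The abstract's key observation, that the FNO only approximates kernels in the low-frequency domain, can be made concrete with a minimal NumPy sketch of a truncated spectral layer. Names, shapes, and the toy signal below are illustrative, not the paper's code:

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """Illustrative 1D FNO-style spectral layer: transform to the
    frequency domain, apply (learned) weights to the lowest `n_modes`
    modes only, and transform back. High-frequency content is dropped,
    which is why a plain FNO can miss rapid multiscale oscillations."""
    u_hat = np.fft.rfft(u)                         # frequency-domain view of u
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]  # act on low modes only
    return np.fft.irfft(out_hat, n=u.shape[0])

# Toy input: a slow component plus a rapid multiscale-style oscillation.
x = np.linspace(0.0, 2 * np.pi, 128, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(40 * x)
v = spectral_conv_1d(u, weights=np.ones(8), n_modes=8)
```

With only 8 retained modes, the rapid sin(40x) oscillation is discarded entirely; this is precisely the failure mode that the hierarchical Fourier-plus-attention design targets.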
Related papers
- Physics-embedded Fourier Neural Network for Partial Differential Equations [35.41134465442465]
We introduce Physics-embedded Fourier Neural Networks (PeFNN) with flexible and explainable error.
PeFNN is designed to enforce momentum conservation and yields interpretable nonlinear expressions.
We demonstrate its outstanding performance for challenging real-world applications such as large-scale flood simulations.
arXiv Detail & Related papers (2024-07-15T18:30:39Z)
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
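Neural Walk-on-Spheres builds on the classical walk-on-spheres Monte Carlo process. A minimal, non-neural sketch of that classical estimator for Laplace's equation on the unit disk (function and parameter names are illustrative assumptions) makes the underlying idea concrete:

```python
import numpy as np

def walk_on_spheres(x0, boundary_g, n_walks=2000, eps=1e-3, seed=0):
    """Classical walk-on-spheres estimator for a harmonic function on
    the unit disk. Each walk jumps uniformly on the largest circle that
    fits inside the domain until it is within `eps` of the boundary,
    then reads off the boundary condition g at the nearest boundary
    point; averaging over walks estimates the solution at x0."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_walks):
        x = np.array(x0, dtype=float)
        while True:
            r = 1.0 - np.linalg.norm(x)   # distance to the unit-circle boundary
            if r < eps:
                break
            theta = rng.uniform(0.0, 2 * np.pi)
            x = x + r * np.array([np.cos(theta), np.sin(theta)])
        total += boundary_g(x / np.linalg.norm(x))  # project onto the boundary
    return total / n_walks

# Sanity check: u(x, y) = x is harmonic, so with g(x, y) = x the estimate
# at (0.3, 0) should be close to the exact value 0.3.
est = walk_on_spheres([0.3, 0.0], boundary_g=lambda p: p[0])
```

The per-walk cost grows only logarithmically with 1/eps, which is what makes the process attractive as a target for neural acceleration in high dimensions.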
arXiv Detail & Related papers (2024-06-05T17:59:22Z)
- Coupled Multiwavelet Neural Operator Learning for Coupled Partial Differential Equations [13.337268390844745]
We propose a coupled multiwavelet neural operator (CMWNO) learning scheme by decoupling the coupled integral kernels.
The proposed model achieves significantly higher accuracy compared to previous learning-based solvers.
arXiv Detail & Related papers (2023-03-04T03:06:47Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Incremental Spatial and Spectral Learning of Neural Operators for Solving Large-Scale PDEs [86.35471039808023]
We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Our method demonstrates a 10% lower testing error, using 20% fewer frequency modes compared to the existing Fourier Neural Operator, while also achieving a 30% faster training.
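The incremental idea, starting from a few low-frequency modes and growing the retained-mode count as training proceeds, can be sketched as a simple schedule. This is a hypothetical illustration of the principle, not the iFNO implementation:

```python
import numpy as np

def incremental_mode_schedule(total_epochs, max_modes, start_modes=2):
    """Hypothetical mode schedule in the spirit of iFNO: early epochs
    train on only a few low-frequency modes (cheap, coarse), and the
    retained-mode count grows linearly until the full spectral
    resolution is reached in the final epoch."""
    modes = np.linspace(start_modes, max_modes, total_epochs)
    return [int(round(m)) for m in modes]

# A 10-epoch run growing from 2 retained modes up to 32.
schedule = incremental_mode_schedule(total_epochs=10, max_modes=32)
```

In practice the growth rule could equally be loss-driven rather than linear; the point is only that the truncation level becomes a training-time variable instead of a fixed hyperparameter.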
arXiv Detail & Related papers (2022-11-28T09:57:15Z)
- Mitigating spectral bias for the multiscale operator learning [14.404769413313371]
We propose a hierarchical attention neural operator (HANO) inspired by the hierarchical matrix approach.
HANO features a scale-adaptive interaction range and self-attentions over a hierarchy of levels, enabling nested feature computation with controllable linear cost.
Our numerical experiments demonstrate that HANO outperforms state-of-the-art (SOTA) methods for representative multiscale problems.
arXiv Detail & Related papers (2022-10-19T21:09:29Z)
- Towards Multi-spatiotemporal-scale Generalized PDE Modeling [4.924631198058705]
We compare various FNO- and U-Net-like approaches on fluid mechanics problems in both vorticity-stream and velocity function form.
We show promising results on generalization to different PDE parameters and time-scales with a single surrogate model.
arXiv Detail & Related papers (2022-09-30T17:40:05Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
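The claim that message passing representationally contains classical schemes can be illustrated by hand-setting, rather than learning, the message and update functions, so that one step on a 1D chain graph reproduces the central-difference Laplacian. All names below are illustrative:

```python
import numpy as np

def mp_step(u, neighbors, message_fn, update_fn):
    """Generic message-passing update: each node aggregates messages
    from its neighbors, then applies an update function. In a neural
    solver, message_fn and update_fn would be learned networks."""
    msgs = np.zeros_like(u)
    for i, nbr_list in enumerate(neighbors):
        msgs[i] = sum(message_fn(u[j], u[i]) for j in nbr_list)
    return np.array([update_fn(u[i], msgs[i]) for i in range(len(u))])

# 1D chain graph over n nodes, with crude boundary handling at the ends.
n = 8
u = np.arange(n, dtype=float) ** 2            # u(x) = x^2 on an integer grid
nbrs = [[i - 1, i + 1] for i in range(1, n - 1)]
nbrs = [[1]] + nbrs + [[n - 2]]

# Hand-set functions: message = difference, update = sum of messages.
# At interior nodes this gives u[i-1] - 2*u[i] + u[i+1], the standard
# second-order finite-difference Laplacian (equal to 2 for u = x^2).
lap = mp_step(u, nbrs, message_fn=lambda uj, ui: uj - ui,
              update_fn=lambda ui, m: m)
```

Replacing these fixed functions with trained networks is exactly the step that turns the classical stencil into a neural solver.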
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks [0.0]
We show that physics-informed neural networks (PINNs) struggle in cases where the target functions to be approximated exhibit high-frequency or multi-scale features.
We construct novel architectures that employ multi-scale random Fourier features and justify how such coordinate embedding layers can lead to robust and accurate PINN models.
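Multi-scale coordinate embeddings of the kind described can be sketched with random Fourier features drawn at several bandwidths, so the downstream network sees low- and high-frequency encodings of the same coordinates at once. Parameter names and defaults below are illustrative assumptions, not the paper's API:

```python
import numpy as np

def multiscale_fourier_features(x, scales=(1.0, 10.0, 100.0), m=16, seed=0):
    """Illustrative multi-scale random Fourier feature embedding: for
    each bandwidth `s`, sample m random frequency vectors ~ N(0, s^2)
    and map coordinates x to cos/sin projections. Small scales capture
    smooth trends; large scales expose high-frequency structure that a
    plain MLP is biased against learning."""
    rng = np.random.default_rng(seed)
    feats = []
    for s in scales:
        B = rng.normal(0.0, s, size=(m, x.shape[-1]))  # frequency matrix
        proj = x @ B.T
        feats.append(np.cos(2 * np.pi * proj))
        feats.append(np.sin(2 * np.pi * proj))
    return np.concatenate(feats, axis=-1)

# Embed 64 points on [0, 1]; output has 2 * m * len(scales) features each.
coords = np.linspace(0.0, 1.0, 64).reshape(-1, 1)
z = multiscale_fourier_features(coords)
```

The embedding is the only change to the model: the PINN then consumes z instead of the raw coordinates.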
arXiv Detail & Related papers (2020-12-18T04:19:30Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.