Incremental Spatial and Spectral Learning of Neural Operators for
Solving Large-Scale PDEs
- URL: http://arxiv.org/abs/2211.15188v4
- Date: Tue, 5 Mar 2024 04:42:21 GMT
- Title: Incremental Spatial and Spectral Learning of Neural Operators for
Solving Large-Scale PDEs
- Authors: Robert Joseph George, Jiawei Zhao, Jean Kossaifi, Zongyi Li, Anima
Anandkumar
- Abstract summary: We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Our method achieves a 10% lower testing error using 20% fewer frequency modes than the existing Fourier Neural Operator, while also training 30% faster.
- Score: 86.35471039808023
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Fourier Neural Operators (FNO) offer a principled approach to solving
challenging partial differential equations (PDE) such as turbulent flows. At
the core of FNO is a spectral layer that leverages a discretization-convergent
representation in the Fourier domain, and learns weights over a fixed set of
frequencies. However, training FNO presents two significant challenges,
particularly in large-scale, high-resolution applications: (i) computing the
Fourier transform of high-resolution inputs is computationally intensive, yet
necessary, since fine-scale details are needed to solve many PDEs such as
fluid flows; (ii) selecting the relevant set of frequencies in the spectral
layers is difficult: too many modes can lead to overfitting, while too few
lead to underfitting. To address these issues, we introduce the
Incremental Fourier Neural Operator (iFNO), which progressively increases both
the number of frequency modes used by the model and the resolution of
the training data. We empirically show that iFNO reduces total training time
while maintaining or improving generalization performance across various
datasets. Our method achieves a 10% lower testing error using 20% fewer
frequency modes than the existing Fourier Neural Operator, while also training
30% faster.
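The incremental scheme described above can be sketched with a minimal 1-D spectral layer (an illustrative sketch, not the authors' implementation; class and method names are ours). Growing the number of active modes leaves the weights already learned for the low frequencies untouched:

```python
import numpy as np

rng = np.random.default_rng(0)

class SpectralLayer1D:
    """Minimal 1-D FNO-style spectral layer: keep only the lowest
    `n_modes` Fourier modes and multiply them by learned weights."""

    def __init__(self, max_modes, n_modes):
        self.max_modes = max_modes
        self.n_modes = n_modes  # modes currently in use
        # Complex weights for every mode the layer might ever use.
        self.weights = (rng.standard_normal(max_modes)
                        + 1j * rng.standard_normal(max_modes))

    def __call__(self, u):
        u_hat = np.fft.rfft(u)                      # to Fourier space
        out_hat = np.zeros_like(u_hat)
        k = min(self.n_modes, len(u_hat))
        out_hat[:k] = u_hat[:k] * self.weights[:k]  # truncate and weight
        return np.fft.irfft(out_hat, n=len(u))      # back to physical space

    def grow(self, new_n_modes):
        """Incremental step: enable more modes without touching the
        weights already learned for the low frequencies."""
        self.n_modes = min(new_n_modes, self.max_modes)

layer = SpectralLayer1D(max_modes=16, n_modes=4)
x = np.sin(2 * np.pi * np.linspace(0, 1, 64, endpoint=False))
y_coarse = layer(x)  # early training: few modes
layer.grow(12)       # later epochs: more modes, same low-mode weights
y_fine = layer(x)
```

Because the toy input contains only the first Fourier mode, growing from 4 to 12 modes leaves the output unchanged; on richer inputs the extra modes add fine-scale detail. In the same spirit, the training resolution can be raised over epochs, since the per-mode weights remain valid across resolutions.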
Related papers
- DiffFNO: Diffusion Fourier Neural Operator [8.895165270489167]
We introduce DiffFNO, a novel diffusion framework for arbitrary-scale super-resolution, strengthened by a Weighted Fourier Neural Operator (WFNO).
We show that DiffFNO achieves state-of-the-art (SOTA) results, outperforming existing methods across various scaling factors by a margin of 2 to 4 dB in PSNR.
Our approach sets a new standard in super-resolution, delivering both superior accuracy and computational efficiency.
arXiv Detail & Related papers (2024-11-15T03:14:11Z)
- Toward a Better Understanding of Fourier Neural Operators from a Spectral Perspective [4.315136713224842]
SpecB-FNO achieves better prediction accuracy on diverse PDE applications, with an average improvement of 50%.
This paper offers empirical insights into FNO's difficulty with large kernels through spectral analysis.
arXiv Detail & Related papers (2024-04-10T17:58:04Z)
- Multi-Grid Tensorized Fourier Neural Operator for High-Resolution PDEs [93.82811501035569]
We introduce a new data efficient and highly parallelizable operator learning approach with reduced memory requirement and better generalization.
MG-TFNO scales to large resolutions by leveraging local and global structures of full-scale, real-world phenomena.
We demonstrate superior performance on the turbulent Navier-Stokes equations where we achieve less than half the error with over 150x compression.
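The compression figure comes from factorizing the spectral weight tensor rather than storing it densely. A minimal sketch of the parameter-count arithmetic using a rank-r CP factorization (the paper's scheme is a multi-grid Tucker factorization; shapes and names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense FNO spectral weight tensor over (in_channels, out_channels, modes)
# stores C_in * C_out * M entries; a rank-r CP factorization stores only
# r * (C_in + C_out + M).
C_in, C_out, M, r = 32, 32, 64, 4
A = rng.standard_normal((C_in, r))
B = rng.standard_normal((C_out, r))
C = rng.standard_normal((M, r))

# Reconstruct the full (C_in, C_out, M) tensor on the fly when needed.
W = np.einsum('ir,jr,kr->ijk', A, B, C)

dense_params = C_in * C_out * M               # 65,536
factored_params = r * (C_in + C_out + M)      # 512
compression = dense_params / factored_params  # 128x for this toy setting
```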
arXiv Detail & Related papers (2023-09-29T20:18:52Z)
- Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
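The single-transform idea can be illustrated with per-mode gains applied entirely in frequency space (a linear toy sketch under our own naming; the actual T1 models interleave nonlinearities, which this omits):

```python
import numpy as np

rng = np.random.default_rng(0)

def t1_forward(u, layers):
    """'Transform once': a single FFT in, learned per-mode maps applied
    in frequency space, and a single inverse FFT out, instead of an
    FFT/iFFT pair inside every layer."""
    u_hat = np.fft.rfft(u)
    for w in layers:       # each layer acts directly on the spectrum
        u_hat = w * u_hat  # real per-mode gains, for simplicity
    return np.fft.irfft(u_hat, n=len(u))

n = 64
layers = [rng.standard_normal(n // 2 + 1) for _ in range(3)]
x = rng.standard_normal(n)
y = t1_forward(x, layers)
```

With real gains this is exactly equivalent to transforming in and out of frequency space at every layer, but it uses one FFT/iFFT pair instead of three.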
arXiv Detail & Related papers (2022-11-26T01:56:05Z)
- Blending Neural Operators and Relaxation Methods in PDE Numerical Solvers [3.2712166248850685]
HINTS is a hybrid, iterative, numerical, and transferable solver for partial differential equations.
It balances the convergence behavior across the spectrum of eigenmodes by utilizing the spectral bias of DeepONet.
It is flexible with regard to discretizations, computational domain, and boundary conditions.
arXiv Detail & Related papers (2022-08-28T19:07:54Z)
- Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z)
- Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z)
- Learning Frequency Domain Approximation for Binary Neural Networks [68.79904499480025]
We propose to estimate the gradient of sign function in the Fourier frequency domain using the combination of sine functions for training BNNs.
The experiments on several benchmark datasets and neural architectures illustrate that the binary network learned using our method achieves the state-of-the-art accuracy.
arXiv Detail & Related papers (2021-03-01T08:25:26Z)
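The gradient estimator in the last entry replaces the almost-everywhere-zero derivative of sign(x) with the derivative of a truncated Fourier (sine-series) approximation. A minimal numpy sketch (function names and the period/term counts are our illustrative choices, not the paper's):

```python
import numpy as np

def sign_fourier(x, n_terms=10, period=2.0):
    """Truncated Fourier series of the square wave sign(x):
    sign(x) ~ (4/pi) * sum_k sin((2k+1)*w*x) / (2k+1), w = 2*pi/period."""
    w = 2 * np.pi / period
    k = 2 * np.arange(n_terms) + 1  # odd harmonics 1, 3, 5, ...
    return (4 / np.pi) * np.sum(np.sin(np.outer(x, k) * w) / k, axis=-1)

def sign_grad_fourier(x, n_terms=10, period=2.0):
    """Derivative of the truncated series: a smooth surrogate for the
    gradient of sign(x), usable in backpropagation through a binarizer."""
    w = 2 * np.pi / period
    k = 2 * np.arange(n_terms) + 1
    return (4 * w / np.pi) * np.sum(np.cos(np.outer(x, k) * w), axis=-1)

x = np.linspace(-0.9, 0.9, 5)
approx = sign_fourier(x, n_terms=50)  # close to sign(x) away from 0
```

Away from the discontinuity the truncated series tracks sign(x) closely, while its derivative is smooth and nonzero, which is what makes backpropagation through the binarization step possible.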
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.