Domain Agnostic Fourier Neural Operators
- URL: http://arxiv.org/abs/2305.00478v2
- Date: Sat, 28 Oct 2023 16:52:49 GMT
- Title: Domain Agnostic Fourier Neural Operators
- Authors: Ning Liu, Siavash Jafarzadeh, Yue Yu
- Abstract summary: We introduce the domain agnostic Fourier neural operator (DAFNO) for learning surrogates with irregular geometries and evolving domains.
The key idea is to incorporate a smoothed characteristic function in the integral layer architecture of FNOs.
DAFNO achieves state-of-the-art accuracy compared to baseline neural operator models.
- Score: 15.29112632863168
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Fourier neural operators (FNOs) can learn highly nonlinear mappings between
function spaces, and have recently become a popular tool for learning responses
of complex physical systems. However, to achieve good accuracy and efficiency,
FNOs rely on the fast Fourier transform (FFT), which is restricted to modeling
problems on rectangular domains. To lift this restriction and permit FFT on
irregular geometries as well as topology changes, we introduce the domain agnostic
Fourier neural operator (DAFNO), a novel neural operator architecture for
learning surrogates with irregular geometries and evolving domains. The key
idea is to incorporate a smoothed characteristic function in the integral layer
architecture of FNOs, and leverage FFT to achieve rapid computations, in such a
way that the geometric information is explicitly encoded in the architecture.
In our empirical evaluation, DAFNO achieves state-of-the-art accuracy compared
to baseline neural operator models on two benchmark datasets of
material modeling and airfoil simulation. To further demonstrate the capability
and generalizability of DAFNO in handling complex domains with topology
changes, we consider a brittle material fracture evolution problem. Trained on
only one crack simulation sample, DAFNO generalizes to unseen loading scenarios
and to crack patterns substantially different from the training scenario. Our
code and data accompanying this paper are available at
https://github.com/ningliu-iga/DAFNO.
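
To make the key idea concrete, below is a minimal, hypothetical PyTorch sketch of a DAFNO-style integral layer. It is not the authors' released implementation (see the GitHub repository above for that): the irregular domain is embedded in a rectangular grid and encoded by a smoothed characteristic function chi (close to 1 inside the domain, close to 0 outside), and chi multiplies the features before and after a standard FFT-based spectral convolution so that the kernel integral only involves points inside the physical domain.

```python
import torch
import torch.nn as nn

class SpectralConv2d(nn.Module):
    """Standard FNO spectral convolution: FFT -> truncate modes -> linear mix -> inverse FFT."""
    def __init__(self, channels, modes1, modes2):
        super().__init__()
        self.modes1, self.modes2 = modes1, modes2
        scale = 1.0 / (channels * channels)
        self.weights = nn.Parameter(
            scale * torch.randn(channels, channels, modes1, modes2, dtype=torch.cfloat)
        )

    def forward(self, x):                       # x: (batch, channels, H, W)
        x_ft = torch.fft.rfft2(x)               # (batch, channels, H, W//2 + 1), complex
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes1, :self.modes2] = torch.einsum(
            "bixy,ioxy->boxy",
            x_ft[:, :, :self.modes1, :self.modes2],
            self.weights,
        )
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])

class DAFNOStyleLayer(nn.Module):
    """Hypothetical DAFNO-style integral layer.

    chi masks the features before and after the FFT-based convolution, i.e. the
    layer computes chi(x) * \int k(x - y) chi(y) v(y) dy plus a pointwise skip
    path, so the geometry is encoded explicitly while keeping O(N log N) cost.
    """
    def __init__(self, channels, modes1, modes2):
        super().__init__()
        self.conv = SpectralConv2d(channels, modes1, modes2)
        self.w = nn.Conv2d(channels, channels, kernel_size=1)  # pointwise skip path

    def forward(self, v, chi):                  # v: features, chi: (batch, 1, H, W)
        integral = chi * self.conv(chi * v)
        return torch.relu(self.w(v) + integral)

# Toy usage: a unit disk embedded in a 64x64 box with a sigmoid-smoothed boundary.
xs = torch.linspace(-1, 1, 64)
X, Y = torch.meshgrid(xs, xs, indexing="ij")
chi = torch.sigmoid((0.8 - torch.sqrt(X**2 + Y**2)) / 0.05)[None, None]  # (1, 1, 64, 64)
v = torch.randn(1, 16, 64, 64) * chi
layer = DAFNOStyleLayer(channels=16, modes1=12, modes2=12)
print(layer(v, chi).shape)                      # torch.Size([1, 16, 64, 64])
```

Because chi enters only as an elementwise mask, an evolving domain (for example, a growing crack) can in principle be handled by updating chi while reusing the same layer weights, which is consistent with the abstract's description of topology changes.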
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - A domain decomposition-based autoregressive deep learning model for unsteady and nonlinear partial differential equations [2.7755345520127936]
We propose a domain-decomposition-based deep learning (DL) framework, named CoMLSim, for accurately modeling unsteady and nonlinear partial differential equations (PDEs)
The framework consists of two key components: (a) a convolutional neural network (CNN)-based autoencoder architecture and (b) an autoregressive model composed of fully connected layers.
arXiv Detail & Related papers (2024-08-26T17:50:47Z) - Discretization Error of Fourier Neural Operators [5.121705282248479]
Operator learning is a variant of machine learning that is designed to approximate maps between function spaces from data.
The Fourier Neural Operator (FNO) is a common model architecture used for operator learning.
arXiv Detail & Related papers (2024-05-03T16:28:05Z) - Spherical Fourier Neural Operators: Learning Stable Dynamics on the
Sphere [53.63505583883769]
We introduce Spherical FNOs (SFNOs) for learning operators on spherical geometries.
SFNOs have important implications for machine learning-based simulation of climate dynamics.
arXiv Detail & Related papers (2023-06-06T16:27:17Z) - Bounding The Rademacher Complexity of Fourier Neural Operator [3.4960814625958787]
The Fourier neural operator (FNO) is a physics-inspired machine learning method.
In this study, we investigated bounds on the Rademacher complexity of the FNO based on specific group norms.
In addition, we investigated the correlation between the empirical generalization error and the proposed capacity of FNO.
arXiv Detail & Related papers (2022-09-12T11:11:43Z) - Fourier Neural Operator with Learned Deformations for PDEs on General Geometries [75.91055304134258]
We propose a new framework, viz., geo-FNO, to solve PDEs on arbitrary geometries.
Geo-FNO learns to deform the input (physical) domain, which may be irregular, into a latent space with a uniform grid.
We consider a variety of PDEs, such as the elasticity, plasticity, Euler, and Navier-Stokes equations, and both forward modeling and inverse design problems.
arXiv Detail & Related papers (2022-07-11T21:55:47Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z) - U-FNO -- an enhanced Fourier neural operator based-deep learning model
for multiphase flow [43.572675744374415]
We present U-FNO, an enhanced Fourier neural operator for solving the multiphase flow problem.
We show that the U-FNO architecture has the advantages of both traditional CNN and original FNO, providing significantly more accurate and efficient performance.
The trained U-FNO provides gas saturation and pressure buildup predictions with a 10,000 times speedup compared to traditional numerical simulators.
arXiv Detail & Related papers (2021-09-03T17:52:25Z) - Rank-R FNN: A Tensor-Based Learning Model for High-Order Data
Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
arXiv Detail & Related papers (2021-04-11T16:37:32Z) - A Meta-Learning Approach to the Optimal Power Flow Problem Under
Topology Reconfigurations [69.73803123972297]
We propose a DNN-based OPF predictor that is trained using a meta-learning (MTL) approach.
The developed OPF-predictor is validated through simulations using benchmark IEEE bus systems.
arXiv Detail & Related papers (2020-12-21T17:39:51Z) - Fourier Neural Networks as Function Approximators and Differential
Equation Solvers [0.456877715768796]
The choice of activation and loss function yields results that replicate a Fourier series expansion closely.
We validate this FNN on naturally periodic smooth functions and on piecewise continuous periodic functions.
The main advantages of the current approach are the validity of the solution outside the training region, interpretability of the trained model, and simplicity of use.
arXiv Detail & Related papers (2020-05-27T00:30:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.