GIT-Net: Generalized Integral Transform for Operator Learning
- URL: http://arxiv.org/abs/2312.02450v1
- Date: Tue, 5 Dec 2023 03:03:54 GMT
- Title: GIT-Net: Generalized Integral Transform for Operator Learning
- Authors: Chao Wang and Alexandre Hoang Thiery
- Abstract summary: This article introduces GIT-Net, a deep neural network architecture for approximating Partial Differential Equation (PDE) operators.
GIT-Net harnesses the fact that differential operators commonly used for defining PDEs can often be represented parsimoniously when expressed in specialized functional bases.
Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting small test errors and low evaluation costs across a range of PDE problems.
- Score: 58.13313857603536
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This article introduces GIT-Net, a deep neural network architecture for
approximating Partial Differential Equation (PDE) operators, inspired by
integral transform operators. GIT-Net harnesses the fact that differential
operators commonly used for defining PDEs can often be represented
parsimoniously when expressed in specialized functional bases (e.g., Fourier
basis). Unlike rigid integral transforms, GIT-Net parametrizes adaptive
generalized integral transforms with deep neural networks. When compared to
several recently proposed alternatives, GIT-Net's computational and memory
requirements scale gracefully with mesh discretizations, facilitating its
application to PDE problems on complex geometries. Numerical experiments
demonstrate that GIT-Net is a competitive neural network operator, exhibiting
small test errors and low evaluation costs across a range of PDE problems. This
stands in contrast to existing neural network operators, which typically excel
in just one of these areas.
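The mechanism the abstract describes — acting on a function through a learned change of basis, a cheap pointwise operation on the coefficients, and a learned map back to the mesh, rather than a fixed transform such as the Fourier basis — can be sketched roughly as follows. All names, shapes, and the choice of nonlinearity here are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def git_layer(u, encode, scale, decode):
    """One generalized-integral-transform layer (hypothetical sketch):
    project mesh values into a learned functional basis, rescale the
    coefficients pointwise, then map back to the mesh."""
    coeffs = encode @ u              # learned analysis transform (n_basis x n_mesh)
    coeffs = scale * coeffs          # cheap pointwise action in the basis
    return np.tanh(decode @ coeffs)  # learned synthesis transform + nonlinearity

n_mesh, n_basis = 256, 32            # n_basis << n_mesh keeps cost and memory low
u = rng.standard_normal(n_mesh)      # input function sampled on a mesh
encode = rng.standard_normal((n_basis, n_mesh)) / np.sqrt(n_mesh)
scale = rng.standard_normal(n_basis)
decode = rng.standard_normal((n_mesh, n_basis)) / np.sqrt(n_basis)

v = git_layer(u, encode, scale, decode)
print(v.shape)
```

Because the layer only touches `n_basis` coefficients between the two learned transforms, its cost grows linearly in the mesh size, which is the graceful scaling the abstract attributes to GIT-Net.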
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variates functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-sphere algorithm.
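The antiderivative trick summarized above can be illustrated with a toy control variate: if a network F approximates the antiderivative of the integrand f, then g = F' (obtained by automatic differentiation in the paper) integrates exactly to F(b) - F(a), so f - g can be averaged by Monte Carlo with much lower variance. In this hedged sketch a low-order polynomial stands in for the trained network; it is not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

f = lambda x: np.exp(x)                # integrand to estimate on [0, 1]
F = lambda x: x + x**2 / 2 + x**3 / 6  # stand-in for a learned antiderivative
g = lambda x: 1 + x + x**2 / 2         # its exact derivative (autodiff in the paper)
G = F(1.0) - F(0.0)                    # integral of g, known in closed form

x = rng.uniform(0.0, 1.0, 100_000)
plain = f(x).mean()                    # vanilla Monte Carlo estimate of the integral
cv = (f(x) - g(x)).mean() + G          # control-variate estimator: same mean, less variance

truth = np.e - 1.0
print(abs(plain - truth), abs(cv - truth))
```

The residual f - g varies far less over [0, 1] than f itself, so the control-variate estimate concentrates much faster for the same sample budget.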
arXiv Detail & Related papers (2024-09-23T06:04:28Z)
- Neural Green's Operators for Parametric Partial Differential Equations [0.0]
This work introduces neural Green's operators (NGOs), a novel neural operator network architecture that learns the solution operator for a parametric family of linear partial differential equations (PDEs).
NGOs are similar to deep operator networks (DeepONets) and variationally mimetic operator networks (VarMiONs).
arXiv Detail & Related papers (2024-06-04T00:02:52Z)
- Functional SDE approximation inspired by a deep operator network architecture [0.0]
A novel approach to approximating solutions of Stochastic Differential Equations (SDEs) by Deep Neural Networks is derived and analysed.
The architecture is inspired by notion of Deep Operator Networks (DeepONets), which is based on operator learning in terms of a reduced basis also represented in the network.
The proposed SDEONet architecture aims to alleviate the issue of exponential complexity by learning an optimal sparse truncation of the Wiener chaos expansion.
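The truncation idea in this entry can be made concrete on a toy example where the chaos coefficients are known in closed form: for W ~ N(0,1), exp(W) = e^{1/2} Σ_n He_n(W)/n! with probabilists' Hermite polynomials He_n. The sketch below (plain NumPy, no learning involved) only illustrates how the truncation order of a Wiener chaos expansion controls the approximation error; it is not the SDEONet architecture.

```python
from math import exp, factorial
import numpy as np

def hermite_he(n, x):
    """Probabilists' Hermite polynomial He_n via the standard recurrence
    He_{k+1}(x) = x He_k(x) - k He_{k-1}(x)."""
    h0, h1 = np.ones_like(x), x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, x * h1 - k * h0
    return h1

def chaos_approx(w, order):
    """Wiener chaos expansion of exp(W), W ~ N(0,1), truncated at `order`:
    exp(w) = e^{1/2} * sum_n He_n(w) / n!  (coefficients known in closed form)."""
    return sum(exp(0.5) * hermite_he(n, w) / factorial(n) for n in range(order + 1))

w = np.linspace(-2, 2, 201)
err3 = np.max(np.abs(chaos_approx(w, 3) - np.exp(w)))
err8 = np.max(np.abs(chaos_approx(w, 8) - np.exp(w)))
print(err3, err8)  # truncation error shrinks rapidly as the order grows
```

SDEONet's premise, as summarized above, is that a learned, sparse choice of which chaos terms to keep can achieve a similar accuracy/size trade-off without the exponential growth of the full expansion.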
arXiv Detail & Related papers (2024-02-05T14:12:35Z)
- Energy-Dissipative Evolutionary Deep Operator Neural Networks [12.764072441220172]
Energy-Dissipative Evolutionary Deep Operator Neural Network is an operator learning neural network designed to generate numerical solutions for a class of partial differential equations.
arXiv Detail & Related papers (2023-06-09T22:11:16Z)
- DOSnet as a Non-Black-Box PDE Solver: When Deep Learning Meets Operator Splitting [12.655884541938656]
We develop a learning-based PDE solver, which we name Deep Operator-Splitting Network (DOSnet)
DOSnet is constructed from the physical rules and operators governing the underlying dynamics, and contains learnable parameters.
We train and validate it on several types of operator-decomposable differential equations.
arXiv Detail & Related papers (2022-12-11T18:23:56Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Learning the Solution Operator of Boundary Value Problems using Graph Neural Networks [0.0]
We design a general solution operator for two different time-independent PDEs using graph neural networks (GNNs) and spectral graph convolutions.
We train the networks on simulated data from a finite elements solver on a variety of shapes and inhomogeneities.
We find that training on a diverse dataset with lots of variation in the finite element meshes is a key ingredient for achieving good generalization results.
arXiv Detail & Related papers (2022-06-28T15:39:06Z)
- LordNet: An Efficient Neural Network for Learning to Solve Parametric Partial Differential Equations without Simulated Data [47.49194807524502]
We propose LordNet, a tunable and efficient neural network for modeling long-range entanglements.
The experiments on solving Poisson's equation and (2D and 3D) Navier-Stokes equation demonstrate that the long-range entanglements can be well modeled by the LordNet.
arXiv Detail & Related papers (2022-06-19T14:41:08Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.