Group Equivariant Fourier Neural Operators for Partial Differential
Equations
- URL: http://arxiv.org/abs/2306.05697v2
- Date: Thu, 27 Jul 2023 09:24:31 GMT
- Authors: Jacob Helwig, Xuan Zhang, Cong Fu, Jerry Kurtin, Stephan Wojtowytsch,
Shuiwang Ji
- Abstract summary: We consider solving partial differential equations with Fourier neural operators (FNOs).
In this work, we extend group convolutions to the frequency domain and design Fourier layers that are equivariant to rotations, translations, and reflections.
The resulting $G$-FNO architecture generalizes well across input resolutions and performs well in settings with varying levels of symmetry.
- Score: 33.71890280061319
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider solving partial differential equations (PDEs) with Fourier neural
operators (FNOs), which operate in the frequency domain. Since the laws of
physics do not depend on the coordinate system used to describe them, it is
desirable to encode such symmetries in the neural operator architecture for
better performance and easier learning. While encoding symmetries in the
physical domain using group theory has been studied extensively, how to capture
symmetries in the frequency domain is under-explored. In this work, we extend
group convolutions to the frequency domain and design Fourier layers that are
equivariant to rotations, translations, and reflections by leveraging the
equivariance property of the Fourier transform. The resulting $G$-FNO
architecture generalizes well across input resolutions and performs well in
settings with varying levels of symmetry. Our code is publicly available as
part of the AIRS library (https://github.com/divelab/AIRS).
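The equivariance property of the Fourier transform that the abstract refers to can be checked numerically: on a periodic grid, a 90-degree rotation about the origin commutes exactly with the 2D DFT. A minimal NumPy sketch (the helper `rot90_grid` is illustrative, not code from the paper):

```python
import numpy as np

def rot90_grid(a):
    """Rotate a periodic N x N grid by 90 degrees about the origin (mod N):
    b[x, y] = a[y, (-x) % N]. Unlike np.rot90, this rotation is centered on
    grid index (0, 0), which is what makes it commute exactly with the DFT."""
    n = a.shape[0]
    idx = (-np.arange(n)) % n
    return a[:, idx].T

rng = np.random.default_rng(0)
f = rng.standard_normal((8, 8))

# Fourier transform of the rotated field equals the rotated Fourier transform
lhs = np.fft.fft2(rot90_grid(f))
rhs = rot90_grid(np.fft.fft2(f))
assert np.allclose(lhs, rhs)
```

Because the identity holds exactly in the discrete setting, a filter applied in the frequency domain can be constrained to respect rotations without leaving Fourier space.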
Related papers
- FUTON: Fourier Tensor Network for Implicit Neural Representations [56.48739018255443]
Implicit neural representations (INRs) have emerged as powerful tools for encoding signals, yet dominant designs often suffer from slow convergence, overfitting to noise, and poor extrapolation.
We introduce FUTON, which models signals as generalized Fourier series whose coefficients are parameterized by a low-rank tensor decomposition.
arXiv Detail & Related papers (2026-02-13T19:31:44Z)
- Fourier Learning Machines: Nonharmonic Fourier-Based Neural Networks for Scientific Machine Learning [0.5097809301149341]
We introduce the Fourier Learning Machine (FLM), a neural network (NN) architecture designed to represent a multidimensional nonharmonic Fourier series.
The FLM uses a simple feedforward structure with cosine activation functions to learn the frequencies, amplitudes, and phase shifts of the series as trainable parameters.
Computational experiments show that the performance of FLMs is comparable, and often superior, to that of established architectures like SIREN and vanilla feedforward NNs.
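The series such a model represents can be written down directly; a minimal sketch with fixed rather than trained parameters (all names and values here are illustrative, not from the paper):

```python
import numpy as np

def flm_forward(x, freqs, amps, phases):
    """Evaluate the nonharmonic cosine series
    sum_k amps[k] * cos(freqs[k] * x + phases[k]).
    In an FLM, freqs/amps/phases would be trainable parameters."""
    return (amps[None, :] * np.cos(np.outer(x, freqs) + phases[None, :])).sum(axis=1)

# "nonharmonic": the frequencies need not be integer multiples of a base frequency
freqs = np.array([0.7, 2.3, 5.1])
amps = np.array([1.0, 0.5, 0.25])
phases = np.array([0.0, 1.0, -0.5])

x = np.linspace(0.0, 2.0 * np.pi, 200)
y = flm_forward(x, freqs, amps, phases)
```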
arXiv Detail & Related papers (2025-09-10T16:49:20Z)
- Fourier Fingerprints of Ansatzes in Quantum Machine Learning [0.0]
We show the existence of correlations between the Fourier modes, which depend on the structure of the circuit.
For several popular ansatzes, we compute the Fourier coefficient correlations (FCCs) and construct the Fourier fingerprint.
We show how, for the problem of learning random Fourier series, the FCC correctly predicts relative performance of ansatzes whilst the widely-used expressibility metric does not.
arXiv Detail & Related papers (2025-08-28T15:00:37Z)
- Hilbert Neural Operator: Operator Learning in the Analytic Signal Domain [0.0]
We introduce the Hilbert Neural Operator (HNO), a new neural operator architecture.
HNO operates by first mapping the input signal to its analytic representation via the Hilbert transform.
We hypothesize that this architecture enables HNO to model operators more effectively for causal, phase-sensitive, and non-stationary systems.
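The analytic representation mentioned above has a standard FFT construction (this is the classical textbook definition, not code from the paper):

```python
import numpy as np

def analytic_signal(x):
    """Classical FFT construction of the analytic signal x + i*H[x], where H
    is the Hilbert transform: zero the negative-frequency bins and double the
    positive ones, leaving DC (and Nyquist, for even lengths) untouched."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

t = np.linspace(0.0, 1.0, 128, endpoint=False)
x = np.cos(2.0 * np.pi * 5.0 * t)
z = analytic_signal(x)

# the real part recovers the input; the magnitude gives the envelope,
# which for a pure cosine is constant
assert np.allclose(z.real, x)
assert np.allclose(np.abs(z), 1.0, atol=1e-6)
```

The phase of `z` is the instantaneous phase of the signal, which is presumably what makes the representation attractive for phase-sensitive systems.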
arXiv Detail & Related papers (2025-08-06T21:12:15Z)
- Robustifying Fourier Features Embeddings for Implicit Neural Representations [25.725097757343367]
Implicit Neural Representations (INRs) employ neural networks to represent continuous functions by mapping coordinates to the corresponding values of the target function.
INRs face a challenge known as spectral bias when dealing with scenes containing varying frequencies.
We propose the use of multi-layer perceptrons (MLPs) without additive.
arXiv Detail & Related papers (2025-02-08T07:43:37Z)
- Solving High Frequency and Multi-Scale PDEs with Gaussian Processes [18.190228010565367]
PINNs often struggle to solve high-frequency and multi-scale PDEs.
We resort to the Gaussian process (GP) framework to solve this problem.
We use Kronecker product properties and multilinear algebra to promote computational efficiency and scalability.
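The Kronecker-product trick behind such efficiency claims is the standard identity (A ⊗ B) vec(X) = vec(B X Aᵀ), which avoids ever materializing the large Kronecker matrix; a quick NumPy check (not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = rng.standard_normal((4, 4))
X = rng.standard_normal((4, 5))

# vec() is column-major stacking, hence order="F".
# Left side forms the full 20x20 Kronecker matrix; the right side replaces
# the O((mn)^2) matvec with two small matrix products.
lhs = np.kron(A, B) @ X.flatten(order="F")
rhs = (B @ X @ A.T).flatten(order="F")
assert np.allclose(lhs, rhs)
```

On grid-structured PDE data, covariance matrices factor into such Kronecker products across dimensions, which is what makes GP solvers of this kind scale.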
arXiv Detail & Related papers (2023-11-08T05:26:58Z)
- Resolution-Invariant Image Classification based on Fourier Neural Operators [1.3190581566723918]
We investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs).
We derive the FNO architecture as an example of continuous and Fréchet-differentiable neural operators on Lebesgue spaces.
arXiv Detail & Related papers (2023-04-02T10:23:36Z)
- Deep Fourier Up-Sampling [100.59885545206744]
Up-sampling in the Fourier domain is more challenging than in the spatial domain, as it does not follow the local property that spatial interpolation relies on.
We propose a theoretically sound Deep Fourier Up-Sampling (FourierUp) to solve these issues.
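For context on why Fourier-domain up-sampling is non-local: the classical baseline is spectral zero-padding (trigonometric interpolation), sketched below. FourierUp's actual operators differ; this is only the textbook construction:

```python
import numpy as np

def spectral_zero_pad_upsample(x, factor):
    """Classical Fourier up-sampling: copy the low-frequency bins of the
    spectrum into a longer spectrum, zero-pad the rest, inverse-transform.
    Every output sample depends on every input sample (a global operation)."""
    n, m = len(x), len(x) * factor
    X = np.fft.fft(x)
    Y = np.zeros(m, dtype=complex)
    Y[: (n + 1) // 2] = X[: (n + 1) // 2]   # DC and positive frequencies
    Y[-(n // 2):] = X[-(n // 2):]           # negative frequencies
    return np.fft.ifft(Y).real * factor     # rescale for the longer transform

x = np.sin(np.linspace(0.0, 2.0 * np.pi, 9, endpoint=False))
y = spectral_zero_pad_upsample(x, 2)
assert np.allclose(y[::2], x)  # the original samples are preserved exactly
```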
arXiv Detail & Related papers (2022-10-11T06:17:31Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z)
- Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
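The invariance mechanism can be illustrated with the simplest case, where the "frame" is an entire small group: averaging an arbitrary map over the four rotations of C4 yields an exactly rotation-invariant map. (Frame Averaging proper uses a small input-dependent subset of the group so the average stays cheap; the function names below are illustrative.)

```python
import numpy as np

def c4_average(f, img):
    """Symmetrize an arbitrary scalar map f by averaging it over the four
    90-degree rotations of the input (the cyclic group C4)."""
    return np.mean([f(np.rot90(img, k)) for k in range(4)])

rng = np.random.default_rng(0)
w = rng.standard_normal((6, 6))
f = lambda im: float((im * w).sum())  # an arbitrary, non-invariant scalar map

img = rng.standard_normal((6, 6))
# rotating the input permutes the four orbit terms, so the average is unchanged
assert np.isclose(c4_average(f, img), c4_average(f, np.rot90(img)))
```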
arXiv Detail & Related papers (2021-10-07T11:05:23Z)
- Seeing Implicit Neural Representations as Fourier Series [13.216389226310987]
Implicit Neural Representations (INR) use multilayer perceptrons to represent high-frequency functions in low-dimensional problem domains.
These representations achieved state-of-the-art results on tasks related to complex 3D objects and scenes.
This work analyzes the connection between the two methods and shows that a Fourier mapped perceptron is structurally like one hidden layer SIREN.
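The structural link is easy to see in code: a Fourier feature mapping is a linear layer followed by sines with fixed phase offsets, since cos z = sin(z + π/2). A small NumPy sketch (shapes chosen for illustration):

```python
import numpy as np

def fourier_features(x, B):
    """Fourier feature mapping gamma(x) = [cos(2*pi*B*x), sin(2*pi*B*x)].
    Both halves are sines of a linear projection (the cosine half carries a
    pi/2 phase), which is the structural link to a one-hidden-layer SIREN."""
    proj = 2.0 * np.pi * x @ B.T
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
B = rng.standard_normal((16, 2))   # random frequency matrix
x = rng.standard_normal((4, 2))    # a batch of 2-D input coordinates

feats = fourier_features(x, B)     # shape (4, 32)

# the cosine half is exactly a phase-shifted sine layer
proj = 2.0 * np.pi * x @ B.T
assert np.allclose(feats[:, :16], np.sin(proj + np.pi / 2.0))
```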
arXiv Detail & Related papers (2021-09-01T08:40:20Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
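The Fourier layer being parameterized here can be sketched in a few lines: transform, scale a truncated set of modes by learned complex weights, transform back. This single-channel NumPy version is a simplification of the real multi-channel FNO layer, which adds a pointwise linear skip path and a nonlinearity:

```python
import numpy as np

def spectral_conv1d(u, weights):
    """Single-channel sketch of an FNO spectral convolution: the integral
    kernel is parameterized directly in Fourier space as complex multipliers
    on the lowest modes; all higher modes are truncated to zero."""
    n_modes = len(weights)
    u_hat = np.fft.rfft(u)                    # forward transform
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights
    return np.fft.irfft(out_hat, n=len(u))    # back to physical space

rng = np.random.default_rng(0)
u = rng.standard_normal(64)
weights = rng.standard_normal(12) + 1j * rng.standard_normal(12)

v = spectral_conv1d(u, u_hat := weights)      # output has the input's shape
```

Because the weights are indexed by mode rather than by grid point, the same layer can be evaluated on any resolution, which is the source of the resolution-generalization claims above.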
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases [73.53227696624306]
We present a new family of algorithms for learning Fourier-sparse set functions.
In contrast to other work that focused on the Walsh-Hadamard transform, our novel algorithms operate with recently introduced non-orthogonal Fourier transforms.
We demonstrate effectiveness on several real-world applications.
arXiv Detail & Related papers (2020-10-01T14:31:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.