FC-PINO: High Precision Physics-Informed Neural Operators via Fourier Continuation
- URL: http://arxiv.org/abs/2211.15960v2
- Date: Fri, 05 Sep 2025 01:37:54 GMT
- Title: FC-PINO: High Precision Physics-Informed Neural Operators via Fourier Continuation
- Authors: Adarsh Ganeshram, Haydn Maust, Valentin Duruisseaux, Zongyi Li, Yixuan Wang, Daniel Leibovici, Oscar Bruno, Thomas Hou, Anima Anandkumar
- Abstract summary: We introduce the FC-PINO (Fourier-Continuation-based PINO) architecture, which extends the accuracy and efficiency of PINO to non-periodic and non-smooth PDEs. We demonstrate that standard PINO struggles to solve non-periodic and non-smooth PDEs with high precision across challenging benchmarks. In contrast, the proposed FC-PINO provides accurate, robust, and scalable solutions, substantially outperforming PINO alternatives.
- Score: 60.706803227003995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The physics-informed neural operator (PINO) is a machine learning paradigm that has demonstrated promising results for learning solutions to partial differential equations (PDEs). It leverages the Fourier Neural Operator to learn solution operators in function spaces and uses physics losses during training to penalize deviations from known physics laws. Spectral differentiation provides an efficient way to compute derivatives for the physics losses, but it inherently assumes periodicity. When applied to non-periodic functions, this assumption can lead to significant errors, including Gibbs phenomena near domain boundaries, which degrade the accuracy of both function representations and derivative computations, especially for higher-order derivatives. To overcome this limitation, we introduce the FC-PINO (Fourier-Continuation-based PINO) architecture, which extends the accuracy and efficiency of PINO and spectral differentiation to non-periodic and non-smooth PDEs. In FC-PINO, we propose integrating Fourier continuation into the PINO framework and test two different continuation approaches: FC-Legendre and FC-Gram. By transforming non-periodic signals into periodic functions on extended domains in a well-conditioned manner, Fourier continuation enables fast and accurate derivative computations. This approach avoids the discretization sensitivity of finite differences and the memory overhead of automatic differentiation. We demonstrate that standard PINO struggles to solve non-periodic and non-smooth PDEs with high precision across challenging benchmarks. In contrast, the proposed FC-PINO provides accurate, robust, and scalable solutions, substantially outperforming PINO alternatives, demonstrating that Fourier continuation is critical for extending PINO to a wider range of PDE problems when high-precision solutions are needed.
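The two phenomena described in the abstract can be illustrated with a minimal NumPy sketch: spectral differentiation is essentially exact for periodic functions but produces large Gibbs-type oscillations near the boundaries of a non-periodic one, and continuing the signal to a periodic function on an extended domain tames that error. Note this sketch uses a simple even (mirror) extension as a stand-in for the paper's better-conditioned FC-Legendre/FC-Gram continuations; all function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

def spectral_derivative(u, length):
    """d/dx via FFT: multiply Fourier modes by i*k (implicitly assumes periodicity)."""
    n = len(u)
    k = 2j * np.pi * np.fft.fftfreq(n, d=length / n)  # wavenumbers
    return np.real(np.fft.ifft(k * np.fft.fft(u)))

n, L = 256, 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)

# Periodic function: error at machine-precision level.
err_periodic = np.max(np.abs(spectral_derivative(np.sin(x), L) - np.cos(x)))

# Non-periodic function u(x) = x: the FFT sees a sawtooth, and the computed
# derivative (which should be 1 everywhere) oscillates wildly near the
# domain boundaries -- the Gibbs phenomenon.
err_ramp = np.max(np.abs(spectral_derivative(x, L) - 1.0))

# Naive continuation: an even (mirror) extension makes the signal a
# continuous, periodic triangle wave on [0, 2L). The derivative on the
# interior of the original domain becomes far more accurate.
u_ext = np.concatenate([x, x[::-1]])
du_ext = spectral_derivative(u_ext, 2 * L)
err_continued = np.max(np.abs(du_ext[n // 4 : 3 * n // 4] - 1.0))  # interior slice

print(err_periodic, err_ramp, err_continued)
```

The mirror extension still leaves derivative kinks at the seams, which is why FC-PINO uses well-conditioned polynomial-based continuations (FC-Legendre, FC-Gram) that extend the signal smoothly rather than by simple reflection.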
Related papers
- Enhancing Physics-Informed Neural Networks with Domain-aware Fourier Features: Towards Improved Performance and Interpretable Results [0.10547191657550363]
Physics-Informed Neural Networks (PINNs) incorporate physics into neural networks by embedding partial differential equations (PDEs) into their loss function. In this work, a novel modeling approach is proposed, which relies on the use of Domain-aware Fourier Features (DaFFs) for the positional encoding of the input space.
arXiv Detail & Related papers (2026-03-03T12:55:53Z) - Physics-Informed Laplace Neural Operator for Solving Partial Differential Equations [11.064132774859553]
Physics-Informed Laplace Neural Operator (PILNO) is a fast surrogate solver for partial differential equations. It embeds physics into training through PDE, boundary condition, and initial condition residuals. PILNO consistently improves accuracy in small-data settings, reduces run-to-run variability across random seeds, and achieves stronger generalization than purely data-driven baselines.
arXiv Detail & Related papers (2026-02-13T08:19:40Z) - KP-PINNs: Kernel Packet Accelerated Physics Informed Neural Networks [12.73776469872022]
We propose a new PINNs framework named Kernel Packet accelerated PINNs (KP-PINNs). KP-PINNs express the loss function using the reproducing kernel Hilbert space (RKHS) norm and use the Kernel Packet method to accelerate its computation. Numerical experiments illustrate that KP-PINNs can solve differential equations effectively and efficiently.
arXiv Detail & Related papers (2025-06-10T08:33:34Z) - Sensitivity-Constrained Fourier Neural Operators for Forward and Inverse Problems in Parametric Differential Equations [6.900101619562999]
Parametric differential equations of the form du/dt = f(u, x, t, p) are fundamental in science and engineering. Deep learning frameworks such as the Fourier Neural Operator (FNO) can efficiently approximate solutions, but struggle with inverse problems, sensitivity estimation (du/dp), and concept drift. We address these limitations by introducing a sensitivity-based regularization strategy, called Sensitivity-Constrained Fourier Neural Operators (SC-FNO). SC-FNO achieves high accuracy in predicting solution paths and consistently outperforms standard FNO and FNO with physics-informed regularization.
arXiv Detail & Related papers (2025-05-13T16:54:10Z) - Enabling Automatic Differentiation with Mollified Graph Neural Operators [73.52999622724101]
We propose the mollified graph neural operator ($m$GNO), the first method to leverage automatic differentiation and compute exact gradients on arbitrary geometries. For a PDE example on regular grids, $m$GNO paired with autograd reduced the L2 relative data error by 20x compared to finite differences. It can also solve PDEs on unstructured point clouds seamlessly, using physics losses only, at resolutions vastly lower than those needed for finite differences to be accurate enough.
arXiv Detail & Related papers (2025-04-11T06:16:30Z) - DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Spectral-Refiner: Accurate Fine-Tuning of Spatiotemporal Fourier Neural Operator for Turbulent Flows [6.961408873053586]
Recent advances in operator-type neural networks have shown promising results in approximating Partial Differential Equations (PDEs).
These neural networks entail considerable training expenses, and may not always achieve the desired accuracy required in many scientific and engineering disciplines.
arXiv Detail & Related papers (2024-05-27T14:33:06Z) - Toward a Better Understanding of Fourier Neural Operators from a Spectral Perspective [4.315136713224842]
SpecB-FNO achieves better prediction accuracy on diverse PDE applications, with an average improvement of 50%.
This paper offers empirical insights into FNO's difficulty with large kernels through spectral analysis.
arXiv Detail & Related papers (2024-04-10T17:58:04Z) - Invertible Fourier Neural Operators for Tackling Both Forward and Inverse Problems [18.48295539583625]
We propose an invertible Fourier Neural Operator (iFNO) that tackles both the forward and inverse problems.
We integrated a variational auto-encoder to capture the intrinsic structures within the input space and to enable posterior inference.
The evaluations on five benchmark problems have demonstrated the effectiveness of our approach.
arXiv Detail & Related papers (2024-02-18T22:16:43Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Fourier-MIONet: Fourier-enhanced multiple-input neural operators for multiphase modeling of geological carbon sequestration [3.3058870667947646]
Multiphase flow in porous media is essential to understanding CO$_2$ migration and pressure fields in the subsurface associated with geological carbon sequestration (GCS).
Here, we develop a Fourier-enhanced multiple-input neural operator (Fourier-MIONet) to learn the solution operator of the problem of multiphase flow in porous media.
Compared to the enhanced FNO (U-FNO), the proposed Fourier-MIONet has 90% fewer unknown parameters, and it can be trained in significantly less time.
arXiv Detail & Related papers (2023-03-08T18:20:56Z) - Incremental Spatial and Spectral Learning of Neural Operators for Solving Large-Scale PDEs [86.35471039808023]
We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Our method demonstrates a 10% lower testing error, using 20% fewer frequency modes compared to the existing Fourier Neural Operator, while also achieving a 30% faster training.
arXiv Detail & Related papers (2022-11-28T09:57:15Z) - Fourier Neural Operator with Learned Deformations for PDEs on General Geometries [75.91055304134258]
We propose a new framework, viz., geo-FNO, to solve PDEs on arbitrary geometries.
Geo-FNO learns to deform the input (physical) domain, which may be irregular, into a latent space with a uniform grid.
We consider a variety of PDEs such as the Elasticity, Plasticity, Euler's, and Navier-Stokes equations, and both forward modeling and inverse design problems.
arXiv Detail & Related papers (2022-07-11T21:55:47Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Learning in Sinusoidal Spaces with Physics-Informed Neural Networks [22.47355575565345]
A physics-informed neural network (PINN) uses physics-augmented loss functions to ensure its output is consistent with fundamental physics laws.
It turns out to be difficult to train an accurate PINN model for many problems in practice.
arXiv Detail & Related papers (2021-09-20T07:42:41Z) - Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases [73.53227696624306]
We present a new family of algorithms for learning Fourier-sparse set functions.
In contrast to other work that focused on the Walsh-Hadamard transform, our novel algorithms operate with recently introduced non-orthogonal Fourier transforms.
We demonstrate effectiveness on several real-world applications.
arXiv Detail & Related papers (2020-10-01T14:31:59Z) - Fourier Neural Networks as Function Approximators and Differential Equation Solvers [0.456877715768796]
The choice of activation and loss function yields results that replicate a Fourier series expansion closely.
We validate this FNN on naturally periodic smooth functions and on piecewise continuous periodic functions.
The main advantages of the current approach are the validity of the solution outside the training region, interpretability of the trained model, and simplicity of use.
arXiv Detail & Related papers (2020-05-27T00:30:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.