A Physics-informed Multi-resolution Neural Operator
- URL: http://arxiv.org/abs/2510.23810v1
- Date: Mon, 27 Oct 2025 19:50:02 GMT
- Title: A Physics-informed Multi-resolution Neural Operator
- Authors: Sumanta Roy, Bahador Bahmani, Ioannis G. Kevrekidis, Michael D. Shields
- Abstract summary: We introduce a physics-informed operator learning approach by extending the Resolution Independent Neural Operator (RINO) framework to a fully data-free setup.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The predictive accuracy of operator learning frameworks depends on the quality and quantity of available training data (input-output function pairs), often requiring substantial amounts of high-fidelity data, which can be challenging to obtain in some real-world engineering applications. These datasets may be unevenly discretized from one realization to another, with the grid resolution varying across samples. In this study, we introduce a physics-informed operator learning approach by extending the Resolution Independent Neural Operator (RINO) framework to a fully data-free setup, addressing both challenges simultaneously. Here, the arbitrarily (but sufficiently finely) discretized input functions are projected onto a latent embedding space (i.e., a vector space of finite dimensions), using pre-trained basis functions. The operator associated with the underlying partial differential equations (PDEs) is then approximated by a simple multi-layer perceptron (MLP), which takes as input a latent code along with spatiotemporal coordinates to produce the solution in the physical space. The PDEs are enforced via a finite difference solver in the physical space. The validation and performance of the proposed method are benchmarked on several numerical examples with multi-resolution data, where input functions are sampled at varying resolutions, including both coarse and fine discretizations.
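The pipeline described in the abstract (project an arbitrarily discretized input function onto pre-trained basis functions to obtain a fixed-size latent code, then enforce the PDE via finite differences in physical space) can be sketched roughly as follows. The basis functions, grid sizes, and least-squares projection below are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def project_to_latent(x_pts, f_vals, basis_fns):
    """Least-squares projection of a discretized input function onto
    a set of (here, stand-in) pre-trained basis functions. The latent
    code has a fixed length regardless of the grid resolution."""
    Phi = np.stack([phi(x_pts) for phi in basis_fns], axis=1)  # (n_pts, n_basis)
    code, *_ = np.linalg.lstsq(Phi, f_vals, rcond=None)
    return code

def fd_residual(u, dx):
    """Second-order central finite difference approximating u'' at
    interior points; in a physics-informed loss this residual would be
    driven toward the PDE's right-hand side."""
    return (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2

# Example: the same input function sampled at a coarse and a fine grid
# maps to a latent code of the same, fixed dimension.
basis = [np.sin, np.cos, lambda x: np.sin(2 * x)]  # stand-in "pre-trained" basis
codes = {}
for n in (17, 65):  # coarse and fine discretizations
    x = np.linspace(0.0, 1.0, n)
    codes[n] = project_to_latent(x, np.sin(np.pi * x), basis)
```

In the paper's setup, an MLP would then map `(code, x, t)` to the solution value, with the finite-difference residual supplying the training signal in place of labeled output data.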
Related papers
- Data-Driven Self-Supervised Learning for the Discovery of Solution Singularity for Partial Differential Equations [0.0]
The appearance of singularities in the function of interest constitutes a fundamental challenge in scientific computing.
We propose a self-supervised learning framework for estimating the location of the singularity.
Various experiments are presented to demonstrate the ability of the proposed approach to deal with input perturbation, label corruption, and different kinds of singularities.
arXiv Detail & Related papers (2025-06-29T17:39:41Z) - A Multimodal PDE Foundation Model for Prediction and Scientific Text Descriptions [13.48986376824454]
PDE foundation models utilize neural networks to train approximations to multiple differential equations simultaneously.
We propose a novel multimodal deep learning approach that leverages a transformer-based architecture to approximate solution operators.
Our approach generates interpretable scientific text descriptions, offering deeper insights into the underlying dynamics and solution properties.
arXiv Detail & Related papers (2025-02-09T20:50:28Z) - Diffeomorphic Latent Neural Operators for Data-Efficient Learning of Solutions to Partial Differential Equations [5.308435208832696]
A computed approximation of the solution operator to a system of partial differential equations (PDEs) is needed in various areas of science and engineering.
We propose that, in order to learn a PDE solution operator that generalizes across multiple domains without requiring large amounts of training data, we can instead train a latent neural operator on just a few ground-truth solution fields.
arXiv Detail & Related papers (2024-11-27T03:16:00Z) - Physics-informed Discretization-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to various discrete representations of PDE parameters and irregular domain shapes.
Inspired by deep operator neural networks, our model learns a discretization-independent embedding of the PDE parameters.
Numerical results demonstrate the accuracy and efficiency of the proposed method.
arXiv Detail & Related papers (2024-04-21T12:41:30Z) - Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
arXiv Detail & Related papers (2024-03-19T08:56:20Z) - D2NO: Efficient Handling of Heterogeneous Input Function Spaces with Distributed Deep Neural Operators [7.119066725173193]
We propose a novel distributed approach to deal with input functions that exhibit heterogeneous properties.
A central neural network is used to handle shared information across all output functions.
We demonstrate that the corresponding neural network is a universal approximator of continuous nonlinear operators.
arXiv Detail & Related papers (2023-10-29T03:29:59Z) - Score-based Diffusion Models in Function Space [137.70916238028306]
Diffusion models have recently emerged as a powerful framework for generative modeling.
This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations: physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
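As a rough illustration of the neural message passing idea in the Message Passing Neural PDE Solvers entry above (each grid node aggregates learned messages from its neighbors, a trainable analogue of a finite-difference stencil), consider the following sketch. The two-layer perceptrons, random untrained weights, and 1D chain graph are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

def mlp(z, W1, W2):
    # Tiny two-layer perceptron standing in for a learned network
    # (weights are random here; a real solver would train them).
    return np.tanh(z @ W1) @ W2

def message_passing_step(u, x, W_msg1, W_msg2, W_upd1, W_upd2):
    """One neural message-passing update on a 1D chain of grid nodes:
    messages are computed from neighbor differences, aggregated per
    node, and fed to an update network that adjusts the node state."""
    n, d = len(u), W_msg2.shape[1]
    agg = np.zeros((n, d))
    for i in range(n):
        for j in (i - 1, i + 1):          # chain-graph neighbors
            if 0 <= j < n:
                edge_feat = np.array([u[j] - u[i], x[j] - x[i]])
                agg[i] += mlp(edge_feat, W_msg1, W_msg2)
    node_feat = np.concatenate([u[:, None], agg], axis=1)
    return u + mlp(node_feat, W_upd1, W_upd2)[:, 0]

# Random (untrained) weights, just to exercise the update once.
rng = np.random.default_rng(0)
hidden, msg_dim = 8, 4
W_msg1 = rng.normal(size=(2, hidden)) * 0.1
W_msg2 = rng.normal(size=(hidden, msg_dim)) * 0.1
W_upd1 = rng.normal(size=(1 + msg_dim, hidden)) * 0.1
W_upd2 = rng.normal(size=(hidden, 1)) * 0.1
```

Because the message depends on differences `u[j] - u[i]` and `x[j] - x[i]`, the update is translation-invariant in both value and position, which is part of what lets such solvers subsume classical stencils like finite differences.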
This list is automatically generated from the titles and abstracts of the papers in this site.