Diffeomorphic Latent Neural Operators for Data-Efficient Learning of Solutions to Partial Differential Equations
- URL: http://arxiv.org/abs/2411.18014v2
- Date: Fri, 29 Nov 2024 18:57:12 GMT
- Title: Diffeomorphic Latent Neural Operators for Data-Efficient Learning of Solutions to Partial Differential Equations
- Authors: Zan Ahmad, Shiyi Chen, Minglang Yin, Avisha Kumar, Nicolas Charon, Natalia Trayanova, Mauro Maggioni
- Abstract summary: A computed approximation of the solution operator to a system of partial differential equations (PDEs) is needed in various areas of science and engineering.
We propose that, in order to learn a PDE solution operator that generalizes across multiple domains without needing to sample data expressive enough to cover all possible geometries, we can instead train a latent neural operator on just a few ground truth solution fields.
- Score: 5.308435208832696
- License:
- Abstract: A computed approximation of the solution operator to a system of partial differential equations (PDEs) is needed in various areas of science and engineering. Neural operators have been shown to be quite effective at predicting these solution generators after training on high-fidelity ground truth data (e.g., numerical simulations). However, in order to generalize well to unseen spatial domains, neural operators must be trained on an extensive amount of geometrically varying data samples that may not be feasible to acquire or simulate in certain contexts (e.g., patient-specific medical data or large-scale, computationally intensive simulations). We propose that, in order to learn a PDE solution operator that can generalize across multiple domains without needing to sample data expressive enough for all possible geometries, we can instead train a latent neural operator on just a few ground truth solution fields diffeomorphically mapped from different geometric/spatial domains to a fixed reference configuration. Furthermore, the form of the solutions depends on the choice of mapping to and from the reference domain. We emphasize that preserving properties of the differential operator when constructing these mappings can significantly reduce the data requirement for achieving an accurate model, owing to the regularity of the solution fields on which the latent neural operator is trained. We provide motivating numerical experiments that demonstrate an extreme case of this consideration by exploiting the conformal invariance of the Laplacian.
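The property exploited in those experiments is the two-dimensional identity Δ(u∘φ) = |φ′|² (Δu)∘φ for holomorphic (hence conformal) φ: a harmonic field pulled back through a conformal map stays harmonic, so the latent operator only ever sees Laplace solutions on the reference domain. Below is a minimal, self-contained sketch of that pipeline, an illustration rather than the authors' implementation: fields on randomly deformed disks are pulled back through a Möbius map to a fixed reference disk, and a small MLP surrogate is trained entirely there. The Möbius map, the toy harmonic data u(z) = Re(c z), and the network are all assumed stand-ins.

```python
# Hedged sketch (not the authors' code): train a latent operator on a
# fixed reference disk, feeding it solution fields pulled back from
# varying domains through a conformal (Moebius) map.
import torch
import torch.nn as nn

def moebius(z, a):
    """Conformal automorphism of the unit disk, w = (z - a) / (1 - conj(a) z)."""
    return (z - a) / (1 - a.conj() * z)

# Collocation points in the reference unit disk (complex coordinates).
n = 512
w_ref = torch.polar(torch.rand(n).sqrt(), 2 * torch.pi * torch.rand(n))
xy = torch.stack([w_ref.real, w_ref.imag], dim=-1)          # (n, 2)

# Latent surrogate: maps (reference point, domain/data parameters)
# to the pulled-back solution value at that point.
net = nn.Sequential(nn.Linear(5, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(500):
    # Random domain parameter a (|a| < 0.4) and field parameter c;
    # u(z) = Re(c z) is harmonic on the deformed domain.
    a = 0.4 * torch.polar(torch.rand(()), 2 * torch.pi * torch.rand(()))
    c = torch.rand(())
    z = moebius(w_ref, -a)       # inverse map: reference disk -> domain
    u_latent = (c * z).real      # pull-back; harmonic by conformal invariance
    params = torch.stack([a.real, a.imag, c]).expand(n, 3)
    pred = net(torch.cat([xy, params], dim=-1)).squeeze(-1)
    loss = ((pred - u_latent) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the pulled-back targets remain harmonic for every sampled domain, the regression problem the network faces is as regular as the single-domain case, which is the data-efficiency argument the abstract makes.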
Related papers
- A Multimodal PDE Foundation Model for Prediction and Scientific Text Descriptions [13.48986376824454]
PDE foundation models use neural networks to learn approximations to multiple differential equations simultaneously.
We propose a novel multimodal deep learning approach that leverages a transformer-based architecture to approximate solution operators.
Our approach generates interpretable scientific text descriptions, offering deeper insights into the underlying dynamics and solution properties.
arXiv Detail & Related papers (2025-02-09T20:50:28Z)
- DeltaPhi: Learning Physical Trajectory Residual for PDE Solving [54.13671100638092]
We propose and formulate Physical Trajectory Residual Learning (DeltaPhi).
We learn a surrogate model for the residual operator mapping, built on existing neural operator networks; a generic residual-learning sketch in this spirit appears after this list.
We conclude that, compared to direct learning, physical residual learning is preferred for PDE solving.
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
- Reference Neural Operators: Learning the Smooth Dependence of Solutions of PDEs on Geometric Deformations [13.208548352092455]
For partial differential equations on domains of arbitrary shape, existing neural operator approaches attempt to learn a mapping from geometries to solutions.
We propose reference neural operators (RNO) to learn the smooth dependence of solutions on geometric deformations.
RNO outperforms baseline models in accuracy by a large margin, achieving up to 80% error reduction.
arXiv Detail & Related papers (2024-05-27T06:50:17Z)
- Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z)
- Learning Only On Boundaries: a Physics-Informed Neural operator for Solving Parametric Partial Differential Equations in Complex Geometries [10.250994619846416]
We present a novel physics-informed neural operator method to solve parametrized boundary value problems without labeled data.
Our numerical experiments show the method's effectiveness on parametrized complex geometries and unbounded problems.
arXiv Detail & Related papers (2023-08-24T17:29:57Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Score-based Diffusion Models in Function Space [137.70916238028306]
Diffusion models have recently emerged as a powerful framework for generative modeling.
This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Learning the Solution Operator of Boundary Value Problems using Graph Neural Networks [0.0]
We design a general solution operator for two different time-independent PDEs using graph neural networks (GNNs) and spectral graph convolutions.
We train the networks on simulated data from a finite element solver on a variety of shapes and inhomogeneities.
We find that training on a diverse dataset with substantial variation in the finite element meshes is a key ingredient for achieving good generalization.
arXiv Detail & Related papers (2022-06-28T15:39:06Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science, as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, known as physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm that our multi-level graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
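As flagged in the DeltaPhi entry above, here is a generic residual-learning sketch: an illustrative framing under my own assumptions, not the paper's exact formulation of physical trajectory residuals. A frozen base surrogate supplies a first guess, and a second network is trained only on the correction.

```python
# Hedged sketch of residual operator learning (illustrative, not the
# DeltaPhi code): a second network learns the residual between the true
# solution and a frozen base surrogate's prediction.
import torch
import torch.nn as nn

# Stands in for a pretrained neural operator surrogate; frozen here.
base = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
for p in base.parameters():
    p.requires_grad_(False)

# Correction network trained on the residual only.
residual = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(residual.parameters(), lr=1e-3)

x = torch.linspace(-1, 1, 256).unsqueeze(-1)
u_true = torch.sin(torch.pi * x)       # stand-in ground-truth field

for step in range(300):
    pred = base(x) + residual(x)       # surrogate + learned correction
    loss = ((pred - u_true) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

The appeal of this framing is that the residual is typically smaller and smoother than the full solution, so the correction network needs less capacity and less data than direct learning.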