Operator Learning with Neural Fields: Tackling PDEs on General
Geometries
- URL: http://arxiv.org/abs/2306.07266v2
- Date: Thu, 30 Nov 2023 13:59:53 GMT
- Title: Operator Learning with Neural Fields: Tackling PDEs on General
Geometries
- Authors: Louis Serrano, Lise Le Boudec, Armand Kassaï Koupaï, Thomas X
Wang, Yuan Yin, Jean-Noël Vittaut, Patrick Gallinari
- Abstract summary: Machine learning approaches for solving partial differential equations require learning mappings between function spaces.
The new CORAL method leverages coordinate-based networks to solve PDEs on general geometries.
- Score: 15.65577053925333
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning approaches for solving partial differential equations
require learning mappings between function spaces. While convolutional or graph
neural networks are constrained to discretized functions, neural operators
present a promising milestone toward mapping functions directly. Despite
impressive results, they still face challenges with respect to the domain
geometry and typically rely on some form of discretization. In order to
alleviate such limitations, we present CORAL, a new method that leverages
coordinate-based networks for solving PDEs on general geometries. CORAL is
designed to remove constraints on the input mesh, making it applicable to any
spatial sampling and geometry. Its ability extends to diverse problem domains,
including PDE solving, spatio-temporal forecasting, and inverse problems like
geometric design. CORAL demonstrates robust performance across multiple
resolutions and performs well in both convex and non-convex domains, surpassing
or performing on par with state-of-the-art models.
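To make the coordinate-based idea concrete, below is a minimal, hypothetical sketch of a neural field in PyTorch: an MLP that maps spatial coordinates directly to field values, so it can be queried on any point cloud without a mesh. The layer sizes and the SIREN-style sine activation are illustrative assumptions, not the authors' CORAL architecture (which additionally learns an operator acting on such fields).

```python
# Minimal sketch (not the CORAL implementation): a coordinate-based network
# ("neural field") maps spatial coordinates directly to field values, so it
# can be evaluated at any point of any geometry, independent of a mesh.
import torch
import torch.nn as nn

class SirenLayer(nn.Module):
    """Sine-activated linear layer, a common choice for neural fields."""
    def __init__(self, in_dim, out_dim, omega=30.0):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.omega = omega

    def forward(self, x):
        return torch.sin(self.omega * self.linear(x))

class NeuralField(nn.Module):
    """Maps coordinates (batch, coord_dim) to scalar field values (batch, 1)."""
    def __init__(self, coord_dim=2, hidden=64, depth=3):
        super().__init__()
        layers = [SirenLayer(coord_dim, hidden)]
        layers += [SirenLayer(hidden, hidden) for _ in range(depth - 1)]
        self.body = nn.Sequential(*layers)
        self.head = nn.Linear(hidden, 1)

    def forward(self, coords):
        return self.head(self.body(coords))

# The same field can be evaluated on any spatial sampling, e.g. a random
# point cloud over a non-convex domain; no mesh or fixed grid is required.
field = NeuralField(coord_dim=2)
points = torch.rand(1024, 2)   # arbitrary query locations in [0, 1]^2
values = field(points)         # (1024, 1) predicted field values
```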
Related papers
- Shape-informed surrogate models based on signed distance function domain encoding [8.052704959617207]
We propose a non-intrusive method to build surrogate models that approximate the solution of parameterized partial differential equations (PDEs).
Our approach is based on the combination of two neural networks (NNs).
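As a rough illustration of the summary above (not the paper's actual architecture; all names, shapes, and layer sizes are hypothetical): the domain geometry is described by its signed distance function sampled on a reference grid, one network compresses that SDF into a latent code, and a second network maps the code together with PDE parameters to the surrogate solution.

```python
# Minimal sketch (assumed reading of the abstract, hypothetical shapes):
# NN #1 encodes the domain via its signed distance function (SDF),
# NN #2 maps the geometry code plus PDE parameters to the surrogate field.
import torch
import torch.nn as nn

sdf_encoder = nn.Sequential(           # NN #1: SDF samples -> latent geometry code
    nn.Linear(32 * 32, 256), nn.ReLU(), nn.Linear(256, 64))
solution_net = nn.Sequential(          # NN #2: (code, PDE params) -> solution values
    nn.Linear(64 + 3, 256), nn.ReLU(), nn.Linear(256, 32 * 32))

sdf_grid = torch.randn(8, 32 * 32)     # SDF of 8 domains sampled on a 32x32 grid
pde_params = torch.rand(8, 3)          # e.g. diffusivity and source parameters
code = sdf_encoder(sdf_grid)
u_hat = solution_net(torch.cat([code, pde_params], dim=-1))  # surrogate solution
```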
arXiv Detail & Related papers (2024-09-19T01:47:04Z) - Spatio-spectral graph neural operator for solving computational mechanics problems on irregular domain and unstructured grid [0.9208007322096533]
We introduce the Spatio-Spectral Graph Neural Operator (Sp$^2$GNO), which integrates spatial and spectral GNNs effectively.
This framework mitigates the limitations of individual methods and enables the learning of solution operators across arbitrary geometries.
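A minimal sketch of how one layer might combine a spatial and a spectral graph pass, assuming a normalized adjacency matrix for local message passing and the lowest graph-Laplacian eigenvectors for global spectral filtering. This is an interpretation of the abstract, not the Sp$^2$GNO implementation; all sizes are made up.

```python
# Minimal sketch (assumed, not the Sp^2GNO code): one layer combining a
# *spatial* update (neighbourhood averaging via the adjacency matrix) with a
# *spectral* update (a learned filter on graph-Laplacian eigenmodes), so
# information propagates both locally and globally on an unstructured mesh.
import torch
import torch.nn as nn

class SpatioSpectralLayer(nn.Module):
    def __init__(self, width, num_modes):
        super().__init__()
        self.spatial = nn.Linear(width, width)
        self.spectral_weight = nn.Parameter(torch.randn(num_modes, width, width) * 0.02)
        self.mix = nn.Linear(2 * width, width)

    def forward(self, x, adj_norm, eigvecs):
        # x: (n, width) node features, adj_norm: (n, n) normalized adjacency,
        # eigvecs: (n, num_modes) lowest Laplacian eigenvectors.
        spatial = torch.relu(self.spatial(adj_norm @ x))         # local pass
        coeffs = eigvecs.T @ x                                   # project onto modes
        filtered = torch.einsum('mi,mio->mo', coeffs, self.spectral_weight)
        spectral = eigvecs @ filtered                            # back to nodes
        return torch.relu(self.mix(torch.cat([spatial, spectral], dim=-1)))

# Usage on a random unstructured point cloud turned into a graph:
n, width, modes = 200, 32, 16
pos = torch.rand(n, 2)
adj = (torch.cdist(pos, pos) < 0.15).float()
adj_norm = adj / adj.sum(dim=1, keepdim=True).clamp(min=1)
lap = torch.diag(adj.sum(1)) - adj
eigvals, eigvecs = torch.linalg.eigh(lap)
layer = SpatioSpectralLayer(width, modes)
out = layer(torch.randn(n, width), adj_norm, eigvecs[:, :modes])
```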
arXiv Detail & Related papers (2024-09-01T03:59:40Z) - Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
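A rough sketch of the codomain (channel) tokenization idea described above, with hypothetical shapes: each physical variable of a discretized field becomes one attention token, so the same weights can process systems with different numbers of variables. This is an assumed reading of the abstract, not the CoDA-NO code.

```python
# Minimal sketch (assumed interpretation, not the CoDA-NO implementation):
# tokenize a multiphysics field along its codomain/channel axis, then apply
# attention across those tokens. Because attention accepts a variable number
# of tokens, the same weights can be reused across PDE systems.
import torch
import torch.nn as nn

class ChannelTokenAttention(nn.Module):
    def __init__(self, n_points, d_model=128, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(n_points, d_model)   # one token per variable
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.readout = nn.Linear(d_model, n_points)

    def forward(self, fields):
        # fields: (batch, n_channels, n_points), e.g. velocity-x, velocity-y,
        # pressure, each sampled on the same point set.
        tokens = self.embed(fields)                   # (b, c, d_model)
        mixed, _ = self.attn(tokens, tokens, tokens)  # cross-variable mixing
        return self.readout(mixed)                    # back to field values

model = ChannelTokenAttention(n_points=256)
fluid = torch.randn(8, 3, 256)      # 3 variables (u, v, p)
elastic = torch.randn(8, 5, 256)    # 5 variables from another physics
out_fluid, out_elastic = model(fluid), model(elastic)  # same weights for both
```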
arXiv Detail & Related papers (2024-03-19T08:56:20Z) - A Stable and Scalable Method for Solving Initial Value PDEs with Neural
Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
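For context, the generic "follow the ODE in parameter space" scheme being critiqued can be sketched as below: a small network represents the solution at the current time, and its parameters are advanced by least-squares projection of the PDE right-hand side onto the parameter Jacobian. The heat equation, the tiny MLP, and explicit Euler stepping are illustrative assumptions; the paper's contribution is a stabilized, scalable version of this kind of solver, not this naive loop.

```python
# Minimal sketch (the vanilla parameter-ODE baseline, not the paper's method):
# evolve network parameters theta so that u_theta tracks the PDE u_t = u_xx.
# At each step, solve J(theta) dtheta ~= f in the least-squares sense, where
# J is the Jacobian of the network outputs at collocation points.
import torch

HIDDEN = 16

def unpack(theta):
    """Split a flat parameter vector into the weights of a tiny 1-hidden-layer MLP."""
    i = 0
    W1 = theta[i:i + HIDDEN].reshape(HIDDEN, 1); i += HIDDEN
    b1 = theta[i:i + HIDDEN]; i += HIDDEN
    W2 = theta[i:i + HIDDEN].reshape(1, HIDDEN); i += HIDDEN
    b2 = theta[i:i + 1]
    return W1, b1, W2, b2

def u(theta, x):
    """Network solution u_theta(x) for collocation points x of shape (n, 1)."""
    W1, b1, W2, b2 = unpack(theta)
    return torch.tanh(x @ W1.T + b1) @ W2.T + b2

def rhs(theta, x):
    """PDE right-hand side u_xx (heat equation), computed with autograd."""
    x = x.clone().requires_grad_(True)
    out = u(theta, x)
    du = torch.autograd.grad(out.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x)[0]
    return d2u.detach().squeeze(-1)

theta = torch.randn(3 * HIDDEN + 1) * 0.1        # flat parameter vector
x = torch.linspace(-1.0, 1.0, 64).unsqueeze(-1)  # collocation points
dt = 1e-3

for _ in range(100):  # explicit Euler in time, "following the ODE" in theta
    J = torch.autograd.functional.jacobian(lambda t: u(t, x).squeeze(-1), theta)
    f = rhs(theta, x)
    dtheta = torch.linalg.lstsq(J, f.unsqueeze(-1)).solution.squeeze(-1)
    theta = theta + dt * dtheta
```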
arXiv Detail & Related papers (2023-04-28T17:28:18Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
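The "solve in latent space" idea can be sketched minimally as follows, with made-up layer sizes: project the input field onto a small number of latent coefficients, apply a learned operator to those coefficients, and decode back to physical space. This is an assumed simplification, not the LSM architecture.

```python
# Minimal sketch (assumed reading of the abstract, not the LSM code): encode
# a high-dimensional field into a few latent "spectral" coefficients, apply a
# learned operator in that compact latent space, then expand back.
import torch
import torch.nn as nn

class LatentSpectralSolver(nn.Module):
    def __init__(self, n_points, n_modes=32):
        super().__init__()
        self.encode = nn.Linear(n_points, n_modes)    # field -> latent coefficients
        self.operate = nn.Sequential(                 # operator in latent space
            nn.Linear(n_modes, n_modes), nn.GELU(), nn.Linear(n_modes, n_modes))
        self.decode = nn.Linear(n_modes, n_points)    # latent coefficients -> field

    def forward(self, a):
        # a: (batch, n_points) input field, e.g. the PDE coefficients or state
        return self.decode(self.operate(self.encode(a)))

solver = LatentSpectralSolver(n_points=1024)
u0 = torch.randn(16, 1024)
u1 = solver(u0)   # predicted solution field on the same discretization
```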
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Neural PDE Solvers for Irregular Domains [25.673617202478606]
We present a framework to neurally solve partial differential equations over domains with irregularly shaped geometric boundaries.
Our network takes in the shape of the domain as an input and is able to generalize to novel (unseen) irregular domains.
arXiv Detail & Related papers (2022-11-07T00:00:30Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
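To illustrate the claim that such solvers representationally contain classical schemes, here is a toy sketch (not the paper's solver): a message-passing time step on a 1D chain whose hand-set message function reproduces the finite-difference Laplacian. A learned solver would replace that message function with a small neural network.

```python
# Minimal sketch (illustration only): message passing on a 1D chain of nodes.
# With the hand-set message below, one update step reduces to an explicit
# Euler finite-difference step for the heat equation u_t = u_xx, showing how
# message-passing solvers contain finite differences as a special case.
import torch

def message_passing_step(u, edges, message_fn, dt):
    # u: (n,) nodal values; edges: list of (i, j) pairs, message sent j -> i
    agg = torch.zeros_like(u)
    for i, j in edges:
        agg[i] += message_fn(u[i], u[j])
    return u + dt * agg

n, h, dt = 64, 1.0 / 64, 1e-4
x = torch.arange(n, dtype=torch.float32) * h
u = torch.sin(2 * torch.pi * x)
edges = [(i, i - 1) for i in range(1, n)] + [(i, i + 1) for i in range(n - 1)]
fd_message = lambda ui, uj: (uj - ui) / h**2   # finite-difference stencil for u_xx

for _ in range(100):   # stable since dt / h^2 < 0.5
    u = message_passing_step(u, edges, fd_message, dt)
```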
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Non-linear Independent Dual System (NIDS) for Discretization-independent
Surrogate Modeling over Complex Geometries [0.0]
Non-linear independent dual system (NIDS) is a deep learning surrogate model for discretization-independent, continuous representation of PDE solutions.
NIDS can be used for prediction over domains with complex, variable geometries and mesh topologies.
Test cases include a vehicle problem with complex geometry and data scarcity, enabled by a training method.
arXiv Detail & Related papers (2021-09-14T23:38:41Z) - Solving PDEs on Unknown Manifolds with Machine Learning [8.220217498103315]
This paper presents a mesh-free computational framework and machine learning theory for solving elliptic PDEs on unknown manifolds.
We show that the proposed NN solver can robustly generalize the PDE solution to new data points, with generalization errors that are almost identical to the training errors.
arXiv Detail & Related papers (2021-06-12T03:55:15Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
The purpose of this work is to generalize neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network has the desired properties and shows competitive performance compared to state-of-the-art solvers.
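A minimal sketch of one kernel integral operator layer as the abstract describes it, with assumed shapes and a hypothetical neighbourhood graph (not the released implementation): the integral (Kv)(x) = ∫ k(x, y) v(y) dy is approximated by a mean over graph neighbours, with the kernel k parameterized by a small neural network on pairs of coordinates.

```python
# Minimal sketch (assumed reading, not the released graph kernel network):
# a single kernel-integral-operator layer, where the kernel kappa(x_i, x_j)
# is a small MLP and the integral is a Monte Carlo mean over graph neighbours.
import torch
import torch.nn as nn

class KernelIntegralLayer(nn.Module):
    def __init__(self, coord_dim=2, width=32):
        super().__init__()
        # kappa maps a coordinate pair (x_i, x_j) to a width x width matrix
        self.kappa = nn.Sequential(
            nn.Linear(2 * coord_dim, 64), nn.ReLU(), nn.Linear(64, width * width))
        self.width = width

    def forward(self, v, pos, edge_index):
        # v: (n, width) node features, pos: (n, coord_dim), edge_index: (e, 2)
        src, dst = edge_index[:, 0], edge_index[:, 1]
        k = self.kappa(torch.cat([pos[dst], pos[src]], dim=-1))
        k = k.view(-1, self.width, self.width)                   # (e, w, w)
        msgs = torch.bmm(k, v[src].unsqueeze(-1)).squeeze(-1)    # kernel * v(y)
        out = torch.zeros_like(v).index_add_(0, dst, msgs)       # sum over nbrs
        deg = torch.zeros(v.shape[0]).index_add_(0, dst, torch.ones(dst.shape[0]))
        return torch.relu(out / deg.clamp(min=1).unsqueeze(-1))  # mean = MC integral

layer = KernelIntegralLayer()
pos = torch.rand(100, 2)
edge_index = (torch.cdist(pos, pos) < 0.2).nonzero()   # (e, 2) neighbour pairs
v_out = layer(torch.randn(100, 32), pos, edge_index)
```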
arXiv Detail & Related papers (2020-03-07T01:56:20Z)