Neural Partial Differential Equations with Functional Convolution
- URL: http://arxiv.org/abs/2303.07194v1
- Date: Fri, 10 Mar 2023 04:25:38 GMT
- Title: Neural Partial Differential Equations with Functional Convolution
- Authors: Ziqian Wu, Xingzhe He, Yijun Li, Cheng Yang, Rui Liu, Shiying Xiong,
Bo Zhu
- Abstract summary: We present a lightweight neural PDE representation to discover the hidden structure and predict the solution of different nonlinear PDEs.
We leverage the prior of "translational similarity" of numerical PDE differential operators to drastically reduce the scale of the learning model and training data.
- Score: 30.35306295442881
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a lightweight neural PDE representation to discover the hidden
structure and predict the solution of different nonlinear PDEs. Our key idea is
to leverage the prior of ``translational similarity'' of numerical PDE
differential operators to drastically reduce the scale of the learning model and
training data. We implemented three central network components, including a
neural functional convolution operator, a Picard forward iterative procedure,
and an adjoint backward gradient calculator. Our novel paradigm fully leverages
the multifaceted priors that stem from the sparse and smooth nature of the
physical PDE solution manifold and the various mature numerical techniques such
as adjoint solver, linearization, and iterative procedure to accelerate the
computation. We demonstrate the efficacy of our method by robustly discovering
the model and accurately predicting the solutions of various types of PDEs with
small-scale networks and training sets. We highlight that all the PDE examples
shown were trained with at most 8 data samples and no more than 325 network
parameters.
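The forward pass can be pictured with a short sketch. The following is a minimal illustration, not the authors' code: a hypothetical `FunctionalConvolution1D` predicts a per-point stencil from the local solution patch (the "functional convolution"), and a simple Picard-style fixed-point loop drives the nonlinear system N(u) = f toward a solution. Patch size, relaxation factor, and network shape are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class FunctionalConvolution1D(nn.Module):
    """Hypothetical operator: predicts a 3-point stencil at each grid point
    from the local solution patch, then applies it (a 'functional convolution')."""
    def __init__(self, patch=3, hidden=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(patch, hidden), nn.Tanh(),
                                 nn.Linear(hidden, patch))  # only tens of parameters

    def forward(self, u):
        # u: (n,) values on a periodic grid -> local patches: (n, 3)
        patches = torch.stack([torch.roll(u, 1), u, torch.roll(u, -1)], dim=-1)
        stencils = self.net(patches)          # one learned stencil per point
        return (stencils * patches).sum(-1)   # N_theta(u), same shape as u

def picard_solve(op, f, u0, iters=20, relax=0.5):
    """Picard-style fixed-point iteration for N(u) = f: u <- u - relax*(N(u) - f)."""
    u = u0.clone()
    for _ in range(iters):
        u = u - relax * (op(u) - f)
    return u

op = FunctionalConvolution1D()
x = torch.linspace(0.0, 6.283, 64)
u = picard_solve(op, torch.sin(x), torch.zeros(64))   # forward prediction
```

The small parameter count quoted in the abstract is consistent with this kind of design: the learned network only maps a local patch to a stencil, while the surrounding iteration and gradient machinery are classical numerics.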
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
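A hedged sketch of what a product layer could look like (the exact ProdLayer formulation is not specified here, so the structure below is an assumption): feature channels are augmented with learned pairwise products, letting the network express dimensionally meaningful terms such as u · ∂u/∂x.

```python
import torch
import torch.nn as nn

class ProdLayerSketch(nn.Module):
    """Assumed structure: augment features with learned pairwise products."""
    def __init__(self, channels):
        super().__init__()
        self.a = nn.Linear(channels, channels)
        self.b = nn.Linear(channels, channels)
        self.mix = nn.Linear(2 * channels, channels)

    def forward(self, x):                 # x: (batch, grid, channels)
        prod = self.a(x) * self.b(x)      # product terms, e.g. u * du/dx
        return self.mix(torch.cat([x, prod], dim=-1))

y = ProdLayerSketch(channels=32)(torch.randn(4, 64, 32))
```

Because the layer keeps the channel count unchanged, it could drop into an FNO or transformer block between existing layers, which matches the "seamlessly integrated" claim.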
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Self-supervised Pretraining for Partial Differential Equations [0.0]
We describe a novel approach to building a neural PDE solver leveraging recent advances in transformer-based neural network architectures.
Our model can provide solutions for different values of PDE parameters without any need for retraining the network.
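One plausible way to get parameter-dependent solutions without retraining, sketched below as an assumption rather than the paper's actual architecture, is to feed the PDE parameters to a transformer as an extra token alongside the discretized solution.

```python
import torch
import torch.nn as nn

class ParamConditionedSolver(nn.Module):
    """Illustrative: prepend the PDE parameters as one extra token."""
    def __init__(self, d_model=64, n_params=2):
        super().__init__()
        self.embed_u = nn.Linear(1, d_model)         # per-grid-point embedding
        self.embed_p = nn.Linear(n_params, d_model)  # PDE-parameter token
        enc = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, num_layers=2)
        self.head = nn.Linear(d_model, 1)

    def forward(self, u, params):          # u: (B, N, 1), params: (B, n_params)
        tokens = torch.cat([self.embed_p(params).unsqueeze(1),
                            self.embed_u(u)], dim=1)
        return self.head(self.encoder(tokens)[:, 1:])  # drop the parameter token

u_next = ParamConditionedSolver()(torch.randn(8, 128, 1), torch.rand(8, 2))
```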
arXiv Detail & Related papers (2024-07-03T16:39:32Z)
- Automatic Differentiation is Essential in Training Neural Networks for Solving Differential Equations [7.890817997914349]
Neural network-based approaches have recently shown significant promise in solving partial differential equations (PDEs) in science and engineering.
One advantage of neural network methods for PDEs lies in their use of automatic differentiation (AD).
In this paper, we quantitatively demonstrate the advantage of AD in training neural networks.
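The advantage is easy to see in code. The sketch below (illustrative, using PyTorch's `torch.autograd.grad`) computes the exact residual of Burgers' equation u_t + u u_x = ν u_xx for a network ansatz, with no finite-difference stencils.

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))

def burgers_residual(x, t, nu=0.01):
    """Residual of u_t + u*u_x - nu*u_xx for u = net(x, t), via AD."""
    x = x.requires_grad_(True); t = t.requires_grad_(True)
    u = net(torch.stack([x, t], dim=-1)).squeeze(-1)
    u_x, = torch.autograd.grad(u.sum(), x, create_graph=True)
    u_t, = torch.autograd.grad(u.sum(), t, create_graph=True)
    u_xx, = torch.autograd.grad(u_x.sum(), x, create_graph=True)
    return u_t + u * u_x - nu * u_xx

res = burgers_residual(torch.rand(128), torch.rand(128))
loss = (res ** 2).mean()   # the network is trained by minimizing this residual
```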
arXiv Detail & Related papers (2024-05-23T02:01:05Z)
- Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
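A minimal sketch of the codomain-tokenization idea, with all architectural details assumed: each physical variable (channel) becomes one token, so self-attention mixes variables, and the same weights apply when new variables are appended.

```python
import torch
import torch.nn as nn

class CodomainAttentionSketch(nn.Module):
    """Illustrative: one token per physical variable, attention across variables."""
    def __init__(self, grid_size, d_model=64, nhead=4):
        super().__init__()
        self.lift = nn.Linear(grid_size, d_model)   # function values -> token
        self.attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.proj = nn.Linear(d_model, grid_size)

    def forward(self, u):                 # u: (batch, n_variables, grid_size)
        tok = self.lift(u)                # (batch, n_variables, d_model)
        mixed, _ = self.attn(tok, tok, tok)
        return self.proj(mixed)           # back to function values per variable

out = CodomainAttentionSketch(grid_size=64)(torch.randn(2, 3, 64))  # e.g. u, v, p
```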
arXiv Detail & Related papers (2024-03-19T08:56:20Z)
- LatentPINNs: Generative physics-informed neural networks via a latent representation learning [0.0]
We introduce latentPINN, a framework that utilizes latent representations of the PDE parameters as additional (to the coordinates) inputs into PINNs.
We use a two-stage training scheme: in the first stage, we learn the latent representations for the distribution of PDE parameters.
In the second stage, we train a physics-informed neural network over inputs given by randomly drawn samples from the coordinate space within the solution domain.
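The second stage can be sketched as follows (an illustration under assumptions; the first-stage representation model that produces the latent code z is omitted): the PINN simply receives z alongside the coordinates.

```python
import torch
import torch.nn as nn

class LatentPINNSketch(nn.Module):
    """Illustrative stage-2 network: inputs are coordinates plus a latent code."""
    def __init__(self, latent_dim=8, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 + latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1))

    def forward(self, x, t, z):            # x, t: (B,), z: (B, latent_dim)
        return self.net(torch.cat([x.unsqueeze(-1), t.unsqueeze(-1), z], dim=-1))

u = LatentPINNSketch()(torch.rand(256), torch.rand(256), torch.randn(256, 8))
```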
arXiv Detail & Related papers (2023-05-11T16:54:17Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
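A hedged sketch of the hyper-network half (the FNO backbone it is combined with is omitted, and the shapes are assumptions): a small network maps the query time t to the weights of a solution model, so any intermediate time costs one forward pass.

```python
import torch
import torch.nn as nn

class HyperSolverSketch(nn.Module):
    """Illustrative: a hyper-network emits the weights of one hidden layer."""
    def __init__(self, in_dim=1, out_dim=1, hidden=32):
        super().__init__()
        self.hidden, self.in_dim = hidden, in_dim
        n_w = hidden * in_dim + hidden          # weights + biases of one layer
        self.hyper = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                                   nn.Linear(64, n_w))
        self.head = nn.Linear(hidden, out_dim)

    def forward(self, x, t):                    # x: (B, in_dim), t: scalar tensor
        p = self.hyper(t.view(1, 1)).squeeze(0)
        W = p[: self.hidden * self.in_dim].view(self.hidden, self.in_dim)
        b = p[self.hidden * self.in_dim:]
        return self.head(torch.tanh(x @ W.T + b))

u = HyperSolverSketch()(torch.rand(16, 1), torch.tensor(0.3))  # query at t = 0.3
```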
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
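A minimal message-passing step on a 1D grid graph, in the spirit of the paper but not the authors' architecture: node features hold solution values, edges connect neighbors, and shared MLPs replace the hand-designed update of a classical scheme.

```python
import torch
import torch.nn as nn

class MessagePassingStep(nn.Module):
    """Illustrative: one learned message-passing update on a graph."""
    def __init__(self, hidden=32):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        self.update = nn.Sequential(nn.Linear(hidden + 1, hidden), nn.ReLU(),
                                    nn.Linear(hidden, 1))

    def forward(self, u, edges):
        # u: (n_nodes, 1); edges: (2, n_edges) index pairs (src, dst)
        src, dst = edges
        m = self.msg(torch.cat([u[src], u[dst]], dim=-1))          # per-edge message
        agg = torch.zeros(u.size(0), m.size(-1)).index_add_(0, dst, m)
        return u + self.update(torch.cat([u, agg], dim=-1))        # residual update

# periodic 1-D grid: each node receives messages from its two neighbors
n = 64
idx = torch.arange(n)
edges = torch.stack([torch.cat([idx, idx]),
                     torch.cat([(idx + 1) % n, (idx - 1) % n])])
u_next = MessagePassingStep()(torch.randn(n, 1), edges)
```

With a fixed 2-neighbor stencil as above, the learned update plays the same structural role as a finite-difference scheme, which is why such solvers representationally contain those classical methods.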
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Adversarial Multi-task Learning Enhanced Physics-informed Neural Networks for Solving Partial Differential Equations [9.823102211212582]
We introduce the novel approach of employing multi-task learning techniques, uncertainty-weighted loss and gradient surgery, in the context of learning PDE solutions.
In the experiments, our proposed methods are found to be effective and reduce the error on the unseen data points as compared to the previous approaches.
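The uncertainty-weighting idea fits in one formula (this is the standard multi-task scheme; the paper's exact variant may differ): each task loss L_i is scaled by a learned precision exp(-s_i), with an additive s_i term keeping the weights from collapsing to zero.

```python
import torch

log_vars = torch.zeros(3, requires_grad=True)   # one s_i per task: PDE, BC, IC

def weighted_loss(task_losses, log_vars):
    # L = sum_i exp(-s_i) * L_i + s_i  (learned, self-balancing task weights)
    return sum(torch.exp(-s) * l + s for l, s in zip(task_losses, log_vars))

losses = [torch.tensor(0.7), torch.tensor(0.2), torch.tensor(0.1)]
total = weighted_loss(losses, log_vars)
total.backward()   # gradients flow into both the model and the weights s_i
```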
arXiv Detail & Related papers (2021-04-29T13:17:46Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster than traditional PDE solvers.
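The core of the Fourier-space parameterization in a few lines (a 1D sketch consistent with the paper's idea): transform to Fourier space, multiply the lowest modes by learned complex weights, transform back.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Integral kernel parameterized directly in Fourier space (1-D sketch)."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):                     # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x)              # (batch, channels, grid//2 + 1)
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bix,iox->box", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))

y = SpectralConv1d(channels=4, modes=12)(torch.randn(8, 4, 64))
```

Truncating to the lowest modes is what makes the operator resolution-invariant: the same weights apply whatever grid the input is sampled on.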
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations: physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Solver-in-the-Loop: Learning from Differentiable Physics to Interact with Iterative PDE-Solvers [26.444103444634994]
We show that machine learning can improve the solution accuracy by correcting for effects not captured by the discretized PDE.
We find that previously used learning approaches are significantly outperformed by methods that integrate the solver into the training loop.
This provides the model with realistic input distributions that take previous corrections into account.
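A sketch of the training setup (the toy diffusion "solver" and all names are assumptions): the differentiable solver and the learned correction are unrolled together, so the network is trained on the states its own corrections produce.

```python
import torch
import torch.nn as nn

def diffusion_step(u, dt=0.1):
    """Toy differentiable solver: explicit diffusion step on a periodic grid."""
    return u + dt * (torch.roll(u, 1, -1) - 2 * u + torch.roll(u, -1, -1))

corrector = nn.Sequential(nn.Conv1d(1, 16, 3, padding=1), nn.ReLU(),
                          nn.Conv1d(16, 1, 3, padding=1))
opt = torch.optim.Adam(corrector.parameters(), lr=1e-3)

u0 = torch.randn(4, 1, 64)
ref, v = [], u0.clone()
for _ in range(8):                 # stand-in reference: two half-steps per step
    v = diffusion_step(diffusion_step(v, 0.05), 0.05)
    ref.append(v)

u, loss = u0, 0.0
for target in ref:                 # unrolled rollout: solver step + correction
    u = diffusion_step(u) + corrector(u)
    loss = loss + ((u - target) ** 2).mean()
loss.backward()                    # gradients flow through the solver steps
opt.step()
```

Training on the unrolled trajectory, rather than on isolated one-step pairs, is exactly what exposes the model to the realistic input distribution the summary describes.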
arXiv Detail & Related papers (2020-06-30T18:00:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.