Meta-Auto-Decoder for Solving Parametric Partial Differential Equations
- URL: http://arxiv.org/abs/2111.08823v1
- Date: Mon, 15 Nov 2021 02:51:42 GMT
- Title: Meta-Auto-Decoder for Solving Parametric Partial Differential Equations
- Authors: Xiang Huang, Zhanhong Ye, Hongsheng Liu, Beiji Shi, Zidong Wang, Kang
Yang, Yang Li, Bingya Weng, Min Wang, Haotian Chu, Jing Zhou, Fan Yu, Bei
Hua, Lei Chen, Bin Dong
- Abstract summary: Partial Differential Equations (PDEs) are ubiquitous in many disciplines of science and engineering and notoriously difficult to solve.
Our proposed approach, called Meta-Auto-Decoder (MAD), treats solving parametric PDEs as a meta-learning problem.
MAD exhibits faster convergence without losing accuracy compared with other deep learning methods.
- Score: 32.46080264991759
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Partial Differential Equations (PDEs) are ubiquitous in many disciplines of
science and engineering and notoriously difficult to solve. In general,
closed-form solutions of PDEs are unavailable and numerical approximation
methods are computationally expensive. The parameters of PDEs are variable in
many applications, such as inverse problems, control and optimization, risk
assessment, and uncertainty quantification. In these applications, our goal is
to solve parametric PDEs rather than one instance of them. Our proposed
approach, called Meta-Auto-Decoder (MAD), treats solving parametric PDEs as a
meta-learning problem and utilizes the Auto-Decoder structure in
\cite{park2019deepsdf} to deal with different tasks/PDEs. Physics-informed
losses induced from the PDE governing equations and boundary conditions are used
as the training losses for different tasks. The goal of MAD is to learn a good
model initialization that can generalize across different tasks, and eventually
enables unseen tasks to be learned faster. The inspiration for MAD comes from the
(conjectured) low-dimensional structure of parametric PDE solutions, and we
explain our approach from the perspective of manifold learning. Finally, we
demonstrate the power of MAD through extensive numerical studies, including
Burgers' equation, Laplace's equation and time-domain Maxwell's equations. MAD
exhibits faster convergence without losing accuracy compared with
other deep learning methods.
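To make the recipe concrete, here is a minimal PyTorch sketch of the MAD idea as described in the abstract: a shared network takes the coordinate together with a per-task trainable latent code (the auto-decoder structure of \cite{park2019deepsdf}), all tasks are pre-trained jointly with a physics-informed loss, and an unseen task is adapted by fitting only a new latent code (the shared weights can optionally be fine-tuned as well). The toy parametric ODE family, network sizes, and training schedule below are illustrative assumptions, not the setup used in the paper.

```python
# Hedged sketch of the Meta-Auto-Decoder (MAD) idea: a shared network
# u_theta(x, z) plus one trainable latent code z_i per PDE/task, trained with
# a physics-informed loss. Illustrative toy task family: u'(x) = a*u, u(0)=1.
import torch
import torch.nn as nn

torch.manual_seed(0)
latent_dim, n_tasks = 4, 8
a_values = torch.linspace(0.5, 2.0, n_tasks)            # task parameters (illustrative)

net = nn.Sequential(nn.Linear(1 + latent_dim, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
codes = torch.zeros(n_tasks, latent_dim, requires_grad=True)  # auto-decoder latents

def u(x, z):
    return net(torch.cat([x, z.expand(x.shape[0], -1)], dim=1))

def pinn_loss(a, z):
    x = torch.rand(64, 1, requires_grad=True)
    ux = u(x, z)
    dudx = torch.autograd.grad(ux.sum(), x, create_graph=True)[0]
    residual = (dudx - a * ux).pow(2).mean()             # equation residual
    ic = (u(torch.zeros(1, 1), z) - 1.0).pow(2).mean()   # initial condition
    return residual + ic

# Meta-training: jointly optimize the shared weights and all task codes.
opt = torch.optim.Adam(list(net.parameters()) + [codes], lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = sum(pinn_loss(a_values[i], codes[i]) for i in range(n_tasks)) / n_tasks
    loss.backward()
    opt.step()

# Adaptation to an unseen task: keep the network fixed and fit only a new code
# (fine-tuning the shared weights together with the code is another option).
a_new = torch.tensor(1.3)
z_new = torch.zeros(1, latent_dim, requires_grad=True)
opt_z = torch.optim.Adam([z_new], lr=1e-2)
for step in range(500):
    opt_z.zero_grad()
    pinn_loss(a_new, z_new).backward()
    opt_z.step()
```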
Related papers
- Learning a Neural Solver for Parametric PDE to Enhance Physics-Informed Methods [14.791541465418263]
We propose learning a solver, i.e., solving partial differential equations (PDEs) using a physics-informed iterative algorithm trained on data.
Our method learns to condition a gradient descent algorithm that automatically adapts to each PDE instance.
We demonstrate the effectiveness of our method through empirical experiments on multiple datasets.
arXiv Detail & Related papers (2024-10-09T12:28:32Z)
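A rough sketch of the conditioned-iterative-solver idea summarized in this entry, under assumptions of my own: a small convolutional network maps the current residual of a discretized 1D Poisson instance to an update direction and is trained by unrolling a few iterations; the paper's actual conditioning mechanism and training objective may differ.

```python
# Hedged sketch of a learned solver: a small network conditions the update
# direction of an iterative scheme across a family of PDE instances.
# Toy setting: 1D Poisson -u'' = f on a grid, with unrolled learned iterations.
import torch
import torch.nn as nn

torch.manual_seed(0)
n = 64
h = 1.0 / (n + 1)
# Discrete Laplacian with Dirichlet BCs, used only to compute residuals.
A = (2 * torch.eye(n) - torch.diag(torch.ones(n - 1), 1)
     - torch.diag(torch.ones(n - 1), -1)) / h**2

update_net = nn.Sequential(        # maps the residual to an update (illustrative)
    nn.Conv1d(1, 32, 5, padding=2), nn.GELU(),
    nn.Conv1d(32, 1, 5, padding=2))

def unrolled_solve(f, steps=8):
    u = torch.zeros_like(f)
    for _ in range(steps):
        r = f - u @ A.T                          # residual of the current iterate
        u = u + update_net(r.unsqueeze(1)).squeeze(1)
    return u

opt = torch.optim.Adam(update_net.parameters(), lr=1e-3)
for it in range(2000):
    f = torch.randn(16, n)                       # a batch of PDE instances (random rhs)
    u = unrolled_solve(f)
    loss = ((f - u @ A.T) ** 2).mean()           # physics-informed: minimize residual
    opt.zero_grad(); loss.backward(); opt.step()
```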
- Partial-differential-algebraic equations of nonlinear dynamics by Physics-Informed Neural-Network: (I) Operator splitting and framework assessment [51.3422222472898]
Several forms for constructing novel physics-informed neural networks (PINNs) for the solution of partial-differential-algebraic equations are proposed.
Among these novel methods are the PDE forms, which evolve from a lower-level form with fewer unknown dependent variables to a higher-level form with more dependent variables.
arXiv Detail & Related papers (2024-07-13T22:48:17Z)
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver), capable of solving a wide range of PDEs.
Our key finding is that a PDE solution is fundamentally governed by a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
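A speculative sketch of the PDE-conditional idea in this entry: the PDE components are embedded and injected into a Transformer encoder as an extra conditioning token. All names, shapes, and the conditioning mechanism are illustrative assumptions rather than the Unisolver design.

```python
# Hedged sketch of conditioning a Transformer on PDE components (coefficients,
# boundary-condition type, etc.) via a learned condition token.
import torch
import torch.nn as nn

class PDEConditionalTransformer(nn.Module):
    def __init__(self, d_model=128, n_components=6, n_points=256):
        super().__init__()
        self.embed_field = nn.Linear(1, d_model)             # field values -> tokens
        self.embed_pos = nn.Parameter(torch.randn(n_points, d_model) * 0.02)
        self.embed_cond = nn.Linear(n_components, d_model)   # PDE components -> token
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.head = nn.Linear(d_model, 1)

    def forward(self, field, components):
        # field: (B, n_points), components: (B, n_components)
        tokens = self.embed_field(field.unsqueeze(-1)) + self.embed_pos
        cond = self.embed_cond(components).unsqueeze(1)       # (B, 1, d_model)
        out = self.encoder(torch.cat([cond, tokens], dim=1))  # prepend condition token
        return self.head(out[:, 1:]).squeeze(-1)              # predicted solution values

model = PDEConditionalTransformer()
u0 = torch.randn(8, 256)        # input field sampled on a grid (placeholder)
comps = torch.randn(8, 6)       # encoded PDE components (placeholder)
u_pred = model(u0, comps)       # (8, 256)
```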
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields an average relative gain of 11.5% on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
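A minimal, assumption-laden sketch of the solve-in-a-latent-spectral-space idea from this entry: the field is projected onto a small set of learned basis functions, the coefficients are transformed by a neural block, and the field is reconstructed. The actual LSM architecture (attention-based projections, hierarchical design) is more involved.

```python
# Hedged sketch: project the input field onto learned basis functions, transform
# the coefficients in the low-dimensional latent space, and reconstruct.
import torch
import torch.nn as nn

class LatentSpectralSketch(nn.Module):
    def __init__(self, n_points=256, n_modes=32):
        super().__init__()
        self.basis = nn.Parameter(torch.randn(n_modes, n_points) * 0.02)  # learned basis
        self.latent_block = nn.Sequential(      # operates only on n_modes coefficients
            nn.Linear(n_modes, 128), nn.GELU(), nn.Linear(128, n_modes))

    def forward(self, u):                       # u: (B, n_points), e.g. input condition
        coeffs = u @ self.basis.T               # latent coefficients, shape (B, n_modes)
        coeffs = coeffs + self.latent_block(coeffs)
        return coeffs @ self.basis              # reconstructed output field (B, n_points)

model = LatentSpectralSketch()
u_in = torch.randn(4, 256)
u_out = model(u_in)                             # (4, 256); train with a supervised loss
```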
- Meta-PDE: Learning to Solve PDEs Quickly Without a Mesh [24.572840023107574]
Partial differential equations (PDEs) are often computationally challenging to solve.
We present a meta-learning based method which learns to rapidly solve problems from a distribution of related PDEs.
arXiv Detail & Related papers (2022-11-03T06:17:52Z)
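A sketch of the meta-learning idea from this entry on a toy parametric ODE with a physics-informed loss. A Reptile-style first-order update stands in here for whatever gradient-based meta-learner the paper uses on its mesh-free PINN losses, so treat the details as illustrative.

```python
# Hedged sketch: meta-learn a PINN initialization so that a new PDE instance is
# solved in a few gradient steps. Toy family: u'(x) = a*u with u(0) = 1.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))

def pinn_loss(model, a):
    x = torch.rand(64, 1, requires_grad=True)
    ux = model(x)
    dudx = torch.autograd.grad(ux.sum(), x, create_graph=True)[0]
    ic = (model(torch.zeros(1, 1)) - 1.0).pow(2).mean()
    return (dudx - a * ux).pow(2).mean() + ic

meta_lr, inner_lr, inner_steps = 0.05, 1e-2, 10
for meta_step in range(1000):
    a = torch.empty(1).uniform_(0.5, 2.0)        # sample a PDE instance
    adapted = copy.deepcopy(net)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(inner_steps):                 # inner loop: fit this instance
        opt.zero_grad()
        pinn_loss(adapted, a).backward()
        opt.step()
    with torch.no_grad():                        # Reptile-style meta-update
        for p, q in zip(net.parameters(), adapted.parameters()):
            p += meta_lr * (q - p)
```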
- Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z)
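One way to realize a differentiable constraint layer of the kind this entry describes, sketched under my own assumptions: the network outputs k basis functions, an inner least-squares solve picks the coefficients that satisfy the PDE at collocation points, and the outer loss trains the basis through that differentiable solve. The paper's exact constrained layer may be formulated differently.

```python
# Hedged sketch of a differentiable PDE-constrained layer for -u'' = f in 1D.
import torch
import torch.nn as nn

torch.manual_seed(0)
k = 8
basis_net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, k))

def second_derivatives(x):
    """Return phi (N, k) and phi_xx (N, k) for the learned basis functions."""
    phi = basis_net(x)
    cols = []
    for i in range(k):
        g = torch.autograd.grad(phi[:, i].sum(), x, create_graph=True)[0]
        gg = torch.autograd.grad(g.sum(), x, create_graph=True)[0]
        cols.append(gg)
    return phi, torch.cat(cols, dim=1)

def fit_and_evaluate(x_col, f_col, x_eval):
    phi, phi_xx = second_derivatives(x_col)
    A, b = -phi_xx, f_col                                  # enforce -u'' = f at x_col
    ATA = A.T @ A + 1e-6 * torch.eye(k)
    coeffs = torch.linalg.solve(ATA, (A.T @ b).unsqueeze(1)).squeeze(1)  # inner solve
    return basis_net(x_eval) @ coeffs                      # u = sum_i c_i * phi_i

opt = torch.optim.Adam(basis_net.parameters(), lr=1e-3)
for step in range(1000):
    x_col = torch.rand(128, 1, requires_grad=True)         # interior collocation points
    f_col = torch.sin(torch.pi * x_col.detach()).squeeze(1)
    x_bnd = torch.tensor([[0.0], [1.0]])
    u_bnd = fit_and_evaluate(x_col, f_col, x_bnd)
    loss = (u_bnd ** 2).mean()                             # outer loss: u(0) = u(1) = 0
    opt.zero_grad(); loss.backward(); opt.step()
```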
- Learning to Solve PDE-constrained Inverse Problems with Graph Networks [51.89325993156204]
In many application domains across science and engineering, we are interested in solving inverse problems with constraints defined by a partial differential equation (PDE).
Here we explore GNNs to solve such PDE-constrained inverse problems.
We demonstrate computational speedups of up to 90x using GNNs compared to principled solvers.
arXiv Detail & Related papers (2022-06-01T18:48:01Z)
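A hedged sketch of the surrogate-plus-inversion workflow suggested by this entry: a small message-passing network (a stand-in for the paper's GNN) maps per-node PDE parameters to a solution field, and the unknown parameters are recovered by gradient descent through the frozen surrogate. The graph, data, and training below are placeholders.

```python
# Hedged sketch: graph-network surrogate + gradient-based inversion.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_nodes = 100
edges = torch.stack([torch.arange(n_nodes - 1), torch.arange(1, n_nodes)])  # chain mesh

class MessagePassing(nn.Module):
    def __init__(self, dim=64, layers=4):
        super().__init__()
        self.enc = nn.Linear(1, dim)
        self.msg = nn.ModuleList([nn.Linear(2 * dim, dim) for _ in range(layers)])
        self.dec = nn.Linear(dim, 1)

    def forward(self, theta):                    # theta: per-node PDE parameter (n, 1)
        h = torch.tanh(self.enc(theta))
        src, dst = edges
        for lin in self.msg:
            m = torch.tanh(lin(torch.cat([h[src], h[dst]], dim=1)))
            h = h + torch.zeros_like(h).index_add_(0, dst, m)   # aggregate messages
        return self.dec(h)

surrogate = MessagePassing()
# ... assume `surrogate` has been trained on (parameter, solution) pairs ...
u_observed = torch.randn(n_nodes, 1)             # measurements (placeholder data)
theta = torch.zeros(n_nodes, 1, requires_grad=True)  # unknown parameter field
opt = torch.optim.Adam([theta], lr=1e-2)
for step in range(500):
    opt.zero_grad()
    loss = ((surrogate(theta) - u_observed) ** 2).mean()
    loss.backward()                              # gradients flow through the surrogate
    opt.step()
```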
- Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method which can partially alleviate the data demands of neural PDE solvers by improving their sample complexity.
In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z)
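A small, concrete example of the augmentation idea in this entry: for the periodic 1D heat equation, spatial translation and (by linearity) amplitude scaling map solutions to solutions, so stored trajectories can be turned into new valid training samples without extra solver calls. The transformations and tensor shapes below are illustrative, not the paper's exhaustive symmetry list.

```python
# Hedged sketch of symmetry-based data augmentation for a neural PDE solver,
# using two symmetries of the periodic 1D heat equation u_t = u_xx.
import torch

def augment(u, scale_range=(0.5, 2.0)):
    """u: (batch, time, x) solution trajectories on a periodic spatial grid."""
    b, t, nx = u.shape
    shift = torch.randint(0, nx, (1,)).item()
    scale = torch.empty(1).uniform_(*scale_range).item()
    u_aug = torch.roll(u, shifts=shift, dims=2)   # spatial-translation symmetry
    return scale * u_aug                          # linearity: u -> lambda * u

trajectories = torch.randn(32, 20, 128)           # placeholder solver outputs
new_samples = augment(trajectories)               # extra training data at no solver cost
```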
- A composable autoencoder-based iterative algorithm for accelerating numerical simulations [0.0]
CoAE-MLSim is an unsupervised, lower-dimensional, local method that is motivated by key ideas used in commercial PDE solvers.
It is tested for a variety of complex engineering cases to demonstrate its computational speed, accuracy, scalability, and generalization across different PDE conditions.
arXiv Detail & Related papers (2021-10-07T20:22:37Z)
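A loose sketch of an autoencoder-based iterative procedure of the kind this entry describes: subdomain solutions are compressed to a latent space, a learned update is iterated there until the iterate stops changing, and the result is decoded. Every component below is an illustrative stand-in for the CoAE-MLSim pipeline (pretraining of the autoencoder is omitted).

```python
# Hedged sketch: iterate a learned update in the latent space of an autoencoder.
import torch
import torch.nn as nn

latent_dim, patch = 16, 64
encoder = nn.Sequential(nn.Linear(patch, 128), nn.GELU(), nn.Linear(128, latent_dim))
decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.GELU(), nn.Linear(128, patch))
# Learned latent update, conditioned on encoded boundary/neighbor information.
update = nn.Sequential(nn.Linear(2 * latent_dim, 128), nn.GELU(),
                       nn.Linear(128, latent_dim))

def iterate(u0_patch, bc_latent, max_iters=50, tol=1e-5):
    z = encoder(u0_patch)
    for _ in range(max_iters):
        z_next = z + update(torch.cat([z, bc_latent], dim=-1))
        if (z_next - z).norm() < tol:         # stop once the latent iterate converges
            z = z_next
            break
        z = z_next
    return decoder(z)

u0 = torch.randn(1, patch)                    # initial guess for one subdomain
bc = torch.randn(1, latent_dim)               # encoded boundary conditions (placeholder)
u_final = iterate(u0, bc)
```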
- DiscretizationNet: A Machine-Learning based solver for Navier-Stokes Equations using Finite Volume Discretization [0.7366405857677226]
The goal of this work is to develop an ML-based PDE solver that couples important characteristics of existing PDE solvers with Machine Learning technologies.
Our ML-solver, DiscretizationNet, employs a generative CNN-based encoder-decoder model with PDE variables as both input and output features.
A novel iterative capability is implemented during the network training to improve the stability and convergence of the ML-solver.
arXiv Detail & Related papers (2020-05-17T19:54:19Z)
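A hedged sketch of the iterative encoder-decoder training idea in this entry: a small convolutional network maps the current flow-field iterate to the next one, and training unrolls several such steps against a discrete residual. The residual function, shapes, and architecture below are placeholders, not the DiscretizationNet implementation.

```python
# Hedged sketch: encoder-decoder CNN applied iteratively during training so the
# fixed point approaches a discretized-PDE solution.
import torch
import torch.nn as nn

channels = 3                                     # e.g. (u, v, p) fields on a 2D grid
net = nn.Sequential(                             # small conv encoder-decoder
    nn.Conv2d(channels, 32, 3, padding=1), nn.GELU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.GELU(),
    nn.Conv2d(64, 32, 3, padding=1), nn.GELU(),
    nn.Conv2d(32, channels, 3, padding=1))

def discrete_residual(fields):
    """Placeholder for a finite-volume residual of the governing equations."""
    lap = (torch.roll(fields, 1, -1) + torch.roll(fields, -1, -1)
           + torch.roll(fields, 1, -2) + torch.roll(fields, -1, -2) - 4 * fields)
    return lap.pow(2).mean()

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(1000):
    fields = torch.randn(4, channels, 64, 64)    # initial iterates (placeholder data)
    loss = 0.0
    for _ in range(4):                           # iterate the network during training
        fields = net(fields)
        loss = loss + discrete_residual(fields)
    opt.zero_grad(); loss.backward(); opt.step()
```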