HAMLET: Graph Transformer Neural Operator for Partial Differential
Equations
- URL: http://arxiv.org/abs/2402.03541v1
- Date: Mon, 5 Feb 2024 21:55:24 GMT
- Title: HAMLET: Graph Transformer Neural Operator for Partial Differential
Equations
- Authors: Andrey Bryutkin, Jiahao Huang, Zhongying Deng, Guang Yang,
Carola-Bibiane Schönlieb, Angelica Aviles-Rivero
- Abstract summary: We present a novel graph transformer framework, HAMLET, designed to address the challenges in solving partial differential equations (PDEs) using neural networks.
The framework uses graph transformers with modular input encoders to directly incorporate differential equation information into the solution process.
Notably, HAMLET scales effectively with increasing data complexity and noise, showcasing its robustness.
- Score: 6.699756195061548
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a novel graph transformer framework, HAMLET, designed to address
the challenges in solving partial differential equations (PDEs) using neural
networks. The framework uses graph transformers with modular input encoders to
directly incorporate differential equation information into the solution
process. This modularity enhances parameter correspondence control, making
HAMLET adaptable to PDEs of arbitrary geometries and varied input formats.
Notably, HAMLET scales effectively with increasing data complexity and noise,
showcasing its robustness. HAMLET is not just tailored to a single type of
physical simulation, but can be applied across various domains. Moreover, it
boosts model resilience and performance, especially in scenarios with limited
data. We demonstrate, through extensive experiments, that our framework is
capable of outperforming current techniques for PDEs.
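The core operation of a graph transformer such as the one the abstract describes is self-attention restricted to the edges of a graph over mesh nodes. The following is a minimal illustrative sketch of that operation in numpy; all names, shapes, and the toy graph are assumptions for exposition, not HAMLET's actual architecture or encoders.

```python
import numpy as np

def graph_attention(x, adj, wq, wk, wv):
    """One attention head where node i attends only to its graph neighbours.

    x:   (n, d) node features (e.g. PDE field values plus coordinates)
    adj: (n, n) binary adjacency matrix (1 where an edge exists, incl. self-loops)
    wq, wk, wv: (d, d) projection matrices (learned in a real model)
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = (q @ k.T) / np.sqrt(x.shape[1])
    scores = np.where(adj > 0, scores, -np.inf)   # mask out non-neighbours
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # row-wise softmax over neighbours
    return attn @ v                               # (n, d) updated node features

# Toy example: 3 mesh nodes on a path graph 0-1-2, with self-loops.
rng = np.random.default_rng(0)
n, d = 3, 4
x = rng.normal(size=(n, d))
adj = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]])
wq = wk = wv = np.eye(d)
out = graph_attention(x, adj, wq, wk, wv)
print(out.shape)  # (3, 4)
```

Because attention weights are computed per node pair rather than per grid location, this kind of operation applies to meshes of arbitrary geometry, which is the property the abstract emphasises.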
Related papers
- AROMA: Preserving Spatial Structure for Latent PDE Modeling with Local Neural Fields [14.219495227765671]
We present AROMA, a framework designed to enhance the modeling of partial differential equations (PDEs) using local neural fields.
Our flexible encoder-decoder architecture can obtain smooth latent representations of spatial physical fields from a variety of data types.
By employing a diffusion-based formulation, we achieve greater stability and enable longer rollouts compared to conventional MSE training.
arXiv Detail & Related papers (2024-06-04T10:12:09Z) - Physics-informed Mesh-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to parameter discretizations of variable size and irregular domain shapes.
Inspired by deep operator neural networks, our model repeatedly learns discretization-independent parameter embeddings.
arXiv Detail & Related papers (2024-04-21T12:41:30Z) - PICL: Physics Informed Contrastive Learning for Partial Differential Equations [7.136205674624813]
We develop a novel contrastive pretraining framework that improves neural operator generalization across multiple governing equations simultaneously.
A combination of physics-informed system evolution and latent-space model output is anchored to input data and used in our distance function.
We find that physics-informed contrastive pretraining improves accuracy for the Fourier Neural Operator in fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
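Contrastive pretraining of the kind this summary describes is typically built around an InfoNCE-style loss that pulls matched (anchor, positive) representation pairs together and pushes mismatched pairs apart. The sketch below is a generic version of that loss, not PICL's specific distance function, which the paper builds from physics-informed system evolution.

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """Generic InfoNCE loss.

    anchors, positives: (n, d) arrays; row i of each forms a positive pair,
    all cross-row combinations act as negatives.
    """
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = (a @ p.T) / temperature                 # (n, n) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))               # matched pairs sit on the diagonal

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
loss_aligned = info_nce(z, z)                        # identical pairs: low loss
loss_random = info_nce(z, rng.normal(size=(8, 16)))  # unrelated pairs: higher loss
print(loss_aligned < loss_random)  # True
```

In a pretraining setting, `anchors` would come from encoding the input data and `positives` from the physics-informed evolution of the same system state.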
arXiv Detail & Related papers (2024-01-29T17:32:22Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural
Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
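The PINN loss these benchmarks stress is a sum of a squared equation residual over collocation points and an initial-condition penalty. A minimal sketch for a single ODE follows; the one-parameter trial function and finite-difference derivative are simplifying assumptions, since a real PINN uses a neural network and automatic differentiation.

```python
import numpy as np

# Physics-informed loss for the ODE u'(t) = -u(t), u(0) = 1,
# whose exact solution is u(t) = exp(-t). The "network" here is
# a one-parameter trial function u(t) = exp(w * t).

def pinn_loss(w, t, eps=1e-5):
    u = lambda s: np.exp(w * s)
    du = (u(t + eps) - u(t - eps)) / (2 * eps)  # central finite difference
    residual = du + u(t)                        # u' + u should vanish
    ic = (u(0.0) - 1.0) ** 2                    # initial-condition penalty
    return np.mean(residual ** 2) + ic

t = np.linspace(0.0, 1.0, 50)
loss_true = pinn_loss(-1.0, t)   # near zero at the true solution w = -1
loss_off = pinn_loss(0.5, t)     # large away from it
print(loss_true < loss_off)  # True
```

The failure modes the paper identifies (insufficient capacity, poor conditioning, high local curvature) all concern the geometry of this loss surface as the coupled system grows more complex.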
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - Physics-constrained Unsupervised Learning of Partial Differential
Equations using Meshes [1.066048003460524]
Graph neural networks show promise in accurately representing irregularly meshed objects and learning their dynamics.
In this work, we represent meshes naturally as graphs, process these using Graph Networks, and formulate our physics-based loss to provide an unsupervised learning framework for partial differential equations (PDEs).
Our framework will enable the application of PDE solvers in interactive settings, such as model-based control of soft-body deformations.
arXiv Detail & Related papers (2022-03-30T19:22:56Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
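The claim that message passing solvers representationally contain classical methods can be made concrete with a toy example: on a 1D chain graph, one message-passing step with hand-fixed weights reproduces the standard second-order finite-difference Laplacian. This sketch is illustrative only; a learned solver replaces the fixed message and aggregation functions with neural networks.

```python
import numpy as np

def message_passing_laplacian(u, h):
    """One message-passing step on a chain graph that equals the FD Laplacian.

    Each interior node receives messages (u[j] - u[i]) from its two
    neighbours; summing them and scaling by 1/h**2 gives
    (u[i-1] - 2*u[i] + u[i+1]) / h**2.
    """
    n = len(u)
    out = np.zeros(n)
    for i in range(1, n - 1):
        msgs = (u[i - 1] - u[i], u[i + 1] - u[i])  # messages from neighbours
        out[i] = sum(msgs) / h**2                  # aggregation step
    return out  # boundary nodes left at zero for simplicity

x = np.linspace(0.0, 1.0, 101)
u = x**2                                   # exact second derivative is 2 everywhere
lap = message_passing_laplacian(u, x[1] - x[0])
print(round(lap[50], 6))  # 2.0
```

Because the message function sees only node differences and local connectivity, the same learned solver can run on different domain topologies and discretizations, which is the flexibility the summary highlights.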
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
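The defining property such an equivariant network targets is that rotating the input positions rotates the output vectors identically: f(Rx) = Rf(x). The toy "layer" below, which sums pairwise difference vectors, satisfies this trivially and is only an illustration of the property; EVFN's actual equivariant basis is considerably richer.

```python
import numpy as np

def vector_layer(pos):
    """Toy rotation-equivariant layer: sum of pairwise difference vectors.

    pos: (n, 3) particle positions; returns (n, 3) output vectors.
    """
    diffs = pos[:, None, :] - pos[None, :, :]
    return diffs.sum(axis=1)

rng = np.random.default_rng(1)
pos = rng.normal(size=(4, 3))

# A rotation about the z-axis by 0.7 radians.
c, s = np.cos(0.7), np.sin(0.7)
R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

lhs = vector_layer(pos @ R.T)      # rotate inputs, then apply the layer
rhs = vector_layer(pos) @ R.T      # apply the layer, then rotate outputs
print(np.allclose(lhs, rhs))  # True
```

Equivariance of this kind is what lets such models predict Newtonian trajectories consistently regardless of how the simulated system is oriented in space.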
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Neural TMDlayer: Modeling Instantaneous flow of features via SDE
Generators [37.92379202320938]
We study how stochastic differential equation (SDE) based ideas can inspire new modifications to existing algorithms for a set of problems in computer vision.
We show promising experiments on a number of vision tasks including few shot learning, point cloud transformers and deep variational segmentation.
arXiv Detail & Related papers (2021-08-19T19:54:04Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.