Modeling Dynamics over Meshes with Gauge Equivariant Nonlinear Message
Passing
- URL: http://arxiv.org/abs/2310.19589v2
- Date: Fri, 3 Nov 2023 02:20:30 GMT
- Authors: Jung Yeon Park, Lawson L.S. Wong, Robin Walters
- Abstract summary: We introduce a new gauge equivariant architecture using nonlinear message passing.
Our architecture achieves higher performance than either convolutional or attentional networks on domains with highly complex and nonlinear dynamics.
- Score: 28.534857322609543
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Data over non-Euclidean manifolds, often discretized as surface meshes,
naturally arise in computer graphics and biological and physical systems. In
particular, solutions to partial differential equations (PDEs) over manifolds
depend critically on the underlying geometry. While graph neural networks have
been successfully applied to PDEs, they do not incorporate surface geometry and
do not consider local gauge symmetries of the manifold. Alternatively, recent
works on gauge equivariant convolutional and attentional architectures on
meshes leverage the underlying geometry but underperform in modeling surface
PDEs with complex nonlinear dynamics. To address these issues, we introduce a
new gauge equivariant architecture using nonlinear message passing. Our novel
architecture achieves higher performance than either convolutional or
attentional networks on domains with highly complex and nonlinear dynamics.
However, similar to the non-mesh case, design trade-offs favor convolutional,
attentional, or message passing networks for different tasks; we investigate in
which circumstances our message passing method provides the most benefit.
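The abstract's core idea is that features living in the tangent planes of mesh vertices must be parallel-transported into the receiver's local frame before they can be combined, so that the result does not depend on the arbitrary choice of frame (gauge) at each vertex. The following is a minimal illustrative sketch of that transport-then-aggregate step, not the paper's actual architecture: the function names, the dictionary-based mesh representation, and the norm-gated nonlinearity are my own assumptions, chosen because a rotation changes a message's direction but not its norm, which keeps the nonlinear update exactly gauge equivariant.

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix: a change of tangent-frame gauge at a vertex."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def transport_message_passing(features, neighbors, transport_angles):
    """One round of gauge-aware nonlinear message passing on tangent-vector features.

    features[i]            : 2-vector expressed in vertex i's local tangent frame
    neighbors[i]           : list of neighbor vertex indices j
    transport_angles[i][j] : angle of parallel transport from j's frame into i's frame
    """
    out = {}
    for i, nbrs in neighbors.items():
        # express each neighbor's feature in the receiver's gauge first
        msgs = [rotation(transport_angles[i][j]) @ features[j] for j in nbrs]
        # norm-gated nonlinearity: rotating a message leaves its norm unchanged,
        # so the update commutes with gauge changes at every vertex
        agg = sum(np.tanh(np.linalg.norm(m)) * m for m in msgs)
        out[i] = features[i] + agg
    return out
```

Under a gauge change by angle θ at vertex i, its feature becomes R(−θ)·fᵢ, incoming transport angles shift by −θ, and outgoing ones by +θ; the update above then transforms covariantly, which can be checked numerically on a small mesh.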
Related papers
- Geometric Trajectory Diffusion Models (arXiv, 2024-10-16)
  Generative models have shown great promise in generating 3D geometric systems. However, existing approaches operate only on static structures, neglecting the fact that physical systems are inherently dynamic. We propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
- A Hybrid Numerical Methodology Coupling Reduced Order Modeling and Graph Neural Networks for Non-parametric Geometries: Applications to Structural Dynamics Problems (arXiv, 2024-06-03)
  This work introduces a new approach for accelerating the numerical analysis of time-domain partial differential equations (PDEs) governing complex physical systems. The methodology combines a classical reduced-order modeling (ROM) framework with recently developed Graph Neural Networks (GNNs).
- Flatten Anything: Unsupervised Neural Surface Parameterization (arXiv, 2024-05-23)
  We introduce the Flatten Anything Model (FAM), an unsupervised neural architecture for global free-boundary surface parameterization. Unlike previous methods, FAM operates directly on discrete surface points without using connectivity information. FAM is fully automated, requires no pre-cutting, and can handle highly complex topologies.
- A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications (arXiv, 2024-03-01)
  This paper surveys data structures, models, and applications related to geometric GNNs. We provide a unified view of existing models from the geometric message passing perspective, and summarize applications and related datasets to facilitate later research on methodology development and experimental evaluation.
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems (arXiv, 2023-12-12)
  Recent advances in computational modelling represent atomic systems as geometric graphs, with atoms embedded as nodes in 3D Euclidean space. Geometric Graph Neural Networks have emerged as the preferred machine learning architecture for applications ranging from protein structure prediction to molecular simulation and material generation. This paper provides a comprehensive, self-contained overview of the field of Geometric GNNs for 3D atomic systems.
- Physics-Informed Neural Networks for Transformed Geometries and Manifolds (arXiv, 2023-11-27)
  We propose a novel method for integrating geometric transformations within PINNs to robustly accommodate geometric variations. We demonstrate enhanced flexibility over traditional PINNs, especially under geometric variation, and the proposed framework offers an outlook for training deep neural operators over parametrized geometries.
- Graph Convolutional Networks for Simulating Multi-phase Flow and Transport in Porous Media (arXiv, 2023-07-10)
  Data-driven surrogate modeling provides an inexpensive alternative to high-fidelity numerical simulation. CNNs are powerful at approximating PDE solutions, but handling irregular, unstructured simulation meshes remains challenging for them. We construct surrogate models based on Graph Convolutional Networks (GCNs) to approximate the spatio-temporal solutions of multi-phase flow and transport processes in porous media.
- Operator Learning with Neural Fields: Tackling PDEs on General Geometries (arXiv, 2023-06-12)
  Machine learning approaches for solving partial differential equations require learning mappings between function spaces. The new CORAL method leverages coordinate-based networks for solving PDEs on general geometries.
- VTAE: Variational Transformer Autoencoder with Manifolds Learning (arXiv, 2023-04-03)
  Deep generative models have been successful at learning non-linear data distributions through latent variables. However, the nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, resulting in poor representation learning. We show that geodesics and their accurate computation can substantially improve the performance of deep generative models.
- Towards a Mathematical Understanding of Learning from Few Examples with Nonlinear Feature Maps (arXiv, 2022-11-07)
  We consider the problem of data classification where the training set consists of just a few data points. We reveal key relationships between the geometry of an AI model's feature space, the structure of the underlying data distributions, and the model's generalisation capabilities.
- Non-linear Independent Dual System (NIDS) for Discretization-independent Surrogate Modeling over Complex Geometries (arXiv, 2021-09-14)
  The non-linear independent dual system (NIDS) is a deep learning surrogate model providing a discretization-independent, continuous representation of PDE solutions, usable for prediction over domains with complex, variable geometries and mesh topologies. Test cases include a vehicle problem with complex geometry and data scarcity, enabled by a training method.
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.