Equivariant Neural Operator Learning with Graphon Convolution
- URL: http://arxiv.org/abs/2311.10908v1
- Date: Fri, 17 Nov 2023 23:28:22 GMT
- Title: Equivariant Neural Operator Learning with Graphon Convolution
- Authors: Chaoran Cheng, Jian Peng
- Abstract summary: We propose a general architecture that combines the coefficient learning scheme with a residual operator layer for learning mappings between continuous functions in the 3D Euclidean space.
- Score: 12.059797539633506
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a general architecture that combines the coefficient learning
scheme with a residual operator layer for learning mappings between continuous
functions in the 3D Euclidean space. Our proposed model is guaranteed to
achieve SE(3)-equivariance by design. From the graph spectrum view, our method
can be interpreted as convolution on graphons (dense graphs with infinitely
many nodes), which we term InfGCN. By leveraging both the continuous graphon
structure and the discrete graph structure of the input data, our model can
effectively capture the geometric information while preserving equivariance.
Through extensive experiments on large-scale electron density datasets, we
observed that our model significantly outperformed the current state-of-the-art
architectures. Multiple ablation studies were also carried out to demonstrate
the effectiveness of the proposed architecture.
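A minimal sketch of the coefficient learning idea described in the abstract (hypothetical code, not the authors' implementation): a scalar field is represented as a sum of per-node radial basis functions whose coefficients are predicted from pairwise distances, and evaluating the field at query points acts like a discretized graphon convolution (T_W f)(x) = ∫ W(x, y) f(y) dy. For simplicity only isotropic (degree-0) Gaussian bases are used, so the predicted scalar field follows SE(3) transformations of the input coordinates by construction; the full InfGCN model additionally uses spherical harmonics, equivariant message passing, and a residual operator layer, all of which this sketch omits. The class name, hyperparameters, and network shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn


class CoefficientField(nn.Module):
    """Toy coefficient-learning model: predicts per-node basis coefficients
    from invariant pairwise distances and evaluates the resulting scalar
    field at arbitrary query points."""

    def __init__(self, num_basis: int = 16, hidden: int = 64, cutoff: float = 5.0):
        super().__init__()
        self.cutoff = cutoff
        # Gaussian radial basis: evenly spaced centers with a shared width.
        self.register_buffer("centers", torch.linspace(0.0, cutoff, num_basis))
        self.width = (self.centers[1] - self.centers[0]).item()
        # Message network: expanded pairwise distances -> neighbor messages.
        self.message = nn.Sequential(
            nn.Linear(num_basis, hidden), nn.SiLU(), nn.Linear(hidden, hidden)
        )
        # Readout: aggregated node features -> per-node basis coefficients.
        self.readout = nn.Linear(hidden, num_basis)

    def rbf(self, d: torch.Tensor) -> torch.Tensor:
        # Expand distances in Gaussian radial basis functions.
        return torch.exp(-((d.unsqueeze(-1) - self.centers) ** 2) / (2 * self.width ** 2))

    def coefficients(self, pos: torch.Tensor) -> torch.Tensor:
        # pos: (N, 3) node coordinates. Pairwise distances are SE(3)-invariant,
        # so the coefficients of the isotropic bases are invariant as well.
        dist = torch.cdist(pos, pos)                       # (N, N)
        msg = self.message(self.rbf(dist))                 # (N, N, hidden)
        mask = ((dist > 0) & (dist < self.cutoff)).unsqueeze(-1)
        node_feat = (msg * mask).sum(dim=1)                # sum over neighbors
        return self.readout(node_feat)                     # (N, num_basis)

    def forward(self, pos: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # Evaluate f(x) = sum_i sum_b c_{i,b} * phi_b(|x - r_i|) at query points.
        coeff = self.coefficients(pos)                     # (N, num_basis)
        basis = self.rbf(torch.cdist(query, pos))          # (M, N, num_basis)
        return torch.einsum("mnb,nb->m", basis, coeff)     # (M,)


# Usage: a scalar field induced by 5 random "atoms", evaluated at 100 points.
pos = torch.randn(5, 3)
query = torch.randn(100, 3)
values = CoefficientField()(pos, query)  # shape: (100,)
```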
Related papers
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how well the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, called soft manifolds, that can address this issue.
Using soft manifolds for graph embedding, we can provide continuous spaces in which to pursue a wide range of data-analysis tasks on complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - Graph Neural Network for Stress Predictions in Stiffened Panels Under Uniform Loading [0.0]
A graph neural network (GNN) is a type of neural network that processes data which can be represented as graphs.
In this study, we propose a novel graph embedding technique for efficient representation of 3D stiffened panels.
arXiv Detail & Related papers (2023-09-22T17:34:20Z) - GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it generates the edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z) - Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z) - Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z) - Dense Graph Convolutional Neural Networks on 3D Meshes for 3D Object Segmentation and Classification [0.0]
We present new designs of graph convolutional neural networks (GCNs) on 3D meshes for 3D object classification and segmentation.
We use the faces of the mesh as basic processing units and represent a 3D mesh as a graph where each node corresponds to a face.
arXiv Detail & Related papers (2021-06-30T02:17:16Z) - Learning non-Gaussian graphical models via Hessian scores and triangular transport [6.308539010172309]
We propose an algorithm for learning the Markov structure of continuous and non-Gaussian distributions.
Our algorithm SING estimates the density using a deterministic coupling, induced by a triangular transport map, and iteratively exploits sparse structure in the map to reveal sparsity in the graph.
arXiv Detail & Related papers (2021-01-08T16:42:42Z) - Mix Dimension in Poincaré Geometry for 3D Skeleton-based Action Recognition [57.98278794950759]
Graph Convolutional Networks (GCNs) have already demonstrated their powerful ability to model irregular data.
We present a novel spatial-temporal GCN architecture defined via Poincaré geometry.
We evaluate our method on two of the current largest-scale 3D datasets.
arXiv Detail & Related papers (2020-07-30T18:23:18Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolutional networks (GCNs).
We design a specific GAE-based model for graph clustering that is consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)