Isometric Transformation Invariant and Equivariant Graph Convolutional
Networks
- URL: http://arxiv.org/abs/2005.06316v4
- Date: Wed, 10 Mar 2021 12:41:51 GMT
- Title: Isometric Transformation Invariant and Equivariant Graph Convolutional
Networks
- Authors: Masanobu Horie, Naoki Morita, Toshiaki Hishinuma, Yu Ihara, Naoto
Mitsume
- Abstract summary: We propose a set of transformation invariant and equivariant models based on graph convolutional networks, called IsoGCNs.
We demonstrate that the proposed model has a competitive performance compared to state-of-the-art methods on tasks related to geometrical and physical simulation data.
- Score: 5.249805590164902
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs are one of the most important data structures for representing
pairwise relations between objects. Specifically, a graph embedded in a
Euclidean space is essential to solving real problems, such as physical
simulations. A crucial requirement for applying graphs in Euclidean spaces to
physical simulations is learning and inferring the isometric transformation
invariant and equivariant features in a computationally efficient manner. In
this paper, we propose a set of transformation invariant and equivariant models
based on graph convolutional networks, called IsoGCNs. We demonstrate that the
proposed model has a competitive performance compared to state-of-the-art
methods on tasks related to geometrical and physical simulation data. Moreover,
the proposed model scales to graphs with 1M vertices and performs inference
faster than conventional finite element analysis, which existing equivariant
models cannot achieve.
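The core mechanism can be illustrated with a minimal sketch (hypothetical code for illustration, not the authors' implementation): building a rank-1 "isometric" adjacency from relative position vectors yields a graph convolution that is equivariant to isometric transformations, because relative positions are unaffected by translation and rotate with the frame.

```python
import numpy as np

def iso_adjacency(pos, edges):
    """Rank-1 adjacency holding the relative position vector
    x_j - x_i for each directed edge (i, j).  Translations cancel,
    and a rotation R maps every entry g to R @ g."""
    n = len(pos)
    G = np.zeros((n, n, 3))
    for i, j in edges:
        G[i, j] = pos[j] - pos[i]
    return G

def equivariant_conv(G, h):
    """Propagate scalar node features h (shape [n]) into vector
    features: out[i] = sum_j G[i, j] * h[j].  Rotating the input
    positions rotates the output vectors identically."""
    return np.einsum("ijd,j->id", G, h)
```

An invariant feature can then be recovered by contracting the vector output with itself (e.g. taking per-node norms), which is how invariant and equivariant quantities coexist in such layers.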
Related papers
- Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning [39.25135680793105]
We propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group.
Specifically, we show that such discrete equivariant message passing can be constructed by transforming geometric features into permutation-invariant embeddings.
We show that DEGNN is data-efficient and can generalize across scenarios such as unobserved orientations.
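One simple way such discrete invariance can arise (a sketch of the general group-orbit idea, not DEGNN's exact construction) is to apply every element of the discrete point group to a geometric feature and pool the orbit with a permutation-invariant reduction:

```python
import numpy as np

def c4_group():
    """The discrete point group C4: rotations by k * 90 degrees
    about the z-axis."""
    mats = []
    for k in range(4):
        t = np.pi / 2 * k
        mats.append(np.array([[np.cos(t), -np.sin(t), 0.0],
                              [np.sin(t),  np.cos(t), 0.0],
                              [0.0,        0.0,       1.0]]))
    return mats

def invariant_embedding(v, group):
    """Embed a geometric vector invariantly to the group: apply
    every group element to v, then pool the orbit with a
    permutation-invariant reduction (sorting each coordinate over
    the orbit, so the result does not depend on element order)."""
    orbit = np.stack([g @ v for g in group])
    return np.sort(orbit, axis=0)
```

Because acting on `v` with any group element only permutes the orbit, the pooled embedding is unchanged, i.e. invariant to the whole group.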
arXiv Detail & Related papers (2024-06-24T03:37:51Z)
- Similarity Equivariant Graph Neural Networks for Homogenization of Metamaterials [3.6443770850509423]
Soft, porous mechanical metamaterials exhibit pattern transformations that may have important applications in soft robotics, sound reduction and biomedicine.
We develop a machine learning-based approach that scales favorably to serve as a surrogate model.
We show that this network is more accurate and data-efficient than graph neural networks with fewer symmetries.
arXiv Detail & Related papers (2024-04-26T12:30:32Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- FAENet: Frame Averaging Equivariant GNN for Materials Modeling [123.19473575281357]
We introduce a flexible framework relying on stochastic frame averaging (SFA) to make any model E(3)-equivariant or invariant through data transformations.
We prove the validity of our method theoretically and empirically demonstrate its superior accuracy and computational scalability in materials modeling.
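The frame-averaging idea can be sketched as follows (a hypothetical PCA-based construction for illustration; FAENet's actual frames and its stochastic variant differ in detail): average the model over a set of canonicalizing frames derived from the data itself, so the prediction becomes invariant to rotations and translations of the input.

```python
import numpy as np
from itertools import product

def pca_frames(pos):
    """Candidate frames from PCA of the centered positions.  The
    sign ambiguity of eigenvectors gives up to 2^3 frames; a
    determinant check keeps only proper rotations."""
    c = pos - pos.mean(axis=0)
    _, V = np.linalg.eigh(c.T @ c)
    frames = []
    for s in product([1.0, -1.0], repeat=3):
        F = V * np.array(s)
        if np.linalg.det(F) > 0:
            frames.append(F)
    return frames

def frame_average(model, pos):
    """Invariant prediction: center the positions, canonicalize
    them with each frame, and average the model's outputs."""
    c = pos - pos.mean(axis=0)
    return np.mean([model(c @ F) for F in pca_frames(pos)], axis=0)
```

Rotating the input rotates the PCA frames by the same amount, so the canonicalized inputs (and hence the averaged prediction) are unchanged; an equivariant variant maps each output back through its frame before averaging.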
arXiv Detail & Related papers (2023-04-28T21:48:31Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN uses generalized coordinates to represent the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Data efficiency in graph networks through equivariance [1.713291434132985]
We introduce a novel architecture for graph networks which is equivariant to any transformation in the coordinate embeddings.
We show that, when trained on a minimal amount of data, the proposed architecture generalises perfectly to unseen data in a synthetic problem.
arXiv Detail & Related papers (2021-06-25T17:42:34Z)
- Self-Supervised Graph Representation Learning via Topology Transformations [61.870882736758624]
We present Topology Transformation Equivariant Representation learning, a general self-supervised learning paradigm for node representations of graph data.
In experiments, we apply the proposed model to the downstream node and graph classification tasks, and results show that the proposed method outperforms the state-of-the-art unsupervised approaches.
arXiv Detail & Related papers (2021-05-25T06:11:03Z)
- Beyond permutation equivariance in graph networks [1.713291434132985]
We introduce a novel architecture for graph networks which is equivariant to the Euclidean group in $n$-dimensions.
Our model is designed to work with graph networks in their most general form, thus including particular variants as special cases.
arXiv Detail & Related papers (2021-03-25T18:36:09Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- Lossless Compression of Structured Convolutional Models via Lifting [14.63152363481139]
We introduce a simple and efficient technique to detect the symmetries and compress the neural models without loss of any information.
We demonstrate through experiments that such compression can lead to significant speedups of structured convolutional models.
arXiv Detail & Related papers (2020-07-13T08:02:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.