Non-Euclidean Spatial Graph Neural Network
- URL: http://arxiv.org/abs/2312.10808v2
- Date: Wed, 10 Jan 2024 15:22:01 GMT
- Title: Non-Euclidean Spatial Graph Neural Network
- Authors: Zheng Zhang, Sirui Li, Jingcheng Zhou, Junxiang Wang, Abhinav
Angirekula, Allen Zhang and Liang Zhao
- Abstract summary: A novel message-passing-based neural network is proposed to combine graph topology and spatial geometry.
We theoretically guarantee that the learned representations are provably invariant to important symmetries such as rotation or translation.
- Score: 13.569970309961777
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spatial networks are networks whose graph topology is constrained by their
embedded spatial space. Understanding the coupled spatial-graph properties is
crucial for extracting powerful representations from spatial networks.
Because these properties are coupled, merely combining individual spatial and
network representations cannot reveal the underlying interaction mechanism of
spatial networks. Moreover, existing spatial network representation learning
methods can only handle networks embedded in Euclidean space and cannot fully
exploit the rich geometric information carried by irregular, non-uniform
non-Euclidean spaces. To address this issue, we propose a novel generic
framework to learn the representation of spatial networks that are embedded in
non-Euclidean manifold space. Specifically, a novel message-passing-based
neural network is proposed to combine graph topology and spatial geometry,
where spatial geometry is extracted as messages on the edges. We theoretically
guarantee that the learned representations are provably invariant to important
symmetries such as rotation and translation, while retaining sufficient
expressive power to distinguish different geometric structures. The
strength of our proposed method is demonstrated through extensive experiments
on both synthetic and real-world datasets.
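No code accompanies the abstract here, so the following is a minimal sketch of the central idea in PyTorch, assuming the rotation- and translation-invariant geometric information on each edge is reduced to a pairwise distance. The layer structure and the choice of distance as the sole geometric message are illustrative simplifications, not the authors' actual manifold-aware architecture.

```python
import torch
import torch.nn as nn

class InvariantSpatialMPLayer(nn.Module):
    """One message-passing layer whose edge messages depend on spatial
    geometry only through pairwise distances, so the output is invariant
    to rotations and translations of the node coordinates (a simplified
    stand-in for the paper's manifold-aware geometric messages)."""

    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * dim + 1, dim), nn.ReLU())
        self.upd = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())

    def forward(self, h, pos, edge_index):
        # h: (N, dim) node features; pos: (N, d) coordinates;
        # edge_index: (2, E) source/target indices.
        src, dst = edge_index
        dist = (pos[src] - pos[dst]).norm(dim=-1, keepdim=True)  # invariant geometry
        m = self.msg(torch.cat([h[src], h[dst], dist], dim=-1))
        agg = torch.zeros_like(h).index_add_(0, dst, m)          # sum incoming messages
        return self.upd(torch.cat([h, agg], dim=-1))

# Toy usage: 4 nodes on a ring.
h = torch.randn(4, 8)
pos = torch.tensor([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
edges = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
out = InvariantSpatialMPLayer(8)(h, pos, edges)
```

Because the geometry enters only through distances, rigidly moving `pos` leaves `out` unchanged, which is the invariance property the abstract claims.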
Related papers
- Riemannian Residual Neural Networks [58.925132597945634]
We show how to extend the residual neural network (ResNet) to general Riemannian manifolds.
ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks.
arXiv Detail & Related papers (2023-10-16T02:12:32Z)
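As a hedged illustration of the construction summarized above, the sketch below replaces the Euclidean residual step x + f(x) with the manifold step exp_x(f(x)), using the unit sphere because its exponential map has a closed form; the residual block f and the tangent projection are hypothetical choices, not the paper's architecture.

```python
import torch
import torch.nn as nn

def sphere_exp(x, v, eps=1e-8):
    """Exponential map on the unit sphere: move from x along tangent v."""
    n = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.cos(n) * x + torch.sin(n) * v / n

class SphereResidualBlock(nn.Module):
    """Riemannian residual step x -> exp_x(f(x)): compute a candidate
    update, project it onto the tangent space at x, then follow the
    geodesic, so the output stays on the sphere (illustrative block)."""

    def __init__(self, dim):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))

    def forward(self, x):
        v = self.f(x)
        v = v - (v * x).sum(-1, keepdim=True) * x   # project onto T_x(sphere)
        return sphere_exp(x, v)

x = torch.nn.functional.normalize(torch.randn(5, 16), dim=-1)  # points on S^15
y = SphereResidualBlock(16)(x)
assert torch.allclose(y.norm(dim=-1), torch.ones(5), atol=1e-5)
```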
- Low-Rank Representations Towards Classification Problem of Complex Networks [0.0]
Complex networks representing social interactions, brain activities, and molecular structures have been widely studied in order to understand and predict their characteristics as graphs.
Models and algorithms for these networks are used in real-life applications, such as search engines and recommender systems.
We study the performance of such low-rank representations of real-life networks on a network classification problem.
arXiv Detail & Related papers (2022-10-20T19:56:18Z)
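A rough sketch of the recipe summarized above, assuming a truncated SVD of the adjacency matrix as the low-rank representation and simple pooled statistics as graph-level features; the paper's factorization and classifier may differ.

```python
import numpy as np

def low_rank_embedding(adj, k):
    """Rank-k node embedding from a truncated SVD of the adjacency matrix."""
    u, s, _ = np.linalg.svd(adj, full_matrices=False)
    return u[:, :k] * np.sqrt(s[:k])            # scale singular vectors

def graph_feature(adj, k=4):
    """Pool node embeddings into one fixed-size vector per graph,
    usable as input to any off-the-shelf classifier."""
    emb = low_rank_embedding(adj, k)
    return np.concatenate([emb.mean(0), emb.std(0)])

# Toy usage: feature vector for a random symmetric graph.
rng = np.random.default_rng(0)
a = (rng.random((10, 10)) < 0.3).astype(float)
a = np.triu(a, 1); a = a + a.T
print(graph_feature(a).shape)                   # (8,)
```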
- Quasi-orthogonality and intrinsic dimensions as measures of learning and generalisation [55.80128181112308]
We show that the dimensionality and quasi-orthogonality of a neural network's feature space may jointly serve as discriminants of the network's performance.
Our findings suggest important relationships between the networks' final performance and properties of their randomly initialised feature spaces.
arXiv Detail & Related papers (2022-03-30T21:47:32Z)
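The sketch below computes two illustrative proxies for the quantities named above: mean absolute pairwise cosine similarity as a quasi-orthogonality score, and the participation ratio of the feature covariance spectrum as an intrinsic-dimension estimate. Both are standard measures assumed here for illustration, not necessarily the paper's exact estimators.

```python
import numpy as np

def quasi_orthogonality(features):
    """Mean absolute pairwise cosine similarity between feature vectors;
    values near 0 indicate a nearly orthogonal (well-spread) feature space."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    g = f @ f.T
    off_diag = g[~np.eye(len(g), dtype=bool)]
    return np.abs(off_diag).mean()

def participation_ratio(features):
    """PCA-based intrinsic-dimension proxy: (sum lambda_i)^2 / sum lambda_i^2
    over the eigenvalues of the feature covariance matrix."""
    lam = np.linalg.eigvalsh(np.cov(features, rowvar=False))
    lam = np.clip(lam, 0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

feats = np.random.default_rng(1).normal(size=(200, 64))  # e.g. a layer's outputs
print(quasi_orthogonality(feats), participation_ratio(feats))
```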
- Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design [8.250374560598493]
Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently.
The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space.
We present a novel fully hyperbolic neural network that uses the concept of projections (embeddings), followed by intrinsic aggregation and a nonlinearity, all within the hyperbolic space.
arXiv Detail & Related papers (2021-12-03T03:20:27Z)
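A minimal sketch of common hyperbolic building blocks in the spirit of the entry above, assuming the Poincaré ball model with curvature -1; the tangent-space round trip shown is one standard way to build a hyperbolic layer, not necessarily the paper's nested projection scheme.

```python
import torch
import torch.nn as nn

def exp0(v, eps=1e-8):
    """Exponential map at the origin of the Poincaré ball (curvature -1)."""
    n = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(n) * v / n

def log0(x, eps=1e-8):
    """Logarithm map at the origin (inverse of exp0)."""
    n = x.norm(dim=-1, keepdim=True).clamp(eps, 1 - eps)
    return torch.atanh(n) * x / n

class HyperbolicLinear(nn.Module):
    """Linear layer on Poincaré-ball points: pull the point back to the
    tangent space at the origin, apply a Euclidean map plus nonlinearity,
    and push the result back onto the ball."""

    def __init__(self, d_in, d_out):
        super().__init__()
        self.w = nn.Linear(d_in, d_out)

    def forward(self, x):
        return exp0(torch.relu(self.w(log0(x))))

x = exp0(torch.randn(3, 5))           # points inside the unit ball
y = HyperbolicLinear(5, 5)(x)
assert (y.norm(dim=-1) < 1).all()     # output stays in the ball
```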
- Positional Encoder Graph Neural Networks for Geographic Data [1.840220263320992]
Graph neural networks (GNNs) provide a powerful and scalable solution for modeling continuous spatial data.
In this paper, we propose PE-GNN, a new framework that incorporates spatial context and correlation explicitly into the models.
arXiv Detail & Related papers (2021-11-19T10:41:49Z)
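A hedged sketch of the mechanism the entry above suggests: encode coordinates with multi-scale sinusoidal features, pass them through a small MLP, and concatenate the result with node features before any GNN layer. The function names and frequency ladder are assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

def sinusoidal_encode(coords, num_freqs=8):
    """Map raw (x, y) coordinates to multi-scale sin/cos features."""
    freqs = 2.0 ** torch.arange(num_freqs)          # geometric frequency ladder
    ang = coords.unsqueeze(-1) * freqs              # (N, 2, F)
    return torch.cat([ang.sin(), ang.cos()], dim=-1).flatten(1)  # (N, 4F)

class PositionalEncoder(nn.Module):
    """Learned spatial-context embedding: sinusoidal features of the
    coordinates passed through a small MLP, ready to concatenate with
    node features before any GNN layer (illustrative module)."""

    def __init__(self, num_freqs=8, dim=32):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(4 * num_freqs, dim), nn.ReLU(),
                                 nn.Linear(dim, dim))

    def forward(self, coords):
        return self.mlp(sinusoidal_encode(coords))

coords = torch.rand(100, 2)                  # e.g. normalized lon/lat
h = torch.randn(100, 16)                     # raw node features
h_aug = torch.cat([h, PositionalEncoder()(coords)], dim=-1)  # GNN input
```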
- DiffusionNet: Discretization Agnostic Learning on Surfaces [48.658589779470454]
We introduce a new approach to deep learning on 3D surfaces, based on the insight that a simple diffusion layer is highly effective for spatial communication.
The resulting networks automatically generalize across different samplings and resolutions of a surface.
We focus primarily on triangle mesh surfaces, and demonstrate state-of-the-art results for a variety of tasks including surface classification, segmentation, and non-rigid correspondence.
arXiv Detail & Related papers (2020-12-01T23:24:22Z)
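A minimal sketch of the diffusion-layer idea summarized above, assuming features are diffused in a precomputed Laplacian eigenbasis with one learned diffusion time per channel; a simple 1-D chain Laplacian stands in for the surface (cotangent) Laplacian a mesh would use.

```python
import torch
import torch.nn as nn

class LearnedDiffusion(nn.Module):
    """Diffuse each feature channel for a learned time t in a fixed
    Laplacian eigenbasis: u <- Phi exp(-lambda * t) Phi^T u.  Longer
    times mean wider spatial communication, independent of sampling."""

    def __init__(self, channels):
        super().__init__()
        self.log_t = nn.Parameter(torch.zeros(channels))  # per-channel time

    def forward(self, u, evals, evecs):
        # u: (N, C) features; evals: (K,); evecs: (N, K) Laplacian eigenbasis.
        spec = evecs.T @ u                                         # (K, C) spectrum
        decay = torch.exp(-evals.unsqueeze(1) * self.log_t.exp())  # (K, C)
        return evecs @ (spec * decay)

# Toy usage with a 1-D chain Laplacian standing in for a mesh Laplacian.
N = 20
L = 2 * torch.eye(N) - torch.diag(torch.ones(N - 1), 1) - torch.diag(torch.ones(N - 1), -1)
evals, evecs = torch.linalg.eigh(L)
out = LearnedDiffusion(4)(torch.randn(N, 4), evals[:10], evecs[:, :10])
```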
- A Point-Cloud Deep Learning Framework for Prediction of Fluid Flow Fields on Irregular Geometries [62.28265459308354]
The network learns an end-to-end mapping between spatial positions and CFD quantities.
Incompressible, laminar, steady flow past a cylinder with various cross-section shapes is considered.
The network predicts the flow fields hundreds of times faster than a conventional CFD solver.
arXiv Detail & Related papers (2020-10-15T12:15:02Z)
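A hedged sketch of such an end-to-end mapping, assuming a PointNet-style encoder that summarizes the geometry point cloud into a global code and a pointwise decoder from query positions to (u, v, p); the architecture is illustrative, not the paper's.

```python
import torch
import torch.nn as nn

class PointwiseFlowNet(nn.Module):
    """PointNet-style surrogate: encode the geometry point cloud into a
    global code, then predict (u, v, p) at each query position from its
    coordinates plus that code -- an end-to-end map positions -> fields."""

    def __init__(self, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        self.dec = nn.Sequential(nn.Linear(2 + hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 3))   # (u, v, p)

    def forward(self, surface_pts, query_pts):
        code = self.enc(surface_pts).max(dim=0).values   # permutation-invariant pool
        code = code.expand(query_pts.shape[0], -1)
        return self.dec(torch.cat([query_pts, code], dim=-1))

t = torch.linspace(0, 2 * torch.pi, 64)
cyl = torch.stack([torch.cos(t), torch.sin(t)], dim=-1)  # cylinder cross-section
fields = PointwiseFlowNet()(cyl, torch.rand(500, 2) * 4 - 2)
print(fields.shape)                                      # (500, 3)
```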
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges, reflecting the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
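A minimal sketch of the mechanism summarized above, assuming a complete DAG over computation nodes with one learnable gate per candidate edge so the wiring is discovered by gradient descent; the per-node ops and sigmoid gating are simplified stand-ins.

```python
import torch
import torch.nn as nn

class LearnableConnectivity(nn.Module):
    """Complete-DAG block: node j aggregates a weighted sum of all earlier
    nodes' outputs, with one learnable scalar per candidate edge, so the
    effective wiring is discovered by gradient descent."""

    def __init__(self, num_nodes, dim):
        super().__init__()
        self.ops = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_nodes))
        self.edge_w = nn.Parameter(torch.zeros(num_nodes, num_nodes))  # edge logits

    def forward(self, x):
        outs = [x]
        for j, op in enumerate(self.ops):
            w = torch.sigmoid(self.edge_w[j, : j + 1])       # gate earlier nodes
            agg = sum(wi * o for wi, o in zip(w, outs))
            outs.append(torch.relu(op(agg)))
        return outs[-1]

y = LearnableConnectivity(num_nodes=4, dim=8)(torch.randn(2, 8))
```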
- Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional function spaces (operators).
We formulate the approximation of the infinite-dimensional mapping by composing nonlinear activation functions with a class of integral operators.
Experiments confirm that the proposed graph kernel network has the desired properties and shows competitive performance compared to state-of-the-art solvers.
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
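A hedged sketch of one kernel integral layer in the spirit of the entry above: the kernel matrix on each edge is produced by a small MLP on the pair of spatial coordinates, and messages are averaged over neighbors. Names and sizes are assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn

class KernelIntegralLayer(nn.Module):
    """One graph kernel layer: v(x) <- relu(W v(x) + mean over neighbors y
    of K(x, y) v(y)), where each edge's kernel matrix K is produced by a
    small MLP on the pair of spatial coordinates."""

    def __init__(self, dim, coord_dim=2):
        super().__init__()
        self.w = nn.Linear(dim, dim)
        self.kappa = nn.Sequential(nn.Linear(2 * coord_dim, 64), nn.ReLU(),
                                   nn.Linear(64, dim * dim))
        self.dim = dim

    def forward(self, v, pos, edge_index):
        src, dst = edge_index
        k = self.kappa(torch.cat([pos[dst], pos[src]], dim=-1))
        k = k.view(-1, self.dim, self.dim)                      # (E, d, d)
        msg = torch.bmm(k, v[src].unsqueeze(-1)).squeeze(-1)    # K(x, y) v(y)
        agg = torch.zeros_like(v).index_add_(0, dst, msg)
        deg = torch.zeros(v.shape[0], 1).index_add_(0, dst, torch.ones(len(src), 1))
        return torch.relu(self.w(v) + agg / deg.clamp_min(1))

# Toy usage: random geometric graph on 30 points.
pos = torch.rand(30, 2)
edge = (torch.cdist(pos, pos) < 0.4).nonzero().T               # (2, E)
v = torch.randn(30, 8)
print(KernelIntegralLayer(8)(v, pos, edge).shape)              # (30, 8)
```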