Graph Geometry Interaction Learning
- URL: http://arxiv.org/abs/2010.12135v1
- Date: Fri, 23 Oct 2020 02:40:28 GMT
- Title: Graph Geometry Interaction Learning
- Authors: Shichao Zhu, Shirui Pan, Chuan Zhou, Jia Wu, Yanan Cao and Bin Wang
- Abstract summary: We develop a novel Geometry Interaction Learning (GIL) method for graphs, a well-suited and efficient alternative for learning abundant geometric properties in graphs.
Our method endows each node with the freedom to determine the importance of each geometry space via a flexible dual feature interaction learning and probability assembling mechanism.
Promising experimental results are presented for five benchmark datasets on node classification and link prediction tasks.
- Score: 41.10468385822182
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While numerous approaches have been developed to embed graphs into either
Euclidean or hyperbolic spaces, they do not fully utilize the information
available in graphs, or lack the flexibility to model intrinsic complex graph
geometry. To utilize the strength of both Euclidean and hyperbolic geometries,
we develop a novel Geometry Interaction Learning (GIL) method for graphs, a
well-suited and efficient alternative for learning abundant geometric
properties in graphs. GIL captures more informative internal structural
features with low dimensions while maintaining conformal invariance of each
space. Furthermore, our method endows each node with the freedom to determine the
importance of each geometry space via a flexible dual feature interaction
learning and probability assembling mechanism. Promising experimental results
are presented for five benchmark datasets on node classification and link
prediction tasks.
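The dual-geometry idea in the abstract — embed each node in both Euclidean and hyperbolic space, then let every node weight the class probabilities coming from each geometry — can be illustrated with a small sketch. This is a minimal illustration under assumed simplifications, not the authors' implementation: `exp_map_zero`, `assemble_probs`, and the per-node weights are hypothetical names, and the Poincaré-ball exponential map at the origin stands in for the full hyperbolic network.

```python
import numpy as np

def exp_map_zero(v, c=1.0):
    """Exponential map at the origin of the Poincare ball (curvature -c):
    projects Euclidean (tangent) vectors into hyperbolic space."""
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), 1e-9)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def assemble_probs(logits_euc, logits_hyp, node_weight):
    """Per-node probability assembling: each node mixes the class
    distributions from the two geometries with its own weight w in [0, 1]."""
    w = node_weight[:, None]
    return w * softmax(logits_euc) + (1.0 - w) * softmax(logits_hyp)
```

Since both softmax outputs are valid distributions and the mixture weights sum to one, the assembled per-node output is again a probability distribution over classes.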
Related papers
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, named soft manifolds, that can address this mismatch.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z) - Improving Heterogeneous Graph Learning with Weighted Mixed-Curvature Product Manifold [4.640835690336652]
In graph representation learning, the complex geometric structure of the input graph, e.g. hidden relations among nodes, is well captured in embedding space.
Standard Euclidean embedding spaces have a limited capacity in representing graphs of varying structures.
A promising candidate for the faithful embedding of data with varying structure is product manifold embedding spaces.
arXiv Detail & Related papers (2023-07-10T12:20:50Z) - Modeling Graphs Beyond Hyperbolic: Graph Neural Networks in Symmetric Positive Definite Matrices [8.805129821507046]
Real-world graph data is characterized by multiple types of geometric and topological features.
We construct graph neural networks that can robustly handle complex graphs.
arXiv Detail & Related papers (2023-06-24T21:50:53Z) - Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from the Euclidean and hyperbolic perspectives simultaneously, aiming to combine their strengths in modeling rich semantics and complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms the strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z) - Hermitian Symmetric Spaces for Graph Embeddings [0.0]
We learn continuous representations of graphs in spaces of symmetric matrices over C.
These spaces offer a rich geometry that simultaneously admits hyperbolic and Euclidean subspaces.
The proposed models are able to automatically adapt to very dissimilar arrangements without any a priori estimates of graph features.
arXiv Detail & Related papers (2021-05-11T18:14:52Z) - GraphOpt: Learning Optimization Models of Graph Formation [72.75384705298303]
We propose an end-to-end framework that learns an implicit model of graph structure formation and discovers an underlying optimization mechanism.
The learned objective can serve as an explanation for the observed graph properties, thereby lending itself to transfer across different graphs within a domain.
GraphOpt poses link formation in graphs as a sequential decision-making process and solves it using a maximum entropy inverse reinforcement learning algorithm.
arXiv Detail & Related papers (2020-07-07T16:51:39Z) - Geometric Graph Representations and Geometric Graph Convolutions for Deep Learning on Three-Dimensional (3D) Graphs [0.8722210937404288]
The geometry of three-dimensional (3D) graphs, consisting of nodes and edges, plays a crucial role in many important applications.
We define three types of geometric graph representations: positional, angle-geometric and distance-geometric.
For proof of concept, we use the distance-geometric graph representation for geometric graph convolutions.
The results of geometric graph convolutions, for the ESOL and FreeSolv datasets, show significant improvement over those of standard graph convolutions.
arXiv Detail & Related papers (2020-06-02T17:08:59Z) - Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
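The affine skip connection described in the last entry — a fully connected (affine) layer added to the output of any graph convolution operator — reduces to a one-line pattern. The sketch below is a hedged illustration, not the paper's code: the mean-aggregation convolution is an assumed stand-in for "any graph convolution operator", and all function and parameter names are hypothetical.

```python
import numpy as np

def mean_graph_conv(X, A, W):
    """Simple mean-aggregation graph convolution (an illustrative
    stand-in for 'any graph convolution operator')."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)  # avoid div by 0
    return ((A @ X) / deg) @ W

def affine_skip_layer(X, A, W_conv, W_skip, b_skip):
    """Affine skip connection: the graph convolution output plus a fully
    connected (affine) transform of the layer input."""
    return mean_graph_conv(X, A, W_conv) + X @ W_skip + b_skip
```

The skip term `X @ W_skip + b_skip` depends only on the node's own features, so it preserves per-node information even when neighborhood aggregation smooths it away.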
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences arising from its use.