Enhancing Hierarchical Information by Using Metric Cones for Graph Embedding
- URL: http://arxiv.org/abs/2102.08014v1
- Date: Tue, 16 Feb 2021 08:23:59 GMT
- Title: Enhancing Hierarchical Information by Using Metric Cones for Graph Embedding
- Authors: Daisuke Takehara, Kei Kobayashi
- Abstract summary: Poincaré embedding has been proposed to capture the hierarchical structure of graphs.
However, most existing embedding spaces admit isometric mappings, so the choice of the origin point is arbitrary.
We propose graph embedding in a metric cone to solve this problem.
- Score: 3.700709497727248
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph embedding is becoming an important method with applications in various
areas, including social networks and knowledge graph completion. In particular,
Poincaré embedding has been proposed to capture the hierarchical structure of
graphs, and its effectiveness has been reported. However, most of the existing
methods have isometric mappings in the embedding space, and the choice of the
origin point can be arbitrary. This fact is not desirable when the distance
from the origin is used as an indicator of hierarchy, as in the case of
Poincaré embedding. In this paper, we propose graph embedding in a metric
cone to solve such a problem, and we gain further benefits: 1) we provide an
indicator of hierarchical information that is both geometrically and
intuitively natural to interpret, 2) we can extract the hierarchical structure
from a graph embedding output of other methods by learning additional
one-dimensional parameters, and 3) we can change the curvature of the embedding
space via a hyperparameter.
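As a concrete reference for the three points above, the sketch below implements the standard Euclidean metric cone over a base embedding space: a point of the cone is a pair (x, s) of a base point and a nonnegative height, and the height serves as the hierarchy indicator of point 1. The `beta` rescaling of the base metric is only a stand-in for the curvature hyperparameter of point 3; the paper's exact parametrization may differ.

```python
import numpy as np

def cone_distance(x, s, y, t, base_dist, beta=1.0):
    """Distance in the Euclidean metric cone over a base space.

    (x, s), (y, t): base points with nonnegative heights; the height
        (distance from the cone's apex) acts as the hierarchy indicator.
    base_dist: callable returning the base-space distance d_X(x, y),
        e.g. the distance used by a pre-trained embedding.
    beta: rescaling of the base metric, used here as a stand-in for a
        curvature hyperparameter (an assumption, not necessarily the
        paper's parametrization).
    """
    angle = min(base_dist(x, y) / beta, np.pi)
    return float(np.sqrt(max(s**2 + t**2 - 2.0 * s * t * np.cos(angle), 0.0)))

# Toy usage with a Euclidean base space.
euclidean = lambda a, b: float(np.linalg.norm(a - b))
d = cone_distance(np.array([0.1, 0.2]), 0.5, np.array([0.4, 0.1]), 1.5, euclidean)
```

Point 2 then corresponds to freezing the base points x (taken from the output of another embedding method) and fitting only one height per node, i.e. a single additional one-dimensional parameter per node.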
Related papers
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z)
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, called soft manifolds, that addresses this issue.
Using soft manifolds for graph embedding provides continuous spaces for downstream data analysis tasks on complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
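For reference on the entry above: the 2-Wasserstein (Bures-Wasserstein) distance between centered Gaussians has a closed form, and a standard way to attach a "smooth graph signal distribution" to a graph is a zero-mean Gaussian whose covariance is the pseudoinverse of the graph Laplacian. The sketch below combines these textbook ingredients with the well-known fixed-point iteration for Gaussian barycenters; it is an illustration under those assumptions (all graphs sharing the same node set), not the paper's exact embedding or algorithm.

```python
import numpy as np
from scipy.linalg import sqrtm

def graph_to_cov(W, eps=1e-6):
    """Adjacency matrix -> covariance of a smooth-graph-signal Gaussian
    (Laplacian pseudoinverse, lightly regularized). A common choice; the
    paper's embedding may differ."""
    L = np.diag(W.sum(axis=1)) - W
    return np.linalg.pinv(L) + eps * np.eye(len(W))

def bures_wasserstein(A, B):
    """Closed-form 2-Wasserstein distance between N(0, A) and N(0, B)."""
    sA = np.real(sqrtm(A))
    cross = np.real(sqrtm(sA @ B @ sA))
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(d2, 0.0)))

def bw_barycenter(covs, iters=50):
    """Fixed-point iteration for the Bures-Wasserstein barycenter of
    centered Gaussians; a 'mean graph' is read off from this covariance."""
    S = np.mean(covs, axis=0)
    for _ in range(iters):
        rS = np.real(sqrtm(S))
        M = np.mean([np.real(sqrtm(rS @ C @ rS)) for C in covs], axis=0)
        rS_inv = np.linalg.inv(rS)
        S = rS_inv @ M @ M @ rS_inv
    return S
```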
- Hyperbolic Diffusion Embedding and Distance for Hierarchical Representation Learning [13.976918651426205]
This paper presents a new method for hierarchical data embedding and distance.
Our method relies on combining diffusion geometry, a central approach to manifold learning, and hyperbolic geometry.
We show theoretically that our embedding and distance recover the underlying hierarchical structure.
arXiv Detail & Related papers (2023-05-30T11:49:39Z)
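The entry above names diffusion geometry as one of its two ingredients. For orientation only, here is a minimal sketch of standard diffusion maps (Coifman-Lafon); it shows just that generic manifold-learning step and makes no attempt to reproduce the paper's hyperbolic embedding or distance.

```python
import numpy as np

def diffusion_map(X, sigma=1.0, dim=2, t=1):
    """Standard diffusion-map embedding: Gaussian affinity kernel,
    row-normalized Markov matrix, spectral coordinates scaled by
    eigenvalue**t."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * sigma ** 2))          # pairwise affinities
    P = K / K.sum(axis=1, keepdims=True)          # random-walk transition matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-np.real(evals))
    evals, evecs = np.real(evals[order]), np.real(evecs[:, order])
    # Drop the trivial constant eigenvector (eigenvalue 1).
    return evecs[:, 1:dim + 1] * (evals[1:dim + 1] ** t)
```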
- Semantic Random Walk for Graph Representation Learning in Attributed Graphs [2.318473106845779]
We propose a novel semantic graph representation (SGR) method to formulate the joint optimization of the two heterogeneous sources into a common high-order proximity based framework.
Conventional embedding methods that consider high-order topology proximities can then be easily applied to the newly constructed graph to learn the representations of both node and attribute.
The learned attribute embeddings can also effectively support semantic-oriented inference tasks, helping to reveal the graph's deeper semantics.
arXiv Detail & Related papers (2023-05-11T02:35:16Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it generates the edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Graph Transplant: Node Saliency-Guided Graph Mixup with Local Structure Preservation [27.215800308343322]
We present the first Mixup-like graph augmentation method at the graph-level called Graph Transplant.
Our method identifies a sub-structure to use as the mix unit, preserving local information.
We extensively validate our method with diverse GNN architectures on multiple graph classification benchmark datasets.
arXiv Detail & Related papers (2021-11-10T11:10:13Z)
- Understanding graph embedding methods and their applications [1.14219428942199]
Graph embedding techniques can be effective in converting high-dimensional sparse graphs into low-dimensional, dense and continuous vector spaces.
The generated nonlinear and highly informative graph embeddings in the latent space can be conveniently used to address different downstream graph analytics tasks.
arXiv Detail & Related papers (2020-12-15T00:30:22Z)