GAGE: Geometry Preserving Attributed Graph Embeddings
- URL: http://arxiv.org/abs/2011.01422v2
- Date: Wed, 23 Feb 2022 15:51:49 GMT
- Title: GAGE: Geometry Preserving Attributed Graph Embeddings
- Authors: Charilaos I. Kanatsoulis and Nicholas D. Sidiropoulos
- Abstract summary: This paper presents a novel approach for node embedding in attributed networks.
It preserves the distances of both the connections and the attributes.
An effective and lightweight algorithm is developed to tackle the learning task.
- Score: 34.25102483600248
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Node embedding is the task of extracting concise and informative
representations of certain entities that are connected in a network. Various
real-world networks include information about both node connectivity and
certain node attributes, in the form of features or time-series data. Modern
representation learning techniques employ both the connectivity and attribute
information of the nodes to produce embeddings in an unsupervised manner. In
this context, deriving embeddings that preserve the geometry of the network and
the attribute vectors would be highly desirable, as they would reflect both the
topological neighborhood structure and proximity in feature space. While this
is fairly straightforward to maintain when only observing the connectivity or
attribute information of the network, preserving the geometry of both types of
information is challenging. This paper proposes a novel tensor factorization
approach for node embedding in attributed networks that preserves the distances
of both the connections and the attributes. Furthermore, an effective and
lightweight algorithm is developed to tackle the learning task, and judicious
experiments with multiple state-of-the-art baselines suggest that the proposed
algorithm offers significant performance improvements in downstream tasks.
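The abstract does not spell out the factorization itself, so the snippet below is only a minimal, hypothetical sketch of the general idea of jointly factorizing connectivity and attribute geometry; it is not the authors' GAGE algorithm. It assumes the numpy and tensorly packages, and the proximity matrices, rank, and normalization are illustrative placeholders.

```python
# Hypothetical sketch (not the GAGE algorithm from the paper): build one
# proximity matrix from the graph and one from the node attributes, stack
# them into a 3-way tensor, and read node embeddings off a CP decomposition.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)

# Toy attributed network: adjacency A (n x n) and node attributes X (n x f).
n, f, rank = 30, 8, 4
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T                                     # undirected, no self-loops
X = rng.standard_normal((n, f))

# Two "views" of node proximity: connectivity geometry (1- and 2-hop
# neighbors) and attribute geometry (Gaussian kernel on Euclidean distances).
S_graph = A + 0.5 * (A @ A > 0)
D_attr = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
S_attr = np.exp(-(D_attr ** 2) / (D_attr.mean() ** 2))

# Stack the views into an n x n x 2 tensor and factorize it jointly, so one
# shared node-factor matrix has to explain both geometries at once.
T = np.stack([S_graph / S_graph.max(), S_attr], axis=-1)
cp = parafac(tl.tensor(T), rank=rank, n_iter_max=500, init="random", random_state=0)

embeddings = cp.factors[0]                      # one rank-dim vector per node
print(embeddings.shape)                         # (30, 4)
```

The point of the stacking step is that a single node-factor matrix must reconstruct both slices, so nodes that are close in either the topological or the attribute sense are pushed toward nearby embedding vectors; the paper's actual construction and constraints are more elaborate than this toy.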
Related papers
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Exact Recovery and Bregman Hard Clustering of Node-Attributed Stochastic Block Model [0.16385815610837165]
This paper presents an information-theoretic criterion for the exact recovery of community labels.
It shows how network and attribute information can be exchanged to achieve exact recovery.
It also presents an iterative clustering algorithm that maximizes the joint likelihood.
arXiv Detail & Related papers (2023-10-30T16:46:05Z)
- Network Alignment with Transferable Graph Autoencoders [79.89704126746204]
We propose a novel graph autoencoder architecture designed to extract powerful and robust node embeddings.
We prove that the generated embeddings are associated with the eigenvalues and eigenvectors of the graphs.
Our proposed framework also leverages transfer learning and data augmentation to achieve efficient network alignment at a very large scale without retraining.
arXiv Detail & Related papers (2023-10-05T02:58:29Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks (CONN), a GNN architecture tailored for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- BSAL: A Framework of Bi-component Structure and Attribute Learning for Link Prediction [33.488229191263564]
We propose a bicomponent structural and attribute learning framework (BSAL) that is designed to adaptively leverage information from topology and feature spaces.
BSAL constructs a semantic topology via the node attributes and then gets the embeddings regarding the semantic view.
It provides a flexible and easy-to-implement solution to adaptively incorporate the information carried by the node attributes.
arXiv Detail & Related papers (2022-04-18T03:12:13Z)
- Learning Asymmetric Embedding for Attributed Networks via Convolutional Neural Network [19.611523749659355]
We propose a novel deep asymmetric attributed network embedding model based on a convolutional graph neural network, called AAGCN.
The main idea is to maximally preserve the asymmetric proximity and asymmetric similarity of directed attributed networks.
We test the performance of AAGCN on three real-world networks for network reconstruction, link prediction, node classification and visualization tasks.
arXiv Detail & Related papers (2022-02-13T13:35:15Z)
- Variational Co-embedding Learning for Attributed Network Clustering [30.7006907516984]
Recent works for attributed network clustering utilize graph convolution to obtain node embeddings and simultaneously perform clustering assignments on the embedding space.
We propose a variational co-embedding learning model for attributed network clustering (ANC).
ANC is composed of dual variational auto-encoders to simultaneously embed nodes and attributes.
arXiv Detail & Related papers (2021-04-15T08:11:47Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Node2Seq: Towards Trainable Convolutions in Graph Neural Networks [59.378148590027735]
We propose a graph network layer, known as Node2Seq, to learn node embeddings with explicitly trainable weights for different neighboring nodes.
For a target node, our method sorts its neighboring nodes via an attention mechanism and then employs 1D convolutional neural networks (CNNs) to enable explicit weights for information aggregation.
In addition, we propose to incorporate non-local information for feature learning in an adaptive manner based on the attention scores.
arXiv Detail & Related papers (2021-01-06T03:05:37Z)
- Adversarial Context Aware Network Embeddings for Textual Networks [8.680676599607123]
Existing approaches learn embeddings of text and network structure by enforcing embeddings of connected nodes to be similar.
This implies that these approaches require edge information for learning embeddings and they cannot learn embeddings of unseen nodes.
We propose an approach that achieves both modality fusion and the capability to learn embeddings of unseen nodes.
arXiv Detail & Related papers (2020-11-05T05:20:01Z)
- Graph Prototypical Networks for Few-shot Learning on Attributed Networks [72.31180045017835]
We propose a graph meta-learning framework -- Graph Prototypical Networks (GPN).
GPN is able to perform meta-learning on an attributed network and derive a highly generalizable model for handling the target classification task.
arXiv Detail & Related papers (2020-06-23T04:13:23Z)