LoNe Sampler: Graph node embeddings by coordinated local neighborhood sampling
- URL: http://arxiv.org/abs/2211.15114v1
- Date: Mon, 28 Nov 2022 08:04:26 GMT
- Title: LoNe Sampler: Graph node embeddings by coordinated local neighborhood sampling
- Authors: Konstantin Kutzkov
- Abstract summary: Local graph neighborhood sampling is a fundamental computational problem that is at the heart of algorithms for node representation learning.
We present LoNe Sampler, a suite of algorithms for generating discrete node embeddings by Local Neighborhood Sampling.
- Score: 0.7614628596146599
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Local graph neighborhood sampling is a fundamental computational problem that
is at the heart of algorithms for node representation learning. Several works
have presented algorithms for learning discrete node embeddings where graph
nodes are represented by discrete features such as attributes of neighborhood
nodes. Discrete embeddings offer several advantages compared to continuous
word2vec-like node embeddings: ease of computation, scalability, and
interpretability. We present LoNe Sampler, a suite of algorithms for generating
discrete node embeddings by Local Neighborhood Sampling, and address two
shortcomings of previous work. First, our algorithms have rigorously understood
theoretical properties. Second, we show how to generate approximate explicit
vector maps that avoid the expensive computation of a Gram matrix for the
training of a kernel model. Experiments on benchmark datasets confirm the
theoretical findings and demonstrate the advantages of the proposed methods.
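The abstract states what the embeddings are but not how a sample is drawn. As a rough illustration (not the authors' algorithm), the sketch below implements the generic recipe of coordinated local neighborhood sampling in the min-hash style of the COLOGNE paper listed below: every embedding coordinate of a node is the member of its k-hop neighborhood that minimizes a random hash shared by all nodes. The function names, the uniform random hashes, and the toy graph are illustrative assumptions.

```python
import random
from collections import deque

def k_hop_neighborhood(adj, node, k):
    """All nodes within k hops of `node` (including itself), found by BFS."""
    seen = {node}
    queue = deque([(node, 0)])
    while queue:
        u, dist = queue.popleft()
        if dist == k:
            continue
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append((v, dist + 1))
    return seen

def discrete_embeddings(adj, k=2, dim=8, seed=0):
    """Coordinate j of node u is the member of u's k-hop neighborhood that
    minimizes the j-th shared random hash -- classic min-hash coordination."""
    rng = random.Random(seed)
    hashes = [{v: rng.random() for v in adj} for _ in range(dim)]  # shared across all nodes
    return {u: tuple(min(k_hop_neighborhood(adj, u, k), key=h.__getitem__) for h in hashes)
            for u in adj}

# Toy undirected graph: two triangles joined by the edge (2, 3).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(discrete_embeddings(adj, k=1, dim=4))
```

Because the hashes are shared, two nodes receive the same value in a given coordinate with probability equal to the Jaccard similarity of their (unweighted) neighborhoods, which is what makes kernel methods over such discrete embeddings meaningful; the approximate explicit vector maps mentioned in the abstract would then stand in for that kernel without forming the full Gram matrix.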
Related papers
- A Differentially Private Clustering Algorithm for Well-Clustered Graphs [6.523602840064548]
We provide an efficient $(\epsilon, \delta)$-DP algorithm tailored specifically for such graphs.
Our algorithm works for well-clustered graphs with $k$ nearly-balanced clusters.
arXiv Detail & Related papers (2024-03-21T11:57:16Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
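NodeFormer's efficiency comes from pairing a Gumbel-Softmax relaxation with a kernelized (random-feature) approximation of all-pair attention. Only the first ingredient is sketched here, on made-up attention scores, so this is a minimal illustration of the relaxation rather than the model itself.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=np.random.default_rng(0)):
    """Differentiable relaxation of drawing one index from `logits`:
    perturb with Gumbel(0, 1) noise, then take a temperature-scaled softmax."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max()                          # numerical stability
    w = np.exp(y)
    return w / w.sum()

# Toy attention scores of one node over four candidate neighbors.
scores = np.array([2.0, 0.5, 0.1, 1.5])
print(gumbel_softmax(scores, tau=0.3))       # nearly one-hot: a "sampled" message-passing edge
```

As tau goes to 0 the output approaches a one-hot sample, so messages propagated with these weights behave like messages over sampled latent edges while remaining differentiable.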
- Efficient Graph Field Integrators Meet Point Clouds [59.27295475120132]
We present two new classes of algorithms for efficient field integration on graphs encoding point clouds.
The first class, SeparatorFactorization (SF), leverages the bounded genus of point cloud mesh graphs, while the second class, RFDiffusion (RFD), uses popular $\epsilon$-nearest-neighbor graph representations for point clouds.
arXiv Detail & Related papers (2023-02-02T08:33:36Z)
- STERLING: Synergistic Representation Learning on Bipartite Graphs [78.86064828220613]
A fundamental challenge of bipartite graph representation learning is how to extract node embeddings.
Most recent bipartite graph SSL methods are based on contrastive learning which learns embeddings by discriminating positive and negative node pairs.
We introduce a novel synergistic representation learning model (STERLING) to learn node embeddings without negative node pairs.
arXiv Detail & Related papers (2023-01-25T03:21:42Z)
- Unveiling the Sampling Density in Non-Uniform Geometric Graphs [69.93864101024639]
We consider graphs as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
In a social network, communities can be modeled as densely sampled areas, and hubs as nodes with a larger neighborhood radius.
We develop methods to estimate the unknown sampling density in a self-supervised fashion.
arXiv Detail & Related papers (2022-10-15T08:01:08Z)
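The generative model in the entry above is simple to instantiate. The sketch below builds a plain random geometric graph with uniform sampling density and a single global radius; the non-uniform densities and node-specific radii that the paper actually estimates are not modeled here, and all parameter values are arbitrary.

```python
import numpy as np

def random_geometric_graph(n=200, dim=2, radius=0.15, seed=1):
    """Sample n points uniformly from the unit cube and connect every pair
    whose Euclidean distance is below `radius`."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(size=(n, dim))
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    adj = (dist < radius) & ~np.eye(n, dtype=bool)   # boolean adjacency, no self-loops
    return pts, adj

pts, adj = random_geometric_graph()
print("average degree:", adj.sum(axis=1).mean())
```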
- Geometric Graph Representation Learning via Maximizing Rate Reduction [73.6044873825311]
Learning node representations benefits various downstream tasks in graph analysis such as community detection and node classification.
We propose Geometric Graph Representation Learning (G2R) to learn node representations in an unsupervised manner.
G2R maps nodes in distinct groups into different subspaces, while each subspace is compact and different subspaces are dispersed.
arXiv Detail & Related papers (2022-02-13T07:46:24Z)
- COLOGNE: Coordinated Local Graph Neighborhood Sampling [1.6498361958317633]
Replacing discrete unordered objects such as graph nodes by real-valued vectors is at the heart of many approaches to learning from graph data.
We address the problem of learning discrete node embeddings such that the coordinates of the node vector representations are graph nodes.
This opens the door to designing interpretable machine learning algorithms for graphs as all attributes originally present in the nodes are preserved.
arXiv Detail & Related papers (2021-02-09T11:39:06Z)
- Overlapping community detection in networks via sparse spectral decomposition [1.0660480034605242]
We consider the problem of estimating overlapping community memberships in a network, where each node can belong to multiple communities.
Our algorithm is based on sparse principal subspace estimation with iterative thresholding.
We show that a fixed point of the algorithm corresponds to correct node memberships under a version of the block model.
arXiv Detail & Related papers (2020-09-20T07:31:09Z)
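The entry above names the ingredients (sparse principal subspace estimation, iterative thresholding) without the update rule. The sketch below shows the generic pattern of orthogonal iteration with entrywise soft-thresholding on a toy block matrix; the paper's exact update, initialization, and membership-recovery step are not reproduced here.

```python
import numpy as np

def sparse_principal_subspace(A, k, thresh=0.1, iters=50, seed=2):
    """Estimate a sparse k-dimensional principal subspace of a symmetric matrix A
    by orthogonal iteration with soft-thresholding after each power step."""
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for _ in range(iters):
        Y = A @ V                                              # power step
        Y = np.sign(Y) * np.maximum(np.abs(Y) - thresh, 0.0)   # zero out small entries
        V, _ = np.linalg.qr(Y)                                 # re-orthonormalize
    return V

# Toy adjacency matrix: two overlapping communities {0..4} and {3..7}.
A = np.zeros((8, 8))
A[:5, :5] = 1.0
A[3:, 3:] = 1.0
np.fill_diagonal(A, 0.0)
print(np.round(sparse_principal_subspace(A, k=2), 2))
```

In methods of this style, near-zero rows within a column mark nodes outside the corresponding community, which is how sparsity encodes overlapping memberships.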
- Online Dense Subgraph Discovery via Blurred-Graph Feedback [87.9850024070244]
We introduce a novel learning problem for dense subgraph discovery.
We first propose an edge-time algorithm that obtains a nearly-optimal solution with high probability.
We then design a more scalable algorithm with a theoretical guarantee.
arXiv Detail & Related papers (2020-06-24T11:37:33Z)
- Learning Representations using Spectral-Biased Random Walks on Graphs [18.369974607582584]
We study how much a probabilistic bias in the random walk process affects the quality of the nodes it picks.
We succinctly capture this neighborhood as a probability measure based on the spectrum of the node's neighborhood subgraph represented as a normalized Laplacian matrix.
We empirically evaluate our approach against several state-of-the-art node embedding techniques on a wide variety of real-world datasets.
arXiv Detail & Related papers (2020-05-19T20:42:43Z)
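The key construction compressed into the entry above is that each node's neighborhood subgraph is summarized by the spectrum of its normalized Laplacian, and a distance between such spectra biases the walk. The sketch below makes this concrete with 1-hop ego subgraphs and a padded Euclidean distance between spectra; the probability measure and distance used in the paper are not reproduced, and the toy graph, beta, and function names are illustrative.

```python
import numpy as np

def ego_spectrum(adj, node):
    """Eigenvalues of the normalized Laplacian of `node`'s 1-hop ego subgraph."""
    nodes = sorted({node} | set(adj[node]))
    idx = {v: i for i, v in enumerate(nodes)}
    A = np.zeros((len(nodes), len(nodes)))
    for u in nodes:
        for v in adj[u]:
            if v in idx:
                A[idx[u], idx[v]] = 1.0
    deg = A.sum(axis=1)
    d = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    L = np.eye(len(nodes)) - d[:, None] * A * d[None, :]
    return np.sort(np.linalg.eigvalsh(L))

def biased_step_probabilities(adj, u, beta=1.0):
    """Transition probabilities from u that favour neighbours whose ego-subgraph
    spectra are close to u's own (softmax of negative spectral distances)."""
    su = ego_spectrum(adj, u)
    dists = []
    for v in adj[u]:
        sv = ego_spectrum(adj, v)
        m = max(len(su), len(sv))
        a, b = np.pad(su, (0, m - len(su))), np.pad(sv, (0, m - len(sv)))
        dists.append(np.linalg.norm(a - b))
    w = np.exp(-beta * np.array(dists))
    return dict(zip(adj[u], w / w.sum()))

# Toy undirected graph given as an adjacency-list dict.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}
print(biased_step_probabilities(adj, u=1))
```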
- Neighborhood and Graph Constructions using Non-Negative Kernel Regression [42.16401154367232]
We present an alternative view of neighborhood selection, where we show that neighborhood construction is equivalent to a sparse signal approximation problem.
We also propose an algorithm, non-negative kernel regression (NNK), for obtaining neighborhoods that lead to better sparse representation.
arXiv Detail & Related papers (2019-10-21T13:58:14Z)
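The NNK objective summarized above (approximate a point's feature-space image by a non-negative combination of candidate neighbors) can be written down directly. The sketch below uses an RBF kernel and plain projected gradient descent, both placeholder choices rather than the paper's solver, and all names and parameters are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=5.0):
    """Gaussian kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nnk_weights(X, i, candidates, gamma=5.0, steps=500):
    """Minimize ||phi(x_i) - sum_j theta_j phi(x_j)||^2 subject to theta >= 0 by
    projected gradient descent; the nonzero theta_j define the neighborhood of x_i."""
    K_SS = rbf_kernel(X[candidates], X[candidates], gamma)     # Gram matrix of candidates
    k_Si = rbf_kernel(X[candidates], X[i:i + 1], gamma)[:, 0]  # similarities to x_i
    theta = np.full(len(candidates), 1.0 / len(candidates))
    lr = 1.0 / np.linalg.eigvalsh(K_SS)[-1]                    # safe step size (1 / Lipschitz constant)
    for _ in range(steps):
        grad = K_SS @ theta - k_Si                             # (half the) gradient of the quadratic objective
        theta = np.maximum(theta - lr * grad, 0.0)             # gradient step + projection onto theta >= 0
    return {c: w for c, w in zip(candidates, theta) if w > 1e-6}

rng = np.random.default_rng(3)
X = rng.uniform(size=(30, 2))
print(nnk_weights(X, i=0, candidates=list(range(1, 30))))
```

Compared with plain k-nearest neighbors, the non-negativity constraint tends to zero out candidates that are already covered by closer ones, which is what yields the sparser, geometry-aware neighborhoods the entry refers to.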
This list is automatically generated from the titles and abstracts of the papers in this site.