Random Projections of Sparse Adjacency Matrices
- URL: http://arxiv.org/abs/2309.01360v1
- Date: Mon, 4 Sep 2023 04:53:52 GMT
- Title: Random Projections of Sparse Adjacency Matrices
- Authors: Frank Qiu
- Abstract summary: We show that our random projections retain the functionality of their underlying adjacency matrices while having extra properties that make them attractive as dynamic graph representations.
We conclude by characterizing our random projection as a distance-preserving map of adjacency matrices analogous to the usual Johnson-Lindenstrauss map.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We analyze a random projection method for adjacency matrices, studying its
utility in representing sparse graphs. We show that these random projections
retain the functionality of their underlying adjacency matrices while having
extra properties that make them attractive as dynamic graph representations. In
particular, they can represent graphs of different sizes and vertex sets in the
same space, allowing for the aggregation and manipulation of graphs in a
unified manner. We also provide results on how the size of the projections needs
to scale in order to preserve accurate graph operations, showing that the size
of the projections can scale linearly with the number of vertices while
accurately retaining first-order graph information. We conclude by
characterizing our random projection as a distance-preserving map of adjacency
matrices analogous to the usual Johnson-Lindenstrauss map.
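To make the flavor of this concrete, here is a minimal, hypothetical sketch of one such projection scheme: each vertex label is assigned a random, approximately orthonormal code, and a graph is projected as the sum of outer products of its edge endpoints' codes. The dimension d, the label-seeding trick, and the helper names are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Projection dimension (illustrative; the paper studies how this must scale
# with the number of vertices to keep graph operations accurate).
d = 1024

def vertex_code(label):
    """Random, approximately orthonormal code for a vertex label (hypothetical scheme)."""
    g = np.random.default_rng(abs(hash(label)) % (2**32))
    v = g.standard_normal(d)
    return v / np.linalg.norm(v)

def project_graph(edges):
    """Project a graph, given as an iterable of (u, v) edges, into a d x d matrix."""
    M = np.zeros((d, d))
    for u, v in edges:
        M += np.outer(vertex_code(u), vertex_code(v))
    return M

def edge_query(M, u, v):
    """Approximately recover the adjacency entry A[u, v] from the projection."""
    return float(vertex_code(u) @ M @ vertex_code(v))

# Graphs over different vertex sets live in the same d x d space,
# so they can be aggregated by simple addition.
G1 = project_graph([("a", "b"), ("b", "c")])
G2 = project_graph([("b", "c"), ("c", "d")])
union = G1 + G2

print(round(edge_query(union, "b", "c"), 2))  # close to 2: edge present in both graphs
print(round(edge_query(union, "a", "d"), 2))  # close to 0: edge absent from both
print(round(np.linalg.norm(G1 - G2), 2))      # close to sqrt(2) = ||A1 - A2||_F (JL-style)
```

Because codes are keyed to vertex labels rather than to positions in a fixed matrix, graphs over different or overlapping vertex sets can be added and queried in the same space, which is the dynamic-representation property the abstract emphasizes; the scaling results concern how large d must be for such queries to stay accurate.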
Related papers
- Discrete Graph Auto-Encoder [52.50288418639075]
We introduce a new framework named Discrete Graph Auto-Encoder (DGAE).
We first use a permutation-equivariant auto-encoder to convert graphs into sets of discrete latent node representations.
In the second step, we sort the sets of discrete latent representations and learn their distribution with a specifically designed auto-regressive model.
arXiv Detail & Related papers (2023-06-13T12:40:39Z)
- Creating generalizable downstream graph models with random projections [22.690120515637854]
We investigate graph representation learning approaches that enable models to generalize across graphs.
We show that using random projections to estimate multiple powers of the transition matrix allows us to build a set of isomorphism-invariant features.
The resulting features can be used to recover enough information about the local neighborhood of a node to enable inference with performance competitive with other approaches; a sketch of this idea appears after the related-papers list below.
arXiv Detail & Related papers (2023-02-17T14:27:00Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it generates the edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- Graph Embeddings via Tensor Products and Approximately Orthonormal Codes [0.0]
We show that our representation falls under the bind-and-sum approach in hyperdimensional computing.
We establish some precise results characterizing the behavior of our method.
We briefly discuss its applications toward a dynamic compressed representation of large sparse graphs.
arXiv Detail & Related papers (2022-08-18T10:56:37Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
In addition, we develop an adaptive node-level pre-training method that dynamically masks nodes so that they are distributed evenly across the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Embedding Graphs on Grassmann Manifold [31.42901131602713]
This paper develops a new graph representation learning scheme, namely EGG, which embeds approximated second-order graph characteristics into a Grassmann manifold.
The effectiveness of EGG is demonstrated using both clustering and classification tasks at the node level and graph level.
arXiv Detail & Related papers (2022-05-30T12:56:24Z)
- 3D Shape Registration Using Spectral Graph Embedding and Probabilistic Matching [24.41451985857662]
We address the problem of 3D shape registration and we propose a novel technique based on spectral graph theory and probabilistic matching.
The main contribution of this chapter is to extend the spectral graph matching methods to very large graphs by combining spectral graph matching with Laplacian embedding.
arXiv Detail & Related papers (2021-06-21T15:02:31Z)
- Scaling Graph Clustering with Distributed Sketches [1.1011268090482575]
We present a method inspired by spectral clustering where we instead use matrix sketches derived from random dimension-reducing projections.
We show that our method produces embeddings that yield performant clustering results given a fully-dynamic block model stream.
We also discuss the effects of block model parameters upon the required dimensionality of the subsequent embeddings, and show how random projections could significantly improve the performance of graph clustering in distributed memory.
arXiv Detail & Related papers (2020-07-24T17:38:04Z)
- Wasserstein-based Graph Alignment [56.84964475441094]
We cast a new formulation for the one-to-many graph alignment problem, which aims at matching a node in the smaller graph with one or more nodes in the larger graph.
We show that our method leads to significant improvements with respect to the state-of-the-art algorithms for each of these tasks.
arXiv Detail & Related papers (2020-03-12T22:31:59Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximate framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
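As a concrete illustration of the related idea in "Creating generalizable downstream graph models with random projections" (referenced above), the sketch below estimates several powers of the random-walk transition matrix against a shared Gaussian projection and stacks the results as node features. The projection width, number of powers, and normalization are illustrative assumptions rather than that paper's exact recipe.

```python
import numpy as np
from scipy import sparse

def random_projection_features(A, k=64, n_powers=3, seed=0):
    """Node features from random projections of transition-matrix powers (sketch)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    deg = np.asarray(A.sum(axis=1)).ravel()
    deg[deg == 0] = 1.0                       # avoid division by zero for isolated nodes
    P = sparse.diags(1.0 / deg) @ A           # row-normalized transition matrix
    R = rng.standard_normal((n, k)) / np.sqrt(k)
    feats, X = [], R
    for _ in range(n_powers):
        X = np.asarray(P @ X)                 # P R, P^2 R, P^3 R, ...
        feats.append(X)
    return np.hstack(feats)                   # (n, k * n_powers) feature matrix

# Example: a 4-cycle
A = sparse.csr_matrix(np.array([[0, 1, 0, 1],
                                [1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [1, 0, 1, 0]], dtype=float))
print(random_projection_features(A).shape)    # (4, 192)
```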
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences arising from its use.