Graph Embeddings via Tensor Products and Approximately Orthonormal Codes
- URL: http://arxiv.org/abs/2208.10917v5
- Date: Sat, 3 Jun 2023 23:35:18 GMT
- Title: Graph Embeddings via Tensor Products and Approximately Orthonormal Codes
- Authors: Frank Qiu
- Abstract summary: We show that our representation falls under the bind-and-sum approach in hyperdimensional computing.
We establish some precise results characterizing the behavior of our method.
We briefly discuss its applications toward a dynamic compressed representation of large sparse graphs.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a dynamic graph representation method, showcasing its rich
representational capacity and establishing some of its theoretical properties.
Our representation falls under the bind-and-sum approach in hyperdimensional
computing (HDC), and we show that the tensor product is the most general
binding operation that respects the superposition principle employed in HDC. We
also establish some precise results characterizing the behavior of our method,
including a memory vs. size analysis of how our representation's size must
scale with the number of edges in order to retain accurate graph operations.
True to its HDC roots, we also compare our graph representation to another
typical HDC representation, the Hadamard-Rademacher scheme, showing that these
two graph representations have the same memory-capacity scaling. We establish a
link to adjacency matrices, showing that our method is a pseudo-orthogonal
generalization of adjacency matrices. In light of this, we briefly discuss its
applications toward a dynamic compressed representation of large sparse graphs.
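The bind-and-sum construction described in the abstract can be sketched in a few lines: assign each node a random, approximately orthonormal code, bind each edge with the tensor (outer) product of its endpoint codes, and superpose the bound edges by summation. The dimensions, threshold, and helper names below are illustrative choices for a toy demonstration, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, d = 50, 1024  # larger d makes random codes closer to orthonormal

# Approximately orthonormal codes: random unit Gaussian vectors in R^d
codes = rng.normal(size=(n_nodes, d))
codes /= np.linalg.norm(codes, axis=1, keepdims=True)

# Bind each edge (i, j) as the tensor (outer) product of its endpoint
# codes, and superpose all bound edges into a single d x d matrix.
edges = [(0, 1), (2, 3), (4, 5)]
G = np.zeros((d, d))
for i, j in edges:
    G += np.outer(codes[i], codes[j])

# Edge query: c_i^T G c_j is close to 1 for stored edges and close to 0
# otherwise, since distinct codes are nearly orthogonal.
def has_edge(i, j, thresh=0.5):
    return codes[i] @ G @ codes[j] > thresh
```

Note that if the codes were exactly one-hot (hence exactly orthonormal), `G` would be precisely the adjacency matrix, which illustrates the abstract's point that the method generalizes adjacency matrices to pseudo-orthogonal codes.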
Related papers
- Graph-Dictionary Signal Model for Sparse Representations of Multivariate Data [49.77103348208835]
We define a novel Graph-Dictionary signal model, where a finite set of graphs characterizes relationships in the data distribution through a weighted sum of their Laplacians.
We propose a framework to infer the graph dictionary representation from observed data, along with a bilinear generalization of the primal-dual splitting algorithm to solve the learning problem.
We exploit graph-dictionary representations in a motor imagery decoding task on brain activity data, where we classify imagined motion better than standard methods.
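The weighted-sum-of-Laplacians idea in the summary above can be illustrated with made-up toy graphs and weights (in the actual framework, both the dictionary and the weights are inferred from data):

```python
import numpy as np

def laplacian(A):
    """Combinatorial Laplacian L = D - A of an adjacency matrix."""
    return np.diag(A.sum(axis=1)) - A

# Two toy dictionary graphs on 3 nodes (symmetric adjacency matrices)
A1 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
A2 = np.array([[0, 0, 1], [0, 0, 0], [1, 0, 0]], dtype=float)
w = np.array([0.7, 0.3])  # hypothetical weights, learned in practice

# Effective operator: weighted sum of the dictionary Laplacians
L = w[0] * laplacian(A1) + w[1] * laplacian(A2)
```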
arXiv Detail & Related papers (2024-11-08T17:40:43Z) - Graph Edge Representation via Tensor Product Graph Convolutional Representation [23.021660625582854]
This paper defines an effective convolution operator on graphs with edge features, named Tensor Product Graph Convolution (TPGC).
It provides a complementary model to traditional graph convolutions (GCs) to address the more general graph data analysis with both node and edge features.
Experimental results on several graph learning tasks demonstrate the effectiveness of the proposed TPGC.
arXiv Detail & Related papers (2024-06-21T03:21:26Z) - Encoder Embedding for General Graph and Node Classification [4.178980693837599]
We prove that the encoder embedding matrices satisfy the law of large numbers and the central limit theorem on a per-observation basis.
Under certain conditions, they achieve normality on a per-class basis, enabling optimal classification through discriminant analysis.
arXiv Detail & Related papers (2024-05-24T11:51:08Z) - OrthoReg: Improving Graph-regularized MLPs via Orthogonality
Regularization [66.30021126251725]
Graph Neural Networks (GNNs) are currently dominant in modeling graph-structured data.
Graph-regularized MLPs (GR-MLPs) implicitly inject the graph structure information into model weights, but their performance can hardly match that of GNNs in most tasks.
We show that GR-MLPs suffer from dimensional collapse, a phenomenon in which the largest few eigenvalues dominate the embedding space.
We propose OrthoReg, a novel GR-MLP model to mitigate the dimensional collapse issue.
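Dimensional collapse, as described above, can be diagnosed by checking how concentrated the spectrum of the embedding covariance is. The toy check below uses synthetic embeddings with one artificially dominant direction; it is not OrthoReg itself, only an illustration of the phenomenon.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical embeddings where one direction dominates (collapsed)
Z = rng.normal(size=(500, 32))
Z[:, 0] *= 20.0  # inflate a single direction to simulate collapse

# Eigenvalue spectrum of the embedding covariance, largest first
cov = np.cov(Z, rowvar=False)
eig = np.sort(np.linalg.eigvalsh(cov))[::-1]

# Fraction of total variance captured by the top eigenvalue; close to 1
# indicates the embedding space has effectively collapsed.
top1_ratio = eig[0] / eig.sum()
```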
arXiv Detail & Related papers (2023-01-31T21:20:48Z) - Graphon Pooling for Reducing Dimensionality of Signals and Convolutional
Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]^2 in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z) - Graph Polynomial Convolution Models for Node Classification of
Non-Homophilous Graphs [52.52570805621925]
We investigate efficient learning from higher-order graph convolution and learning directly from the adjacency matrix for node classification.
We show that the resulting model leads to new graphs and a residual scaling parameter.
We demonstrate that the proposed methods achieve improved accuracy for node classification on non-homophilous graphs.
arXiv Detail & Related papers (2022-09-12T04:46:55Z) - Generalized Spectral Clustering for Directed and Undirected Graphs [4.286327408435937]
We present a generalized spectral clustering framework that can address both directed and undirected graphs.
Our approach is based on the spectral relaxation of a new functional that we introduce as the generalized Dirichlet energy of a graph function.
We also propose a practical parametrization of the regularizing measure constructed from the iterated powers of the natural random walk on the graph.
arXiv Detail & Related papers (2022-03-07T09:18:42Z) - Directed Graph Embeddings in Pseudo-Riemannian Manifolds [0.0]
We show that general directed graphs can be effectively represented by an embedding model that combines three components.
We demonstrate the representational capabilities of this method by applying it to the task of link prediction.
arXiv Detail & Related papers (2021-06-16T10:31:37Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.