Graph Traversal with Tensor Functionals: A Meta-Algorithm for Scalable
Learning
- URL: http://arxiv.org/abs/2102.04350v1
- Date: Mon, 8 Feb 2021 16:52:52 GMT
- Title: Graph Traversal with Tensor Functionals: A Meta-Algorithm for Scalable
Learning
- Authors: Elan Markowitz, Keshav Balasubramanian, Mehrnoosh Mirtaheri, Sami
Abu-El-Haija, Bryan Perozzi, Greg Ver Steeg, Aram Galstyan
- Abstract summary: Graph Traversal via Tensor Functionals (GTTF) is a unifying meta-algorithm framework for embedding graph algorithms.
We show that, for a wide class of methods, our algorithm learns in an unbiased fashion and, in expectation, approximates the learning as if the specialized implementations were run directly.
- Score: 29.06880988563846
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Representation Learning (GRL) methods have impacted fields from
chemistry to social science. However, their algorithmic implementations are
specialized to specific use cases; e.g., message passing methods are run
differently from node embedding ones. Despite their apparent differences, all
these methods utilize the graph structure, and therefore, their learning can be
approximated with stochastic graph traversals. We propose Graph Traversal via
Tensor Functionals (GTTF), a unifying meta-algorithm framework for easing the
implementation of diverse graph algorithms and enabling transparent and
efficient scaling to large graphs. GTTF is founded upon a data structure
(stored as a sparse tensor) and a stochastic graph traversal algorithm
(described using tensor operations). The algorithm is a functional that accepts
two functions and can be specialized to obtain a variety of GRL models and
objectives, simply by changing those two functions. We show that, for a wide class
of methods, our algorithm learns in an unbiased fashion and, in expectation,
approximates the learning as if the specialized implementations were run
directly. With these capabilities, we scale otherwise non-scalable methods to
set state-of-the-art on large graph datasets while being more efficient than
existing GRL libraries - with only a handful of lines of code for each method
specialization. GTTF and its various GRL implementations are on:
https://github.com/isi-usc-edu/gttf.
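The core idea of a traversal functional parameterized by two callbacks can be sketched in plain Python. This is an illustrative simplification, not GTTF's actual tensor-based implementation; the names `accumulate_fn`, `bias_fn`, and `fanouts` are assumptions chosen to mirror the abstract's description of "a functional that accepts two functions":

```python
import random

def traversal_functional(adj, batch, accumulate_fn, bias_fn, fanouts):
    """Stochastic graph traversal parameterized by two callbacks
    (a hypothetical sketch of the GTTF recipe, not the library's API).

    adj: dict mapping node -> list of neighbor nodes
    batch: seed nodes to start traversals from
    accumulate_fn(path, node): records each visited node
    bias_fn(path, neighbors): sampling weights over candidate neighbors
    fanouts: neighbors to sample at each traversal depth
    """
    def walk(path):
        depth = len(path) - 1
        if depth == len(fanouts):
            return
        neighbors = adj.get(path[-1], [])
        if not neighbors:
            return
        weights = bias_fn(path, neighbors)
        k = min(fanouts[depth], len(neighbors))
        for v in random.choices(neighbors, weights=weights, k=k):
            accumulate_fn(path, v)
            walk(path + [v])

    for seed in batch:
        walk([seed])

# Specialization: uniform random walks (DeepWalk-style), collecting
# (source, context) pairs -- obtained only by choosing the two functions.
pairs = []
uniform_bias = lambda path, nbrs: [1.0] * len(nbrs)
record_pair = lambda path, v: pairs.append((path[0], v))

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
traversal_functional(adj, batch=[0, 2], accumulate_fn=record_pair,
                     bias_fn=uniform_bias, fanouts=[1, 1])
print(len(pairs))  # 4: two sampled (source, context) pairs per seed
```

Swapping `bias_fn` for a degree- or distance-dependent weighting would yield node2vec-style biased walks, which is the sense in which one functional covers many GRL objectives.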
Related papers
- GSINA: Improving Subgraph Extraction for Graph Invariant Learning via
Graph Sinkhorn Attention [52.67633391931959]
Graph invariant learning (GIL) has been an effective approach to discovering the invariant relationships between graph data and its labels.
We propose a novel graph attention mechanism called Graph Sinkhorn Attention (GSINA)
GSINA is able to obtain meaningful, differentiable invariant subgraphs with controllable sparsity and softness.
arXiv Detail & Related papers (2024-02-11T12:57:16Z) - MGNet: Learning Correspondences via Multiple Graphs [78.0117352211091]
Learning correspondences aims to find correct correspondences from the initial correspondence set with an uneven correspondence distribution and a low inlier rate.
Recent advances usually use graph neural networks (GNNs) to build a single type of graph or stack local graphs into the global one to complete the task.
We propose MGNet to effectively combine multiple complementary graphs.
arXiv Detail & Related papers (2024-01-10T07:58:44Z) - pyGSL: A Graph Structure Learning Toolkit [14.000763778781547]
pyGSL is a Python library that provides efficient implementations of state-of-the-art graph structure learning models.
pyGSL is written in GPU-friendly ways, allowing one to scale to much larger network tasks.
arXiv Detail & Related papers (2022-11-07T14:23:10Z) - Neighbor2Seq: Deep Learning on Massive Graphs by Transforming Neighbors
to Sequences [55.329402218608365]
We propose the Neighbor2Seq to transform the hierarchical neighborhood of each node into a sequence.
We evaluate our method on a massive graph with more than 111 million nodes and 1.6 billion edges.
Results show that our proposed method is scalable to massive graphs and achieves superior performance across massive and medium-scale graphs.
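The general idea of flattening a hierarchical neighborhood into a sequence can be illustrated with a hop-by-hop BFS expansion. This is a generic sketch of the concept, not Neighbor2Seq's exact ordering or pooling scheme:

```python
def neighborhood_to_seq(adj, node, num_hops):
    """Flatten a node's hop-ordered neighborhood into a flat sequence
    (an illustrative sketch of the Neighbor2Seq idea; the paper's exact
    transform is not reproduced here).

    adj: dict mapping node -> list of neighbors
    Returns nodes ordered by hop distance: hop 0, then hop 1, etc.
    """
    seq = [node]
    visited = {node}
    frontier = [node]
    for _ in range(num_hops):
        nxt = []
        for u in frontier:
            for v in sorted(adj.get(u, [])):  # deterministic order per hop
                if v not in visited:
                    visited.add(v)
                    nxt.append(v)
        seq.extend(nxt)
        frontier = nxt
    return seq

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(neighborhood_to_seq(adj, 0, 2))  # [0, 1, 2, 3]
```

Once each node is a sequence, standard sequence models (and mini-batching over nodes) apply, which is what makes the approach amenable to massive graphs.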
arXiv Detail & Related papers (2022-02-07T16:38:36Z) - Learnable Graph Matching: Incorporating Graph Partitioning with Deep
Feature Learning for Multiple Object Tracking [58.30147362745852]
Data association across frames is at the core of Multiple Object Tracking (MOT) task.
Existing methods mostly ignore the context information among tracklets and intra-frame detections.
We propose a novel learnable graph matching method to address these issues.
arXiv Detail & Related papers (2021-03-30T08:58:45Z) - DIG: A Turnkey Library for Diving into Graph Deep Learning Research [39.58666190541479]
DIG: Dive into Graphs is a research-oriented library that integrates unified implementations of common graph deep learning algorithms for several advanced tasks.
For each direction, we provide unified implementations of data interfaces, common algorithms, and evaluation metrics.
arXiv Detail & Related papers (2021-03-23T15:05:10Z) - Efficient Graph Deep Learning in TensorFlow with tf_geometric [53.237754811019464]
We introduce tf_geometric, an efficient and friendly library for graph deep learning.
tf_geometric provides kernel libraries for building Graph Neural Networks (GNNs) as well as implementations of popular GNNs.
The kernel libraries consist of infrastructures for building efficient GNNs, including graph data structures, graph map-reduce framework, graph mini-batch strategy, etc.
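The graph map-reduce pattern such kernel libraries implement efficiently can be shown in minimal form: messages are "mapped" along edges and "reduced" at destination nodes. This is a generic pure-Python illustration of the pattern, not tf_geometric's API:

```python
def message_passing_step(edges, features):
    """One gather-scatter (graph map-reduce) step: each node's new
    feature is the sum of its in-neighbors' features. A toy version of
    what GNN kernel libraries batch and run on GPU.

    edges: list of (src, dst) pairs; features: dict node -> float.
    """
    out = {v: 0.0 for v in features}
    for src, dst in edges:          # map: emit a message along each edge
        out[dst] += features[src]   # reduce: sum messages at the destination
    return out

feats = {0: 1.0, 1: 2.0, 2: 3.0}
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
print(message_passing_step(edges, feats))  # {0: 3.0, 1: 1.0, 2: 3.0}
```

Real libraries vectorize this loop with sparse tensor ops (segment sums over edge lists), which is where the GPU efficiency comes from.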
arXiv Detail & Related papers (2021-01-27T17:16:36Z) - Training Sensitivity in Graph Isomorphism Network [2.487445341407889]
Graph neural network (GNN) is a popular tool to learn the lower-dimensional representation of a graph.
This paper studies various alternative functions for a respective module using a diverse set of benchmark datasets.
arXiv Detail & Related papers (2020-08-19T03:50:28Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Universal Function Approximation on Graphs [0.0]
We produce a framework for constructing universal function approximators on graph isomorphism classes.
We show how this allows us to achieve state-of-the-art performance on four different well-known datasets.
arXiv Detail & Related papers (2020-03-14T21:12:33Z)