pyGSL: A Graph Structure Learning Toolkit
- URL: http://arxiv.org/abs/2211.03583v1
- Date: Mon, 7 Nov 2022 14:23:10 GMT
- Title: pyGSL: A Graph Structure Learning Toolkit
- Authors: Max Wasserman, Gonzalo Mateos
- Abstract summary: pyGSL is a Python library that provides efficient implementations of state-of-the-art graph structure learning models.
pyGSL is written in GPU-friendly ways, allowing one to scale to much larger network tasks.
- Score: 14.000763778781547
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce pyGSL, a Python library that provides efficient implementations
of state-of-the-art graph structure learning models along with diverse datasets
to evaluate them on. The implementations are written in GPU-friendly ways,
allowing one to scale to much larger network tasks. A common interface is
introduced for algorithm unrolling methods, unifying implementations of recent
state-of-the-art techniques and allowing new methods to be quickly developed by
avoiding the need to rebuild the underlying unrolling infrastructure.
Implementations of differentiable graph structure learning models are written
in PyTorch, allowing us to leverage the rich software ecosystem that exists
around, e.g., logging, hyperparameter search, and GPU communication. This also
makes it easy to incorporate these models as components in larger gradient
based learning systems where differentiable estimates of graph structure may be
useful, e.g. in latent graph learning. Diverse datasets and performance metrics
allow consistent comparisons across models in this fast-growing field. The full
code repository can be found on https://github.com/maxwass/pyGSL.
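The algorithm-unrolling idea behind the common interface can be illustrated with a minimal sketch: each "layer" of the network is one iteration of a classical optimization step for graph learning, and the per-layer parameters become learnable. The following is an illustrative NumPy toy (function names, the objective, and all parameters are hypothetical; this is not pyGSL's actual API, whose implementations are in PyTorch):

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def unrolled_graph_learner(S, num_layers=10, step=0.5, tau=0.1):
    """Hypothetical unrolled proximal-gradient graph learner (a sketch).

    S: (n, n) node-similarity matrix. Each 'layer' performs one
    proximal-gradient step on a toy objective (reward edges between
    similar nodes, l1 sparsity, quadratic penalty). In unrolling
    methods, per-layer quantities like `step` and `tau` are the
    parameters learned end-to-end.
    """
    W = np.zeros_like(S)
    for _ in range(num_layers):
        W = soft_threshold((1.0 - step) * W + step * S, step * tau)
        W = np.maximum((W + W.T) / 2.0, 0.0)  # symmetric, nonnegative weights
        np.fill_diagonal(W, 0.0)              # no self-loops
    return W
```

On a similarity matrix with one strong pair and weak noise entries, the strong edge survives while entries below the threshold are driven exactly to zero, yielding a sparse learned graph.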
Related papers
- GraphStorm: all-in-one graph machine learning framework for industry applications [75.23076561638348]
GraphStorm is an end-to-end solution for scalable graph construction, graph model training and inference.
Every component in GraphStorm can operate on graphs with billions of nodes and can scale model training and inference to different hardware without changing any code.
GraphStorm has been used and deployed for over a dozen billion-scale industry applications after its release in May 2023.
arXiv Detail & Related papers (2024-06-10T04:56:16Z)
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- UGSL: A Unified Framework for Benchmarking Graph Structure Learning [19.936173198345053]
We propose a benchmarking strategy for graph structure learning using a unified framework.
Our framework, called Unified Graph Structure Learning (UGSL), reformulates existing models into a single model.
Our results provide a clear and concise understanding of the different methods in this area as well as their strengths and weaknesses.
arXiv Detail & Related papers (2023-08-21T14:05:21Z)
- A Framework for Large Scale Synthetic Graph Dataset Generation [2.248608623448951]
This work proposes a scalable synthetic graph generation tool to scale the datasets to production-size graphs.
The tool learns a series of parametric models from proprietary datasets that can be released to researchers.
We demonstrate the generalizability of the framework across a series of datasets.
arXiv Detail & Related papers (2022-10-04T22:41:33Z)
- Condensing Graphs via One-Step Gradient Matching [50.07587238142548]
We propose a one-step gradient matching scheme, which performs gradient matching for only a single step without training the network weights.
Our theoretical analysis shows this strategy can generate synthetic graphs that lead to lower classification loss on real graphs.
In particular, we are able to reduce the dataset size by 90% while approximating up to 98% of the original performance.
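The one-step idea can be illustrated on a toy linear-regression condensation problem: optimize a small synthetic set so that its weight-gradient at a fixed random initialization matches the gradient computed on the full dataset, with no inner training loop. This is a sketch of the principle, not the paper's GNN implementation; all names and sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" dataset: 100 linear-regression samples.
X_real = rng.normal(size=(100, 5))
y_real = X_real @ rng.normal(size=5)

def loss_grad(X, y, w):
    """Gradient of the mean-squared-error loss w.r.t. model weights w."""
    return 2.0 * X.T @ (X @ w - y) / len(X)

# Condensed dataset: 10 samples whose labels are optimized so that the
# weight-gradient at a FIXED random init w0 matches the real gradient
# (one-step matching: w0 itself is never trained).
X_syn = rng.normal(size=(10, 5))
y_syn = rng.normal(size=10)
w0 = rng.normal(size=5)

g_real = loss_grad(X_real, y_real, w0)
init_gap = np.linalg.norm(loss_grad(X_syn, y_syn, w0) - g_real)
for _ in range(5000):
    diff = loss_grad(X_syn, y_syn, w0) - g_real
    # g_syn is affine in y_syn with d(g_syn)/d(y_syn) = -2 X_syn^T / n,
    # so the gradient of ||diff||^2 w.r.t. y_syn is -(4/n) X_syn @ diff.
    y_syn -= 0.1 * (-4.0 / len(X_syn)) * (X_syn @ diff)
final_gap = np.linalg.norm(loss_grad(X_syn, y_syn, w0) - g_real)
```

After optimization, gradient descent on the condensed set from `w0` moves the weights in (approximately) the same direction as training on the full dataset would, which is what makes the condensed graphs useful stand-ins.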
arXiv Detail & Related papers (2022-06-15T18:20:01Z)
- Benchpress: A Scalable and Versatile Workflow for Benchmarking Structure Learning Algorithms [1.7188280334580197]
Probabilistic graphical models are one common approach to modelling the data generating mechanism.
We present a novel Snakemake workflow called Benchpress for producing scalable, reproducible, and platform-independent benchmarks.
We demonstrate the applicability of this workflow for learning Bayesian networks in five typical data scenarios.
arXiv Detail & Related papers (2021-07-08T14:19:28Z)
- DIG: A Turnkey Library for Diving into Graph Deep Learning Research [39.58666190541479]
DIG: Dive into Graphs is a research-oriented library that integrates unified implementations of common graph deep learning algorithms for several advanced tasks.
For each direction, we provide unified implementations of data interfaces, common algorithms, and evaluation metrics.
arXiv Detail & Related papers (2021-03-23T15:05:10Z)
- Graph Traversal with Tensor Functionals: A Meta-Algorithm for Scalable Learning [29.06880988563846]
Graph Traversal via Functionals (GTTF) is a unifying meta-algorithm framework for embedding graph algorithms.
We show that, for a wide class of methods, our framework learns in an unbiased fashion and, in expectation, approximates the learning as if the specialized implementations were run directly.
arXiv Detail & Related papers (2021-02-08T16:52:52Z)
- Efficient Graph Deep Learning in TensorFlow with tf_geometric [53.237754811019464]
We introduce tf_geometric, an efficient and friendly library for graph deep learning.
tf_geometric provides kernel libraries for building Graph Neural Networks (GNNs) as well as implementations of popular GNNs.
The kernel libraries consist of infrastructures for building efficient GNNs, including graph data structures, graph map-reduce framework, graph mini-batch strategy, etc.
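The "graph map-reduce" pattern such kernel libraries are built on can be sketched generically: map node features along edges into messages, then reduce (e.g., sum) the messages at each destination node. The following NumPy toy illustrates the pattern only; it is not tf_geometric's actual API:

```python
import numpy as np

def aggregate_neighbors(edge_index, x):
    """Generic map-reduce message passing (illustrative sketch).

    edge_index: (2, num_edges) array of (src, dst) node indices.
    x: (num_nodes, feat_dim) node features.
    Map: each edge carries the source node's features as its message.
    Reduce: messages arriving at the same destination are summed.
    """
    src, dst = edge_index
    out = np.zeros_like(x)
    np.add.at(out, dst, x[src])  # unbuffered scatter-add of messages
    return out

# Tiny directed graph: 0->1, 0->2, 1->2.
edge_index = np.array([[0, 0, 1],
                       [1, 2, 2]])
x = np.array([[1.0], [2.0], [3.0]])
agg = aggregate_neighbors(edge_index, x)
# Node 1 receives x[0]; node 2 receives x[0] + x[1]; node 0 receives nothing.
```

A GNN layer then typically transforms `agg` (and `x`) with learned weights; batching strategies decide how many such edges are processed per kernel call.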
arXiv Detail & Related papers (2021-01-27T17:16:36Z)
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
- Torch-Struct: Deep Structured Prediction Library [138.5262350501951]
We introduce Torch-Struct, a library for structured prediction.
Torch-Struct includes a broad collection of probabilistic structures accessed through a simple and flexible distribution-based API.
arXiv Detail & Related papers (2020-02-03T16:43:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.