Neural Graph Databases
- URL: http://arxiv.org/abs/2209.09732v1
- Date: Tue, 20 Sep 2022 14:06:44 GMT
- Title: Neural Graph Databases
- Authors: Maciej Besta, Patrick Iff, Florian Scheidl, Kazuki Osawa, Nikoli
Dryden, Michal Podstawski, Tiancheng Chen, Torsten Hoefler
- Abstract summary: We show how to seamlessly combine nearly any graph neural network (GNN) model with the computational capabilities of graph databases (GDBs).
We develop LPG2vec, an encoder that transforms an arbitrary LPG dataset into a representation that can be directly used with a broad class of GNNs.
In our evaluation, we show that the rich information represented as LPG labels and properties is properly preserved by LPG2vec.
- Score: 14.711031610812967
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph databases (GDBs) enable processing and analysis of unstructured,
complex, rich, and usually vast graph datasets. Despite the large significance
of GDBs in both academia and industry, little effort has been made to
integrate them with the predictive power of graph neural networks (GNNs). In
this work, we show how to seamlessly combine nearly any GNN model with the
computational capabilities of GDBs. For this, we observe that the majority of
these systems are based on, or support, a graph data model called the Labeled
Property Graph (LPG), where vertices and edges can have arbitrarily complex
sets of labels and properties. We then develop LPG2vec, an encoder that
transforms an arbitrary LPG dataset into a representation that can be directly
used with a broad class of GNNs, including convolutional, attentional,
message-passing, and even higher-order or spectral models. In our evaluation,
we show that the rich information represented as LPG labels and properties is
properly preserved by LPG2vec, and it increases the accuracy of predictions
regardless of the targeted learning task or the GNN model used, by up to 34%
compared to graphs with no LPG labels/properties. In general, LPG2vec enables
combining the predictive power of the most powerful GNNs with the full scope of
information encoded in the LPG model, paving the way for neural graph
databases, a class of systems where the vast complexity of maintained data will
benefit from modern and future graph machine learning methods.
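To make the encoding idea concrete, the sketch below shows one plausible way to turn LPG labels and properties into GNN input features: multi-hot label vectors concatenated with numeric property values. This is a minimal illustration under assumed data layouts, not the authors' LPG2vec implementation.

```python
# Hypothetical LPG-to-feature encoding in the spirit of LPG2vec: each
# vertex's labels become a multi-hot vector and each numeric property
# becomes one feature dimension; the concatenation is the GNN input.
# A simplified sketch, not the paper's actual encoder.
import numpy as np

def encode_lpg_vertices(vertices, label_vocab, prop_keys):
    """vertices: list of dicts like {"labels": {...}, "props": {...}}."""
    feats = []
    for v in vertices:
        label_vec = np.array(
            [1.0 if lbl in v["labels"] else 0.0 for lbl in label_vocab])
        prop_vec = np.array(
            [float(v["props"].get(k, 0.0)) for k in prop_keys])
        feats.append(np.concatenate([label_vec, prop_vec]))
    return np.stack(feats)  # shape: (num_vertices, |labels| + |props|)

vertices = [
    {"labels": {"Person"}, "props": {"age": 34}},
    {"labels": {"Person", "Admin"}, "props": {"age": 51}},
]
X = encode_lpg_vertices(vertices, label_vocab=["Person", "Admin"],
                        prop_keys=["age"])
print(X)  # [[ 1.  0. 34.], [ 1.  1. 51.]]
```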
Related papers
- How do Probabilistic Graphical Models and Graph Neural Networks Look at Network Data? [0.0]
We compare Probabilistic Graphical Models (PGMs) and Graph Neural Networks (GNNs).
We find that PGMs are more robust than GNNs when the heterophily of the graph is increased.
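Heterophily is commonly quantified via edge homophily, the fraction of edges whose endpoints share a label; low values mean a heterophilous graph. A minimal sketch of that standard measure (an illustration; the paper may quantify heterophily differently):

```python
# Edge homophily: fraction of edges whose endpoints share a label.
# Heterophily is high when this value is low. Illustrative only.

def edge_homophily(edges, labels):
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
labels = [0, 0, 1, 1]
print(edge_homophily(edges, labels))  # 0.5 -> moderately heterophilous
```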
arXiv Detail & Related papers (2025-06-13T15:19:28Z)
- GLISP: A Scalable GNN Learning System by Exploiting Inherent Structural Properties of Graphs [5.410321469222541]
We propose GLISP, a sampling based GNN learning system for industrial scale graphs.
GLISP consists of three core components: a graph partitioner, a graph sampling service, and a graph inference engine.
Experiments show that GLISP achieves up to $6.53\times$ and $70.77\times$ speedups over existing GNN systems for training and inference tasks.
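Sampling-based GNN systems such as GLISP typically rest on mini-batch neighbor sampling. Below is a generic, GraphSAGE-style uniform sampler as a sketch of that building block; the function name and dict-based adjacency are assumptions, not GLISP's API.

```python
# GraphSAGE-style uniform neighbor sampling, the generic technique that
# sampling-based systems build on (a sketch, not GLISP's actual sampler).
import random

random.seed(0)  # reproducible sampling for the demo

def sample_neighbors(adj, seeds, fanout):
    """adj: dict node -> list of neighbors; returns sampled edge list."""
    sampled = []
    for s in seeds:
        nbrs = adj.get(s, [])
        k = min(fanout, len(nbrs))
        for n in random.sample(nbrs, k):
            sampled.append((s, n))
    return sampled

adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}
print(sample_neighbors(adj, seeds=[0, 2], fanout=2))
```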
arXiv Detail & Related papers (2024-01-06T02:59:24Z)
- Global Minima, Recoverability Thresholds, and Higher-Order Structure in GNNs [0.0]
We analyze the performance of graph neural network (GNN) architectures from the perspective of random graph theory.
We show how both specific higher-order structures in synthetic data and the mix of empirical structures in real data have dramatic effects on GNN performance.
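As a concrete example of a higher-order structure, the number of triangles in a simple undirected graph can be read off the adjacency matrix as $\mathrm{trace}(A^3)/6$. The snippet below illustrates that statistic; it is a generic example, not taken from the paper.

```python
# One concrete "higher-order structure": the triangle count of a graph,
# computed as trace(A^3)/6 for a simple undirected adjacency matrix A.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]])
triangles = np.trace(np.linalg.matrix_power(A, 3)) / 6
print(int(triangles))  # 1 triangle: nodes 0-1-2
```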
arXiv Detail & Related papers (2023-10-11T17:16:33Z)
- Examining the Effects of Degree Distribution and Homophily in Graph Learning Models [19.060710813929354]
GraphWorld is a solution that generates diverse populations of synthetic graphs for benchmarking any GNN task.
Despite its success, the Stochastic Block Model (SBM) imposed fundamental limitations on the kinds of graph structure GraphWorld could create.
In this work we examine how two additional synthetic graph generators can improve GraphWorld's evaluation.
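For reference, GraphWorld's original generator is the SBM, in which edge probability depends only on the endpoints' block memberships. A minimal SBM sampler (a sketch, not GraphWorld's code):

```python
# Minimal stochastic block model (SBM) sampler: nodes fall into blocks,
# and edge probability depends only on the endpoints' blocks.
import numpy as np

def sample_sbm(block_sizes, p_matrix, seed=0):
    """p_matrix[a][b] is the edge probability between blocks a and b."""
    rng = np.random.default_rng(seed)
    n = sum(block_sizes)
    block_of = np.repeat(np.arange(len(block_sizes)), block_sizes)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_matrix[block_of[i]][block_of[j]]:
                A[i, j] = A[j, i] = 1
    return A, block_of

A, blocks = sample_sbm([3, 3], [[0.9, 0.1], [0.1, 0.9]])
print(A.sum() // 2, "edges; blocks:", blocks)
```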
arXiv Detail & Related papers (2023-07-17T22:35:46Z)
- IGB: Addressing The Gaps In Labeling, Features, Heterogeneity, and Size of Public Graph Datasets for Deep Learning Research [14.191338008898963]
Graph neural networks (GNNs) have shown high potential for a variety of real-world, challenging applications.
One of the major obstacles in GNN research is the lack of large-scale flexible datasets.
We introduce the Illinois Graph Benchmark (IGB), a research dataset tool that developers can use to train, scrutinize, and evaluate GNN models.
arXiv Detail & Related papers (2023-02-27T05:21:35Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Graph Generative Model for Benchmarking Graph Neural Networks [73.11514658000547]
We introduce a novel graph generative model that learns and reproduces the distribution of real-world graphs in a privacy-controlled way.
Our model can successfully generate privacy-controlled, synthetic substitutes of large-scale real-world graphs that can be effectively used to benchmark GNN models.
arXiv Detail & Related papers (2022-07-10T06:42:02Z)
- A Unified Lottery Ticket Hypothesis for Graph Neural Networks [82.31087406264437]
We present a unified GNN sparsification (UGS) framework that simultaneously prunes the graph adjacency matrix and the model weights.
We further generalize the popular lottery ticket hypothesis to GNNs for the first time, by defining a graph lottery ticket (GLT) as a pair of core sub-dataset and sparse sub-network.
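The joint sparsification idea can be pictured with simple magnitude masks over both the adjacency matrix and the layer weights. The sketch below shows only that masking step; UGS itself learns differentiable masks end-to-end rather than thresholding once.

```python
# Magnitude-based joint sparsification in the spirit of UGS: keep the
# largest-magnitude entries of both the (weighted) adjacency matrix and
# the model weights, zeroing the rest. A sketch of the masking step only.
import numpy as np

def magnitude_mask(M, sparsity):
    """Zero out roughly the smallest `sparsity` fraction of |entries|."""
    flat = np.abs(M).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return np.ones_like(M)
    thresh = np.partition(flat, k - 1)[k - 1]
    return (np.abs(M) > thresh).astype(M.dtype)

rng = np.random.default_rng(0)
A = rng.random((5, 5))        # weighted adjacency matrix
W = rng.normal(size=(8, 4))   # GNN layer weights
A_sparse = A * magnitude_mask(A, sparsity=0.5)
W_sparse = W * magnitude_mask(W, sparsity=0.5)
print((A_sparse == 0).mean(), (W_sparse == 0).mean())
```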
arXiv Detail & Related papers (2021-02-12T21:52:43Z)
- GPT-GNN: Generative Pre-Training of Graph Neural Networks [93.35945182085948]
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data.
We present the GPT-GNN framework to initialize GNNs by generative pre-training.
We show that GPT-GNN significantly outperforms state-of-the-art GNN models without pre-training by up to 9.1% across various downstream tasks.
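Generative pre-training of this kind hides part of the input graph (attributes or edges) and trains the GNN to regenerate it. The toy loss below conveys the masked-attribute flavor of such an objective; it is a hedged sketch, not GPT-GNN's actual attribute- and edge-generation tasks.

```python
# Toy masked-attribute reconstruction loss: hide some attribute entries
# and penalize reconstruction error only on the hidden ones. A sketch of
# the generative pre-training flavor, not the GPT-GNN implementation.
import numpy as np

def masked_attribute_loss(X, X_pred, mask):
    """Mean squared error on masked-out attribute entries only."""
    diff = (X - X_pred) * mask
    return (diff ** 2).sum() / max(mask.sum(), 1)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))        # true node attributes
mask = rng.random((4, 3)) < 0.3    # entries hidden from the model
X_pred = rng.normal(size=(4, 3))   # model's reconstruction
print(masked_attribute_loss(X, X_pred, mask))
```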
arXiv Detail & Related papers (2020-06-27T20:12:33Z)
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs' ability to preserve graph structures.
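The plug-in idea can be approximated as: compute the leading eigenvectors of the adjacency matrix and append them to the node features, giving later GNN layers explicit structural coordinates. A sketch under that reading (not the authors' code):

```python
# Eigen-GNN in miniature: append the top-k eigenvectors of the adjacency
# matrix to the node features so downstream GNN layers see explicit
# structural coordinates. A sketch, not the authors' implementation.
import numpy as np

def eigen_augment(A, X, k=2):
    vals, vecs = np.linalg.eigh(A.astype(float))  # symmetric A assumed
    order = np.argsort(-np.abs(vals))             # largest |eigenvalue| first
    return np.concatenate([X, vecs[:, order[:k]]], axis=1)

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])
X = np.eye(4)  # trivial input features for the demo
print(eigen_augment(A, X, k=2).shape)  # (4, 6)
```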
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
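The permutation-equivariance claim can be checked numerically: for a polynomial graph filter $H(A)x = \sum_k h_k A^k x$, relabeling nodes with a permutation matrix $P$ satisfies $H(PAP^\top)(Px) = P\,H(A)x$. A small verification sketch:

```python
# Numerical check of the permutation equivariance of polynomial graph
# convolutional filters: relabeling the nodes and filtering commute.
import numpy as np

def graph_filter(A, x, coeffs):
    """Apply H(A)x = sum_k coeffs[k] * A^k x."""
    out, Ak = np.zeros_like(x), np.eye(len(x))
    for h in coeffs:
        out = out + h * (Ak @ x)
        Ak = Ak @ A
    return out

rng = np.random.default_rng(0)
A = rng.random((5, 5)); A = (A + A.T) / 2   # symmetric "graph shift"
x = rng.normal(size=5)
P = np.eye(5)[rng.permutation(5)]           # permutation matrix
coeffs = [0.5, 0.3, 0.2]
lhs = graph_filter(P @ A @ P.T, P @ x, coeffs)
rhs = P @ graph_filter(A, x, coeffs)
print(np.allclose(lhs, rhs))  # True
```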
arXiv Detail & Related papers (2020-03-08T13:02:15Z)