Implementing graph neural networks with TensorFlow-Keras
- URL: http://arxiv.org/abs/2103.04318v1
- Date: Sun, 7 Mar 2021 10:46:02 GMT
- Title: Implementing graph neural networks with TensorFlow-Keras
- Authors: Patrick Reiser, Andre Eberhard and Pascal Friederich
- Abstract summary: Graph neural networks are a versatile machine learning architecture that received a lot of attention recently.
In this technical report, we present an implementation of convolution and pooling layers for TensorFlow-Keras models.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Graph neural networks are a versatile machine learning architecture that
received a lot of attention recently. In this technical report, we present an
implementation of convolution and pooling layers for TensorFlow-Keras models,
which allows a seamless and flexible integration into standard Keras layers to
set up graph models in a functional way. This implies the usage of mini-batches
as the first tensor dimension, which can be realized via the new RaggedTensor
class of TensorFlow best suited for graphs. We developed the Keras Graph
Convolutional Neural Network Python package kgcnn based on TensorFlow-Keras
that provides a set of Keras layers for graph networks which focus on a
transparent tensor structure passed between layers and an ease-of-use mindset.
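The abstract's key design point, putting the mini-batch as the first tensor dimension via `RaggedTensor`, can be illustrated with plain TensorFlow (a minimal sketch of the general pattern, not the actual kgcnn API; all names here are illustrative):

```python
import tensorflow as tf

# Two graphs with 3 and 2 nodes respectively; each node has a 4-dim feature
# vector. The ragged (None) axis holds the variable node count per graph,
# so the tensor shape is (batch, None, features).
node_features = tf.ragged.constant(
    [[[0.1, 0.2, 0.3, 0.4],
      [0.5, 0.6, 0.7, 0.8],
      [0.9, 1.0, 1.1, 1.2]],
     [[1.3, 1.4, 1.5, 1.6],
      [1.7, 1.8, 1.9, 2.0]]],
    ragged_rank=1, inner_shape=(4,))

# A node-wise Keras layer can be applied to the flat node list and the
# result re-wrapped into the same ragged batch structure.
flat_nodes = node_features.values                 # shape (5, 4)
hidden = tf.keras.layers.Dense(8)(flat_nodes)     # shape (5, 8)
out = node_features.with_values(hidden)           # shape (2, None, 8)
```

Because the batch axis comes first, such ragged tensors plug into the standard Keras functional API alongside ordinary dense inputs.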
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network, that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z) - Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
We introduce several key design strategies for graph neural networks.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN)
We show that the proposed model outperforms other state of the art GNN models and achieves up to 64% improvements in accuracy on node classification tasks.
arXiv Detail & Related papers (2021-05-17T06:46:01Z) - Efficient Graph Deep Learning in TensorFlow with tf_geometric [53.237754811019464]
We introduce tf_geometric, an efficient and friendly library for graph deep learning.
tf_geometric provides kernel libraries for building Graph Neural Networks (GNNs) as well as implementations of popular GNNs.
The kernel libraries consist of infrastructures for building efficient GNNs, including graph data structures, graph map-reduce framework, graph mini-batch strategy, etc.
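The "graph map-reduce" idea mentioned above is the common gather/scatter pattern behind most GNN kernels. A generic sketch in plain TensorFlow (not tf_geometric's actual API) looks like this:

```python
import tensorflow as tf

# Toy graph: 3 nodes with scalar features, 4 directed edges.
node_feat = tf.constant([[1.0], [2.0], [3.0]])
edge_src = tf.constant([0, 1, 2, 1])  # sender node of each edge
edge_dst = tf.constant([1, 2, 0, 0])  # receiver node of each edge

# "Map": gather the sender's features along each edge as a message.
messages = tf.gather(node_feat, edge_src)          # shape (4, 1)

# "Reduce": sum the messages arriving at each receiver node.
aggregated = tf.math.unsorted_segment_sum(
    messages, edge_dst, num_segments=tf.shape(node_feat)[0])
# Node 0 receives from nodes 2 and 1: 3.0 + 2.0 = 5.0, and so on.
```

A full GNN layer would follow the reduce step with a learned update (e.g. a Dense layer on the aggregated messages), but the gather/segment-sum core is the part these kernel libraries optimize.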
arXiv Detail & Related papers (2021-01-27T17:16:36Z) - TensorBNN: Bayesian Inference for Neural Networks using Tensorflow [0.0]
TensorBNN is a new package based on TensorFlow that implements Bayesian inference for neural network models.
The posterior density of neural network model parameters is represented as a point cloud sampled using Hamiltonian Monte Carlo.
arXiv Detail & Related papers (2020-09-30T02:20:53Z) - Graph Neural Networks in TensorFlow and Keras with Spektral [18.493394650508044]
Spektral is an open-source Python library for building graph neural networks.
It implements a large set of methods for deep learning on graphs, including message-passing and pooling operators.
It is suitable for absolute beginners and expert deep learning practitioners alike.
arXiv Detail & Related papers (2020-06-22T10:56:22Z) - Progressive Graph Convolutional Networks for Semi-Supervised Node Classification [97.14064057840089]
Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
We propose a method to automatically build compact and task-specific graph convolutional networks.
arXiv Detail & Related papers (2020-03-27T08:32:16Z) - Convolutional Kernel Networks for Graph-Structured Data [37.13712126432493]
We introduce a family of multilayer graph kernels and establish new links between graph convolutional neural networks and kernel methods.
Our approach generalizes convolutional kernel networks to graph-structured data, by representing graphs as a sequence of kernel feature maps.
Our model can also be trained end-to-end on large-scale data, leading to new types of graph convolutional neural networks.
arXiv Detail & Related papers (2020-03-11T09:44:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.