A Convolutional Neural Network into graph space
- URL: http://arxiv.org/abs/2002.09285v3
- Date: Mon, 2 Oct 2023 09:53:36 GMT
- Title: A Convolutional Neural Network into graph space
- Authors: Chloé Martineau, Romain Raveaux, Donatello Conte, Gilles Venturini
- Abstract summary: We propose a new convolutional neural network architecture, defined directly in graph space.
We show its usability in a back-propagation context.
It is robust to graph-domain changes and improves on other Euclidean and non-Euclidean convolutional architectures.
- Score: 5.6326241162252755
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional neural networks (CNNs) have, within a few decades, outperformed the
previous state-of-the-art methods in classification. However, as originally
formalised, CNNs are bound to operate on Euclidean spaces: convolution is a
signal operation defined on Euclidean domains. This has restricted the main use
of deep learning to Euclidean data such as sound or images. Yet numerous
application fields (among them network analysis, computational social science,
chemo-informatics and computer graphics) produce non-Euclidean data such as
graphs, networks or manifolds. In this paper we propose a new convolutional
neural network architecture, defined directly in graph space: both the
convolution and pooling operators are defined in the graph domain. We show its
usability in a back-propagation context. Experimental results show that our
model's performance is at state-of-the-art level on simple tasks, that it is
robust to graph-domain changes, and that it improves on other Euclidean and
non-Euclidean convolutional architectures.
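To make the setting concrete, here is a minimal sketch of what convolution and pooling defined directly on a graph can look like, using plain neighbourhood aggregation and norm-based node selection. This illustrates the general idea only, not the paper's actual operators; all function and parameter names (`graph_conv`, `graph_pool`, `W_self`, `W_neigh`) are ours.

```python
import numpy as np

def graph_conv(A, X, W_self, W_neigh):
    """One graph-domain convolution: each node combines its own features
    with the mean of its neighbours' features, through learnable weights.

    A: (n, n) binary adjacency matrix, X: (n, d) node features,
    W_self, W_neigh: (d, d_out) weight matrices.
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1)   # avoid divide-by-zero
    neigh_mean = (A @ X) / deg                       # mean over neighbours
    return np.maximum(X @ W_self + neigh_mean @ W_neigh, 0.0)  # ReLU

def graph_pool(A, X, keep):
    """Naive graph pooling: keep the `keep` nodes with the largest feature
    norm and restrict the adjacency matrix to them."""
    idx = np.argsort(-np.linalg.norm(X, axis=1))[:keep]
    return A[np.ix_(idx, idx)], X[idx]

rng = np.random.default_rng(0)
A = (rng.random((6, 6)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                       # symmetric, no self-loops
X = rng.standard_normal((6, 3))
H = graph_conv(A, X, rng.standard_normal((3, 4)), rng.standard_normal((3, 4)))
A2, H2 = graph_pool(A, H, keep=3)
print(H2.shape)  # (3, 4)
```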
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
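A minimal sketch of the calibration idea, under the simplifying assumption that the learned permutation is a hard sort of node scores (CoCN itself uses a differentiable construction); `score_w` and `w` are illustrative parameters, not CoCN's actual ones:

```python
import numpy as np

def calibrate_and_convolve(X, w, score_w):
    """Rank nodes with a learned score, permute features into that
    canonical order, then run an ordinary 1-D Euclidean convolution
    over the resulting node sequence."""
    order = np.argsort(X @ score_w)           # hard permutation from scores
    Xs = X[order]                             # calibrated node sequence
    k = len(w)
    # valid 1-D convolution over the node axis, per feature channel
    out = np.stack([sum(w[j] * Xs[i + j] for j in range(k))
                    for i in range(len(Xs) - k + 1)])
    return out

X = np.random.default_rng(1).standard_normal((8, 3))
print(calibrate_and_convolve(X, w=np.array([0.25, 0.5, 0.25]),
                             score_w=np.ones(3)).shape)  # (6, 3)
```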
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Hyperbolic Convolutional Neural Networks [14.35618845900589]
Using non-Euclidean space for embedding data might result in more robust and explainable models.
We hypothesize that the ability of hyperbolic space to capture hierarchy in the data would lead to better performance.
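For reference, this is the geodesic distance in the Poincaré ball model of hyperbolic space, the standard formula such models build on (independent of any particular hyperbolic-CNN implementation):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance in the Poincare ball model of hyperbolic space.
    Points must satisfy ||x|| < 1."""
    su, sv = np.sum(u * u), np.sum(v * v)
    num = np.sum((u - v) ** 2)
    x = 1.0 + 2.0 * num / ((1.0 - su) * (1.0 - sv) + eps)
    return np.arccosh(x)

# Points near the boundary are exponentially far apart: room for trees.
print(poincare_distance(np.array([0.0, 0.0]), np.array([0.9, 0.0])))  # ~2.94
print(poincare_distance(np.array([0.9, 0.0]), np.array([0.0, 0.9])))  # large
```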
arXiv Detail & Related papers (2023-08-29T21:20:16Z)
- Effects of Data Geometry in Early Deep Learning [16.967930721746672]
Deep neural networks can approximate functions on different types of data, from images to graphs, with varied underlying structure.
We study how a randomly initialized neural network with piecewise-linear activations splits the data manifold into regions where the network behaves as a linear function.
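A small illustration of that splitting, assuming a two-layer random ReLU network: each distinct activation pattern corresponds to one region on which the network is linear, and we can count the regions crossed by a segment of the input manifold. The architecture and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((16, 2)), rng.standard_normal(16)
W2, b2 = rng.standard_normal((16, 16)), rng.standard_normal(16)

def activation_pattern(x):
    """Which ReLUs fire for input x; each distinct pattern is one linear
    region of the piecewise-linear network."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

# Walk along a segment of the input manifold and count region changes.
ts = np.linspace(-3, 3, 2000)
patterns = {activation_pattern(np.array([t, 0.5 * t])) for t in ts}
print(len(patterns))  # number of linear regions crossed by the segment
```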
arXiv Detail & Related papers (2022-12-29T17:32:05Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In sum, we treat the manifold model as the limit of large graphs and construct MNNs on it, while graph neural networks can be recovered by discretizing the MNNs.
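One concrete reading of the graph side of this picture: a graph convolutional filter as a polynomial of the graph Laplacian, applied here to a ring graph sampled from a circle; denser sampling yields larger graphs whose Laplacians approach the Laplace-Beltrami operator of the manifold. The filter taps `h` are illustrative:

```python
import numpy as np

def laplacian_filter(A, x, h):
    """Graph convolutional filter as a polynomial of the Laplacian:
    y = sum_k h[k] * L^k x."""
    L = np.diag(A.sum(axis=1)) - A
    y, Lkx = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * Lkx
        Lkx = L @ Lkx
    return y

# Ring graph sampled from a circle (a 1-D manifold).
n = 10
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
x = np.sin(2 * np.pi * np.arange(n) / n)   # a smooth signal on the circle
print(laplacian_filter(A, x, h=[0.5, -0.1, 0.01]).round(3))
```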
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still bounded and limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- A singular Riemannian geometry approach to Deep Neural Networks II.
Reconstruction of 1-D equivalence classes [78.120734120667]
We build, in the input space, the preimage of a point of the output manifold.
For simplicity we focus on neural network maps from n-dimensional real spaces to (n - 1)-dimensional real spaces.
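A toy numerical version of that reconstruction, assuming a smooth map from R^2 to R^1 standing in for the network: the preimage of a point is generically a 1-D curve whose tangent spans the null space of the Jacobian, so stepping along that direction traces the equivalence class. The map `f` and the step sizes are ours, not the paper's:

```python
import numpy as np

def f(x):
    """Toy smooth map from R^2 to R^1 standing in for a network layer."""
    return np.array([np.tanh(x[0]) + 0.5 * x[1] ** 2])

def jacobian(x, eps=1e-6):
    """Forward-difference Jacobian of f at x, shape (1, 2)."""
    return np.array([(f(x + eps * e) - f(x)) / eps for e in np.eye(2)]).T

def trace_preimage(x0, steps=200, h=0.02):
    """Step along the null space of the Jacobian to reconstruct the 1-D
    equivalence class (level set) through x0."""
    pts, x = [x0], x0.copy()
    for _ in range(steps):
        J = jacobian(x)                    # (1, 2)
        t = np.array([-J[0, 1], J[0, 0]])  # null direction in 2-D
        x = x + h * t / np.linalg.norm(t)
        pts.append(x.copy())
    return np.array(pts)

curve = trace_preimage(np.array([0.3, 0.4]))
vals = np.array([f(p)[0] for p in curve])
print(vals.max() - vals.min())  # ~0: the whole curve maps to (almost) one output
```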
arXiv Detail & Related papers (2021-12-17T11:47:45Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
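A minimal sketch of a kernel-based convolution in this spirit, assuming a one-step Weisfeiler-Lehman-style subtree feature and a 0/1 match kernel as a stand-in for a real graph kernel; the "structural masks" play the role of learnable filters, and all names are ours:

```python
import numpy as np

def subtree_feature(A, labels, node):
    """One WL-style refinement: a node's feature is its own label plus the
    sorted multiset of its neighbours' labels."""
    neigh = sorted(labels[j] for j in np.flatnonzero(A[node]))
    return (labels[node], tuple(neigh))

def kernel_conv(A, labels, mask_features):
    """Kernel-based 'convolution': each node's response is a kernel value
    (here a 0/1 match) between its local structure and a bank of
    structural masks, with no node embedding ever computed."""
    feats = [subtree_feature(A, labels, v) for v in range(len(labels))]
    return np.array([[1.0 if fv == m else 0.0 for m in mask_features]
                     for fv in feats])

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])
labels = [0, 0, 1, 0]
masks = [(0, (0, 1)), (1, (0, 0, 0))]   # two structural filters
print(kernel_conv(A, labels, masks))
```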
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Autobahn: Automorphism-based Graph Neural Nets [12.029647699164315]
We introduce Automorphism-based graph neural networks (Autobahn).
In an Autobahn, we decompose the graph into a collection of subgraphs and apply local convolutions that are equivariant to each subgraph's automorphism group.
We validate our approach by applying Autobahn to molecular graphs, where it achieves state-of-the-art results.
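A toy illustration of that equivariance on the simplest subgraph template, a path, whose automorphism group is {identity, reversal}: averaging the filter response over the group makes the output independent of the arbitrary node ordering. This is a stand-in for, not a reimplementation of, Autobahn's convolutions:

```python
import numpy as np

def equivariant_path_conv(x, w):
    """Local convolution on a path subgraph, symmetrized over the path's
    automorphism group {identity, reversal}, so the response does not
    depend on which end of the path we call 'first'."""
    return 0.5 * (np.dot(w, x) + np.dot(w, x[::-1]))

x = np.array([1.0, 2.0, 3.0])   # features along a 3-node path subgraph
w = np.array([0.2, 0.5, 0.3])   # an (illustrative) learnable filter
print(equivariant_path_conv(x, w), equivariant_path_conv(x[::-1], w))  # equal
```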
arXiv Detail & Related papers (2021-03-02T13:34:29Z)
- Quaternion Graph Neural Networks [17.10479440152652]
We propose Quaternion Graph Neural Networks (QGNN) to learn graph representations within the Quaternion space.
Our QGNN obtains state-of-the-art results on a range of benchmark datasets for graph classification and node classification.
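The key primitive such models build on is the Hamilton product, which replaces real-valued weight multiplication and couples all four quaternion components; the feature and weight values below are illustrative:

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of quaternions p = (a, b, c, d) and q = (w, x, y, z).
    Quaternion layers multiply features by learnable quaternion weights
    with this product instead of ordinary scalar multiplication."""
    a, b, c, d = p
    w, x, y, z = q
    return np.array([a*w - b*x - c*y - d*z,
                     a*x + b*w + c*z - d*y,
                     a*y - b*z + c*w + d*x,
                     a*z + b*y - c*x + d*w])

h = np.array([0.5, 0.1, -0.2, 0.3])   # a node feature as one quaternion
W = np.array([0.7, 0.0, 0.7, 0.0])    # a learnable quaternion weight
print(hamilton_product(W, h))
```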
arXiv Detail & Related papers (2020-08-12T03:41:03Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
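A minimal sketch of an affine skip connection as described, assuming a simple mean-aggregation graph convolution: the output of the convolution is summed with an affine (fully connected) map of the raw layer input. All names are illustrative:

```python
import numpy as np

def affine_skip_layer(A, X, W_conv, W_skip, b_skip):
    """Graph convolution plus an affine skip connection: the skip path is a
    fully connected layer applied to the unaggregated input features."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    conv = (A @ X) / deg @ W_conv          # simple mean-aggregation conv
    skip = X @ W_skip + b_skip             # affine map of the raw input
    return np.maximum(conv + skip, 0.0)    # ReLU

rng = np.random.default_rng(2)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], float)
X = rng.standard_normal((3, 4))
print(affine_skip_layer(A, X, rng.standard_normal((4, 4)),
                        rng.standard_normal((4, 4)), np.zeros(4)).shape)  # (3, 4)
```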
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.