Graph-Time Convolutional Neural Networks
- URL: http://arxiv.org/abs/2103.01730v1
- Date: Tue, 2 Mar 2021 14:03:44 GMT
- Title: Graph-Time Convolutional Neural Networks
- Authors: Elvin Isufi and Gabriele Mazzola
- Abstract summary: We represent spatiotemporal relationships through product graphs and develop a first-principle graph-time convolutional neural network (GTCNN).
We develop a graph-time convolutional filter by following the shift-and-sum principles of the convolutional operator to learn higher-level features over the product graph.
We develop a zero-pad pooling that preserves the spatial graph while reducing the number of active nodes and the parameters.
- Score: 9.137554315375919
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Spatiotemporal data can be represented as a process over a graph, which
captures their spatial relationships either explicitly or implicitly. How to
leverage such a structure for learning representations is one of the key
challenges when working with graphs. In this paper, we represent the
spatiotemporal relationships through product graphs and develop a first
principle graph-time convolutional neural network (GTCNN). The GTCNN is a
compositional architecture with each layer comprising a graph-time
convolutional module, a graph-time pooling module, and a nonlinearity. We
develop a graph-time convolutional filter by following the shift-and-sum
principles of the convolutional operator to learn higher-level features over
the product graph. The product graph itself is parametric so that we can learn
also the spatiotemporal coupling from data. We develop a zero-pad pooling that
preserves the spatial graph (the prior about the data) while reducing the
number of active nodes and the parameters. Experimental results with synthetic
and real data corroborate the different components and compare with baseline
and state-of-the-art solutions.
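To make the two core components concrete, here is a minimal Python/NumPy sketch (not the authors' code; all names and the toy graphs are illustrative) of a parametric product-graph shift operator and the shift-and-sum graph-time filter built on top of it.

```python
import numpy as np

def parametric_product_graph(S_G, S_T, s):
    """Product-graph shift S = sum_{i,j in {0,1}} s[i,j] * (S_T^j kron S_G^i).
    The 2x2 coefficient matrix s is learnable in the GTCNN, so the
    spatiotemporal coupling (Kronecker, Cartesian, strong, ...) can be
    learned from data."""
    N, T = S_G.shape[0], S_T.shape[0]
    I_G, I_T = np.eye(N), np.eye(T)
    return (s[0, 0] * np.kron(I_T, I_G) + s[0, 1] * np.kron(S_T, I_G)
            + s[1, 0] * np.kron(I_T, S_G) + s[1, 1] * np.kron(S_T, S_G))

def graph_time_filter(x, S, h):
    """Shift-and-sum filter y = sum_k h[k] S^k x on the product graph."""
    y, x_shifted = np.zeros_like(x), x.copy()
    for h_k in h:
        y += h_k * x_shifted
        x_shifted = S @ x_shifted   # one more graph-time shift
    return y

# Toy example: N = 4 spatial nodes observed over T = 3 time steps.
rng = np.random.default_rng(0)
S_G = np.triu((rng.random((4, 4)) < 0.5).astype(float), 1); S_G += S_G.T
S_T = np.eye(3, k=-1)                      # directed line graph over time
S = parametric_product_graph(S_G, S_T, s=np.array([[0.0, 1.0], [1.0, 0.5]]))
x = rng.standard_normal(4 * 3)             # vectorized graph-time signal
y = graph_time_filter(x, S, h=[0.5, 0.3, 0.2])
print(y.shape)                             # (12,): one value per (node, time)
```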
Related papers
- Structure-free Graph Condensation: From Large-scale Graphs to Condensed Graph-free Data [91.27527985415007]
Existing graph condensation methods rely on the joint optimization of nodes and structures in the condensed graph.
We advocate a new Structure-Free Graph Condensation paradigm, named SFGC, to distill a large-scale graph into a small-scale graph node set.
arXiv Detail & Related papers (2023-06-05T07:53:52Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
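As a rough illustration of the unrolling idea (my reading of the summary, not the released GDN architecture), the sketch below recovers a sparse latent adjacency from an observed matrix by alternating a gradient step on a least-squares data fit with a sparsity-promoting proximal step. In the actual network each unrolled iteration would carry learnable parameters; here the step size and threshold are fixed constants.

```python
import numpy as np

def soft_threshold(X, lam):
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

def graph_deconvolution(O, h0=0.1, h1=1.0, step=0.1, lam=0.05, n_layers=20):
    """Recover sparse A from O ≈ h0*I + h1*A via unrolled proximal gradient."""
    A = np.zeros_like(O)
    for _ in range(n_layers):                # one unrolled "layer" per loop
        residual = O - h0 * np.eye(len(O)) - h1 * A
        A = A + step * h1 * residual         # gradient step on the data fit
        A = soft_threshold(A, lam)           # prox step: promote sparsity
        A = np.maximum(A, 0.0)               # nonnegative edge weights
        np.fill_diagonal(A, 0.0)             # no self-loops
    return A

rng = np.random.default_rng(1)
A_true = np.triu((rng.random((6, 6)) < 0.4).astype(float), 1); A_true += A_true.T
O = 0.1 * np.eye(6) + A_true + 0.01 * rng.standard_normal((6, 6))
print(np.round(graph_deconvolution(O), 2))
```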
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
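A minimal sketch of the general idea (the paper's construction differs in its details): the inner product of standard convolution is replaced by a graph kernel between the input graph and a bank of small filter graphs, whose adjacencies would play the role of learnable weights. A p-step random-walk kernel keeps the example self-contained.

```python
import numpy as np

def random_walk_kernel(A1, A2, p=3):
    """k(G1, G2) = 1^T (A1 kron A2)^p 1: number of common length-p walks."""
    W = np.kron(A1, A2)
    v = np.ones(W.shape[0])
    for _ in range(p):
        v = W @ v
    return v.sum()

def graph_kernel_layer(A, filter_bank, p=3):
    """One feature per filter graph, like a conv layer's one feature per filter."""
    return np.array([random_walk_kernel(A, F, p) for F in filter_bank])

rng = np.random.default_rng(2)
A = np.triu((rng.random((8, 8)) < 0.3).astype(float), 1); A += A.T
filters = [np.array([[0, 1], [1, 0]], float),                    # an edge
           np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)]   # a triangle
print(graph_kernel_layer(A, filters))
```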
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve significant performance for various learning tasks on geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
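For intuition, here is a generic Monte Carlo Shapley estimator for node importance, not the GraphSVX estimator itself: each node's attribution is its average marginal effect on a black-box model's output over random orderings. The `model` argument and the toy readout below are placeholders.

```python
import numpy as np

def shapley_node_importance(model, A, X, n_samples=200, rng=None):
    """Monte Carlo Shapley values over nodes; model maps (A, X) to a scalar."""
    rng = rng or np.random.default_rng()
    n = X.shape[0]
    phi = np.zeros(n)
    for _ in range(n_samples):
        order = rng.permutation(n)
        mask = np.zeros(n, bool)
        prev = model(A, X * mask[:, None])        # empty coalition
        for i in order:
            mask[i] = True                        # add node i's features
            curr = model(A, X * mask[:, None])
            phi[i] += curr - prev                 # marginal contribution
            prev = curr
    return phi / n_samples

# Toy black box: one graph-convolution-like readout.
rng = np.random.default_rng(3)
A = np.triu((rng.random((5, 5)) < 0.5).astype(float), 1); A += A.T
X = rng.standard_normal((5, 4))
W = rng.standard_normal(4)
model = lambda A, X: float(np.tanh(A @ (X @ W)).sum())
print(np.round(shapley_node_importance(model, A, X, rng=rng), 3))
```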
arXiv Detail & Related papers (2021-04-18T10:40:37Z)
- Pyramidal Reservoir Graph Neural Network [18.632681846787246]
We propose a deep Graph Neural Network (GNN) model that alternates two types of layers.
We show how graph pooling can reduce the computational complexity of the model.
Our proposed approach to the design of RC-based GNNs offers an advantageous and principled trade-off between accuracy and complexity.
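A toy sketch of the reservoir-computing flavor (my simplification; the paper's pooling operators are more sophisticated than degree ranking): graph convolutions with fixed random weights alternate with a pooling step that shrinks the graph, and only a final readout would be trained, which is what makes the model cheap.

```python
import numpy as np

rng = np.random.default_rng(4)

def reservoir_layer(A, X, width=16):
    W = rng.uniform(-1, 1, (X.shape[1], width))   # random, never trained
    return np.tanh(A @ X @ W)                     # fixed graph "reservoir"

def degree_pooling(A, X, keep=0.5):
    k = max(1, int(keep * A.shape[0]))
    idx = np.argsort(-A.sum(1))[:k]               # keep top-degree nodes
    return A[np.ix_(idx, idx)], X[idx]

A = np.triu((rng.random((10, 10)) < 0.3).astype(float), 1); A += A.T
X = rng.standard_normal((10, 8))
for _ in range(2):                                # two reservoir+pool stages
    X = reservoir_layer(A, X)
    A, X = degree_pooling(A, X)
print(X.mean(0).shape)                            # graph embedding for a readout
```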
arXiv Detail & Related papers (2021-04-10T08:34:09Z)
- Multivariate Time Series Classification with Hierarchical Variational Graph Pooling [23.66868187446734]
Existing deep learning-based MTSC techniques are primarily concerned with the temporal dependency of single time series.
We propose a novel graph pooling-based framework MTPool to obtain the expressive global representation of MTS.
Experiments on ten benchmark datasets show that MTPool outperforms state-of-the-art strategies in the MTSC task.
arXiv Detail & Related papers (2020-10-12T12:36:47Z)
- Kernel-based Graph Learning from Smooth Signals: A Functional Viewpoint [15.577175610442351]
We propose a novel graph learning framework that incorporates the node-side and observation-side information.
We use graph signals as functions in the reproducing kernel Hilbert space associated with a Kronecker product kernel.
We develop a novel graph-based regularisation method which, when combined with the Kronecker product kernel, enables our model to capture both the dependency explained by the graph and the dependency due to graph signals.
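A minimal sketch of the Kronecker-product-kernel construction (notation mine, not the paper's code): a node-side kernel derived from the graph Laplacian is combined with an observation-side kernel via a Kronecker product, and graph signals are then fit by kernel ridge regression in the induced RKHS.

```python
import numpy as np

def graph_diffusion_kernel(L, beta=0.5):
    """Node-side kernel from the graph Laplacian: K = expm(-beta * L)."""
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(-beta * w)) @ V.T

def rbf_kernel(t, sigma=1.0):
    d = t[:, None] - t[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

rng = np.random.default_rng(5)
A = np.triu((rng.random((5, 5)) < 0.6).astype(float), 1); A += A.T
L = np.diag(A.sum(1)) - A
K_node = graph_diffusion_kernel(L)        # dependency explained by the graph
K_obs = rbf_kernel(np.arange(4.0))        # dependency across observations
K = np.kron(K_obs, K_node)                # kernel on all (node, obs) pairs

y = rng.standard_normal(5 * 4)            # vectorized graph signals
alpha = np.linalg.solve(K + 0.1 * np.eye(20), y)   # kernel ridge regression
y_hat = K @ alpha
print(float(np.abs(y - y_hat).max()))     # small residual on the toy fit
```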
arXiv Detail & Related papers (2020-08-23T16:04:23Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
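This building block admits a direct sketch (illustrative NumPy, not the paper's code): the block's output is any graph convolution of the input plus a fully connected affine map of the same input, replacing the identity map of a vanilla residual connection.

```python
import numpy as np

rng = np.random.default_rng(6)

def graph_conv(A, X, W):
    """Plain first-order graph convolution; any graph operator could stand in."""
    D_inv = np.diag(1.0 / np.maximum(A.sum(1), 1.0))
    return (D_inv @ A) @ X @ W

def affine_skip_block(A, X, W_conv, W_skip, b_skip):
    """Graph convolution plus an affine (fully connected) skip path."""
    return np.tanh(graph_conv(A, X, W_conv) + X @ W_skip + b_skip)

A = np.triu((rng.random((6, 6)) < 0.5).astype(float), 1); A += A.T
X = rng.standard_normal((6, 8))
W_conv, W_skip = rng.standard_normal((8, 16)), rng.standard_normal((8, 16))
out = affine_skip_block(A, X, W_conv, W_skip, b_skip=np.zeros(16))
print(out.shape)   # (6, 16)
```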
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.