Convolutional Learning on Multigraphs
- URL: http://arxiv.org/abs/2209.11354v1
- Date: Fri, 23 Sep 2022 00:33:04 GMT
- Title: Convolutional Learning on Multigraphs
- Authors: Landon Butler, Alejandro Parada-Mayorga, Alejandro Ribeiro
- Abstract summary: We develop convolutional information processing on multigraphs and introduce convolutional multigraph neural networks (MGNNs).
To capture the complex dynamics of information diffusion within and across each of the multigraph's classes of edges, we formalize a convolutional signal processing model.
We develop a multigraph learning architecture, including a sampling procedure to reduce computational complexity.
The introduced architecture is applied towards optimal wireless resource allocation and a hate speech localization task, offering improved performance over traditional graph neural networks.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph convolutional learning has led to many exciting discoveries in diverse
areas. However, in some applications, traditional graphs are insufficient to
capture the structure and intricacies of the data. In such scenarios,
multigraphs arise naturally as discrete structures in which complex dynamics
can be embedded. In this paper, we develop convolutional information processing
on multigraphs and introduce convolutional multigraph neural networks (MGNNs).
To capture the complex dynamics of information diffusion within and across each
of the multigraph's classes of edges, we formalize a convolutional signal
processing model, defining the notions of signals, filtering, and frequency
representations on multigraphs. Leveraging this model, we develop a multigraph
learning architecture, including a sampling procedure to reduce computational
complexity. The introduced architecture is applied towards optimal wireless
resource allocation and a hate speech localization task, offering improved
performance over traditional graph neural networks.
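As a rough illustration of the filtering model the abstract describes, the sketch below applies an independent polynomial filter in each edge class's shift operator and sums the results. This is a simplified reading, not the paper's exact formulation (which also allows products of shifts across different edge classes); the function name, tap layout, and toy multigraph are illustrative assumptions.

```python
import numpy as np

def multigraph_filter(shifts, coeffs, x):
    """Sketch of a multigraph convolutional filter (simplified).

    shifts: list of K (N, N) shift operators, one per edge class
            (e.g. the adjacency matrix of each class of edges).
    coeffs: (K, T) array of filter taps h[k, t].
    x:      (N,) graph signal, one value per node.

    Computes y = sum_k sum_t h[k, t] * S_k^t @ x, i.e. a separate
    polynomial in each class's shift operator; cross-class shift
    products from the full model are omitted here.
    """
    y = np.zeros_like(x, dtype=float)
    for S, taps in zip(shifts, coeffs):
        z = x.astype(float)            # S_k^0 @ x
        for h in taps:
            y += h * z
            z = S @ z                  # advance to the next power of S_k
    return y

# Toy multigraph on 3 nodes with two classes of edges.
S1 = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
S2 = np.array([[0., 0., 1.], [0., 0., 0.], [1., 0., 0.]])
x = np.array([1., 0., 0.])
coeffs = np.array([[1.0, 0.5],        # class-1 taps: identity + one hop
                   [1.0, 0.5]])       # class-2 taps: identity + one hop
y = multigraph_filter([S1, S2], coeffs, x)  # -> [2.0, 0.5, 0.5]
```

Stacking such filters with pointwise nonlinearities between layers gives the layered MGNN structure described in the related "Learning with Multigraph Convolutional Filters" entry below.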
Related papers
- Representation learning in multiplex graphs: Where and how to fuse
information? [5.0235828656754915]
Multiplex graphs possess richer information, provide better modeling capabilities and integrate more detailed data from potentially different sources.
In this paper, we tackle the problem of learning representations for nodes in multiplex networks in an unsupervised or self-supervised manner.
We propose improvements in how to construct GNN architectures that deal with multiplex graphs.
arXiv Detail & Related papers (2024-02-27T21:47:06Z) - Learning with Multigraph Convolutional Filters [153.20329791008095]
We introduce multigraph convolutional neural networks (MGNNs) as stacked and layered structures where information is processed according to an MSP model.
We also develop a procedure for tractable computation of filter coefficients in the MGNNs and a low-cost method to reduce the dimensionality of the information transferred between layers.
arXiv Detail & Related papers (2022-10-28T17:00:50Z) - Subgraph Frequency Distribution Estimation using Graph Neural Networks [17.02487540304784]
We propose GNNS, a novel representational learning framework that utilizes graph neural networks to sample subgraphs efficiently for estimating their frequency distribution.
Our framework includes an inference model and a generative model that learns hierarchical embeddings of nodes, subgraphs, and graph types.
With the learned model and embeddings, subgraphs are sampled in a highly scalable and parallel way and the frequency distribution estimation is then performed based on these sampled subgraphs.
arXiv Detail & Related papers (2022-07-14T06:23:38Z) - Learning through structure: towards deep neuromorphic knowledge graph
embeddings [0.5906031288935515]
We propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures.
Based on the insight that random, untrained graph neural networks are able to preserve local graph structures, we combine a frozen graph neural network with shallow knowledge graph embedding models.
We experimentally show that already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level.
arXiv Detail & Related papers (2021-09-21T18:01:04Z) - Spatio-Temporal Inception Graph Convolutional Networks for
Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z) - Graph-Based Neural Network Models with Multiple Self-Supervised
Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z) - Hcore-Init: Neural Network Initialization based on Graph Degeneracy [22.923756039561194]
We propose an adapted version of the k-core structure for the complete weighted multipartite graph extracted from a deep learning architecture.
As a multipartite graph is a combination of bipartite graphs, which are in turn the incidence graphs of hypergraphs, we design a k-hypercore decomposition.
arXiv Detail & Related papers (2020-04-16T12:57:14Z) - Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.