Quiver Signal Processing (QSP)
- URL: http://arxiv.org/abs/2010.11525v1
- Date: Thu, 22 Oct 2020 08:40:15 GMT
- Title: Quiver Signal Processing (QSP)
- Authors: Alejandro Parada-Mayorga, Hans Riess, Alejandro Ribeiro, and Robert Ghrist
- Abstract summary: We state the basics for a signal processing framework on quiver representations.
We propose a signal processing framework that allows us to handle heterogeneous multidimensional information in networks.
- Score: 145.6921439353007
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we state the basics for a signal processing framework on quiver
representations. A quiver is a directed graph and a quiver representation is an
assignment of vector spaces to the nodes of the graph and of linear maps
between the vector spaces associated to the nodes. Leveraging the tools from
representation theory, we propose a signal processing framework that allows us
to handle heterogeneous multidimensional information in networks. We provide a
set of examples where this framework provides a natural set of tools to
understand apparently hidden structure in information. We remark that the
proposed framework provides the basis for building graph neural networks where
information can be processed and handled in alternative ways.
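As a rough, hedged illustration of the objects involved (not code from the paper), a quiver representation can be encoded as vector-space dimensions attached to the nodes and matrices attached to the directed edges; the Python sketch below uses NumPy and made-up names.

    # Minimal sketch of a quiver representation: a directed graph whose nodes
    # carry vector spaces (here just their dimensions) and whose edges carry
    # linear maps (here NumPy matrices). Names are illustrative only.
    import numpy as np

    class QuiverRepresentation:
        def __init__(self):
            self.node_dims = {}   # node -> dimension of the vector space at that node
            self.edge_maps = {}   # (src, dst) -> matrix of shape (dim(dst), dim(src))

        def add_node(self, node, dim):
            self.node_dims[node] = dim

        def add_edge(self, src, dst, matrix):
            matrix = np.asarray(matrix, dtype=float)
            assert matrix.shape == (self.node_dims[dst], self.node_dims[src])
            self.edge_maps[(src, dst)] = matrix

        def push(self, node, signal):
            """Apply every outgoing linear map to a signal living at `node`."""
            signal = np.asarray(signal, dtype=float)
            return {dst: m @ signal
                    for (src, dst), m in self.edge_maps.items() if src == node}

    # A signal on the quiver assigns a vector of the right dimension to each node:
    rep = QuiverRepresentation()
    rep.add_node("a", 2)
    rep.add_node("b", 3)
    rep.add_edge("a", "b", np.ones((3, 2)))
    print(rep.push("a", [1.0, -1.0]))   # maps the signal at "a" into the space at "b"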
Related papers
- Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting [50.181824673039436]
We propose a Graph Structure Self-Contrasting (GSSC) framework that learns graph structural information without message passing.
The proposed framework is based purely on Multi-Layer Perceptrons (MLPs), where the structural information is only implicitly incorporated as prior knowledge.
It first applies structural sparsification to remove potentially uninformative or noisy edges in the neighborhood, and then performs structural self-contrasting in the sparsified neighborhood to learn robust node representations.
arXiv Detail & Related papers (2024-09-09T12:56:02Z)
- Shortest Path Networks for Graph Property Prediction [13.986963122264632]
Most graph neural network models rely on a particular message passing paradigm, where the idea is to iteratively propagate node representations of a graph to each node in the direct neighborhood.
We propose shortest path message passing neural networks, where the node representations of a graph are propagated to each node in the shortest path neighborhoods.
Our framework generalizes message passing neural networks, resulting in provably more expressive models.
arXiv Detail & Related papers (2022-06-02T12:04:29Z)
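As a hedged illustration of the shortest-path message passing summarized in the entry above (not the authors' implementation), the sketch below aggregates node features over exact shortest-path distances using NetworkX and simple mean pooling.

    # Sketch of shortest-path neighborhood aggregation (illustrative only):
    # for each node, average the features of nodes at exact shortest-path
    # distance k, for k = 1..K, and concatenate the results.
    import networkx as nx
    import numpy as np

    def shortest_path_aggregate(G, features, K=2):
        out = {}
        for v in G.nodes():
            lengths = nx.single_source_shortest_path_length(G, v, cutoff=K)
            parts = []
            for k in range(1, K + 1):
                ring = [u for u, d in lengths.items() if d == k]
                if ring:
                    parts.append(np.mean([features[u] for u in ring], axis=0))
                else:
                    parts.append(np.zeros_like(features[v]))
            out[v] = np.concatenate(parts)
        return out

    G = nx.path_graph(4)                         # 0 - 1 - 2 - 3
    X = {v: np.eye(4)[v] for v in G.nodes()}     # one-hot node features
    print(shortest_path_aggregate(G, X, K=2)[0])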
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Signal Processing on Higher-Order Networks: Livin' on the Edge ... and Beyond [20.422050836383725]
This tutorial paper presents a didactic treatment of the emerging topic of signal processing on higher-order networks.
We introduce the building blocks for processing data on simplicial complexes and hypergraphs.
arXiv Detail & Related papers (2021-01-14T09:08:26Z)
- Towards Efficient Scene Understanding via Squeeze Reasoning [71.1139549949694]
We propose a novel framework called Squeeze Reasoning.
Instead of propagating information on the spatial map, we first learn to squeeze the input feature into a channel-wise global vector.
We show that our approach can be modularized as an end-to-end trained block and can be easily plugged into existing networks.
arXiv Detail & Related papers (2020-11-06T12:17:01Z)
- Graph Fairing Convolutional Networks for Anomaly Detection [7.070726553564701]
We introduce a graph convolutional network with skip connections for semi-supervised anomaly detection.
The effectiveness of our model is demonstrated through extensive experiments on five benchmark datasets.
arXiv Detail & Related papers (2020-10-20T13:45:47Z)
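The entry above mentions a graph convolutional network with skip connections; below is a generic, hedged sketch of one such layer (symmetric-normalized propagation plus an identity skip), not the paper's actual architecture.

    # Generic GCN-style layer with a skip connection (illustrative only):
    # A_hat = D^{-1/2} (A + I) D^{-1/2}, then ReLU(A_hat X W) + X.
    import numpy as np

    def gcn_layer_with_skip(A, X, W):
        A_hat = A + np.eye(A.shape[0])                 # add self-loops
        d = A_hat.sum(axis=1)
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
        H = np.maximum(A_norm @ X @ W, 0.0)            # propagate, transform, ReLU
        return H + X                                   # skip connection (matching widths)

    A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
    X = np.random.randn(3, 4)
    W = np.random.randn(4, 4)
    print(gcn_layer_with_skip(A, X, W).shape)          # (3, 4)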
- Locality Preserving Dense Graph Convolutional Networks with Graph Context-Aware Node Representations [19.623379678611744]
Graph convolutional networks (GCNs) have been widely used for representation learning on graph data.
In many graph classification applications, GCN-based approaches have outperformed traditional methods.
We propose a locality-preserving dense GCN with graph context-aware node representations.
arXiv Detail & Related papers (2020-10-12T02:12:27Z)
- Spectral Embedding of Graph Networks [76.27138343125985]
We introduce an unsupervised graph embedding that trades off local node similarity and connectivity, and global structure.
The embedding is based on a generalized graph Laplacian, whose eigenvectors compactly capture both network structure and neighborhood proximity in a single representation.
arXiv Detail & Related papers (2020-09-30T04:59:10Z)
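As a rough illustration of the Laplacian-eigenvector embedding summarized in the entry above, the sketch below uses the standard unnormalized Laplacian rather than the paper's generalized variant.

    # Sketch of a spectral node embedding from graph Laplacian eigenvectors
    # (standard Laplacian L = D - A; illustrative only).
    import numpy as np

    def spectral_embedding(A, dim=2):
        L = np.diag(A.sum(axis=1)) - A                 # graph Laplacian
        eigvals, eigvecs = np.linalg.eigh(L)           # ascending eigenvalues
        return eigvecs[:, 1:dim + 1]                   # skip the constant eigenvector

    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(spectral_embedding(A, dim=2))                # one 2-D coordinate per node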
- Progressive Graph Convolutional Networks for Semi-Supervised Node Classification [97.14064057840089]
Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
We propose a method to automatically build compact and task-specific graph convolutional networks.
arXiv Detail & Related papers (2020-03-27T08:32:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.