Signal Processing on Higher-Order Networks: Livin' on the Edge ... and Beyond
- URL: http://arxiv.org/abs/2101.05510v1
- Date: Thu, 14 Jan 2021 09:08:26 GMT
- Title: Signal Processing on Higher-Order Networks: Livin' on the Edge ... and Beyond
- Authors: Michael T. Schaub and Yu Zhu and Jean-Baptiste Seby and T. Mitchell
Roddenberry and Santiago Segarra
- Abstract summary: This tutorial paper presents a didactic treatment of the emerging topic of signal processing on higher-order networks.
We introduce the building blocks for processing data on simplicial complexes and hypergraphs.
- Score: 20.422050836383725
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This tutorial paper presents a didactic treatment of the emerging topic of
signal processing on higher-order networks. Drawing analogies from discrete and
graph signal processing, we introduce the building blocks for processing data
on simplicial complexes and hypergraphs, two common abstractions of
higher-order networks that can incorporate polyadic relationships. We provide
basic introductions to simplicial complexes and hypergraphs, with special
emphasis on the concepts needed for processing signals on them. Leveraging
these concepts, we discuss Fourier analysis, signal denoising, signal
interpolation, node embeddings, and non-linear processing through neural
networks in these two representations of polyadic relational structures. In the
context of simplicial complexes, we specifically focus on signal processing
using the Hodge Laplacian matrix, a multi-relational operator that leverages
the special structure of simplicial complexes and generalizes desirable
properties of the Laplacian matrix in graph signal processing. For hypergraphs,
we present both matrix and tensor representations, and discuss the trade-offs
in adopting one or the other. We also highlight limitations and potential
research avenues, both to inform practitioners and to motivate the contribution
of new researchers to the area.
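To make the Hodge Laplacian concrete, here is a minimal sketch, assuming NumPy and a small hand-built simplicial complex; the incidence matrices, edge signal, and regularization weight are illustrative choices, not taken from the paper:

```python
import numpy as np

# Toy simplicial complex: 4 nodes, 5 oriented edges, 1 filled triangle [0,1,2].
# B1: node-to-edge incidence; columns ordered (0,1),(0,2),(1,2),(1,3),(2,3).
B1 = np.array([
    [-1, -1,  0,  0,  0],
    [ 1,  0, -1, -1,  0],
    [ 0,  1,  1,  0, -1],
    [ 0,  0,  0,  1,  1],
], dtype=float)
# B2: edge-to-triangle incidence; boundary of [0,1,2] = (0,1) - (0,2) + (1,2).
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)

# Defining property of a simplicial complex: boundary-of-boundary is zero.
assert np.allclose(B1 @ B2, 0)

# Hodge 1-Laplacian: operates on edge signals and generalizes the graph Laplacian.
L1 = B1.T @ B1 + B2 @ B2.T

# Tikhonov-style denoising of a noisy edge signal y:
# f_hat = argmin_f ||f - y||^2 + alpha * f^T L1 f  =>  (I + alpha*L1) f_hat = y
rng = np.random.default_rng(0)
y = np.array([1.0, -1.0, 0.5, 2.0, -0.5]) + 0.1 * rng.standard_normal(5)
alpha = 0.5
f_hat = np.linalg.solve(np.eye(5) + alpha * L1, y)
```

The identity `B1 @ B2 = 0` is the structural property of simplicial complexes that the Hodge Laplacian exploits, and the quadratic form `f^T L1 f` plays the role that Laplacian smoothness plays in graph signal denoising.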
Related papers
- Convolutional Learning on Multigraphs [153.20329791008095]
We develop convolutional information processing on multigraphs and introduce convolutional multigraph neural networks (MGNNs).
To capture the complex dynamics of information diffusion within and across each of the multigraph's classes of edges, we formalize a convolutional signal processing model.
We develop a multigraph learning architecture, including a sampling procedure to reduce computational complexity.
The introduced architecture is applied towards optimal wireless resource allocation and a hate speech localization task, offering improved performance over traditional graph neural networks.
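For context, the basic convolutional signal processing model on an ordinary graph is a polynomial in a graph shift operator; the multigraph setting extends this with one shift operator per edge class. A minimal single-graph sketch (the graph, filter taps, and signal below are made up for illustration):

```python
import numpy as np

# Hypothetical 4-node path graph; the adjacency matrix S acts as the shift operator.
S = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
h = [1.0, 0.5, 0.25]                 # illustrative filter taps
x = np.array([1.0, 0.0, 0.0, 0.0])  # impulse at node 0

# Graph convolution: y = sum_k h_k S^k x (the k-th term spreads information k hops).
y = sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))
# y = [1.25, 0.5, 0.25, 0.0]
```

Each filter tap weights a different hop distance, which is the sense in which graph convolutions aggregate neighborhood information.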
arXiv Detail & Related papers (2022-09-23T00:33:04Z)
- Complex-Value Spatio-temporal Graph Convolutional Neural Networks and its Applications to Electric Power Systems AI [24.914412344973996]
We generalize graph convolutional neural networks (GCN) to the complex domain.
We prove that complex-valued GCNs are stable with respect to perturbations of the underlying graph support.
We apply complex GCN to power grid state forecasting, power grid-attack detection and localization.
arXiv Detail & Related papers (2022-08-17T18:56:48Z)
- Deep Equilibrium Assisted Block Sparse Coding of Inter-dependent Signals: Application to Hyperspectral Imaging [71.57324258813675]
A dataset of inter-dependent signals is defined as a matrix whose columns demonstrate strong dependencies.
A neural network is employed to act as structure prior and reveal the underlying signal interdependencies.
Deep unrolling and Deep equilibrium based algorithms are developed, forming highly interpretable and concise deep-learning-based architectures.
arXiv Detail & Related papers (2022-03-29T21:00:39Z)
- Simplicial Attention Networks [0.0]
We introduce a proper self-attention mechanism able to process data components at different layers.
We learn how to weight both upper and lower neighborhoods of the given topological domain in a totally task-oriented fashion.
The proposed approach compares favorably with other methods when applied to different (inductive and transductive) tasks.
arXiv Detail & Related papers (2022-03-14T20:47:31Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Signal Processing on Cell Complexes [7.0471949371778795]
We give an introduction to signal processing on (abstract) regular cell complexes.
We discuss how appropriate Hodge Laplacians for these cell complexes can be derived.
arXiv Detail & Related papers (2021-10-11T21:11:59Z)
- Signal processing on simplicial complexes [19.035399031968502]
We focus on a closely related, but distinct, third perspective: how can we use higher-order relationships to process signals and data supported on higher-order network structures?
In particular, we survey how ideas from signal processing of data supported on regular domains, such as time series or images, can be extended to graphs and simplicial complexes.
arXiv Detail & Related papers (2021-06-14T14:56:51Z)
- Quiver Signal Processing (QSP) [145.6921439353007]
We state the basics for a signal processing framework on quiver representations.
We propose a signal processing framework that allows us to handle heterogeneous multidimensional information in networks.
arXiv Detail & Related papers (2020-10-22T08:40:15Z)
- Graph signal processing for machine learning: A review and new perspectives [57.285378618394624]
We review a few important contributions made by GSP concepts and tools, such as graph filters and transforms, to the development of novel machine learning algorithms.
We discuss exploiting data structure and relational priors, improving data and computational efficiency, and enhancing model interpretability.
We provide new perspectives on future development of GSP techniques that may serve as a bridge between applied mathematics and signal processing on one side, and machine learning and network science on the other.
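For readers unfamiliar with GSP, the graph Fourier transform that underlies graph filters and transforms can be sketched as follows; the 4-node cycle graph and signal below are a hypothetical example, not drawn from the surveyed papers:

```python
import numpy as np

# Hypothetical 4-node cycle graph; L = D - A is the combinatorial Laplacian.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Laplacian eigenvectors form the graph Fourier basis; eigenvalues act as frequencies.
lam, U = np.linalg.eigh(L)

x = np.array([1.0, 2.0, 3.0, 4.0])
x_hat = U.T @ x   # graph Fourier transform of the node signal x
x_rec = U @ x_hat # inverse transform recovers x
```

Graph filters then act by reweighting `x_hat` componentwise, directly mirroring classical frequency-domain filtering.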
arXiv Detail & Related papers (2020-07-31T13:21:33Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute this performance deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.