Pooling Strategies for Simplicial Convolutional Networks
- URL: http://arxiv.org/abs/2210.05490v1
- Date: Tue, 11 Oct 2022 14:45:51 GMT
- Title: Pooling Strategies for Simplicial Convolutional Networks
- Authors: Domenico Mattia Cinque, Claudio Battiloro, Paolo Di Lorenzo
- Abstract summary: The goal of this paper is to introduce pooling strategies for simplicial convolutional neural networks.
Inspired by graph pooling methods, we introduce a general formulation for a simplicial pooling layer.
The general layer is then customized to design four different pooling strategies.
- Score: 18.80397868603073
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The goal of this paper is to introduce pooling strategies for simplicial
convolutional neural networks. Inspired by graph pooling methods, we introduce
a general formulation for a simplicial pooling layer that performs: i) local
aggregation of simplicial signals; ii) principled selection of sampling sets;
iii) downsampling and simplicial topology adaptation. The general layer is then
customized to design four different pooling strategies (i.e., max, top-k,
self-attention, and separated top-k) grounded in the theory of topological
signal processing. Also, we leverage the proposed layers in a hierarchical
architecture that reduces complexity while representing data at different
resolutions. Numerical results on real data benchmarks (i.e., flow and graph
classification) illustrate the advantage of the proposed methods with respect
to the state of the art.
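As a rough illustration of the three-step formulation above, the following NumPy sketch implements a top-k-style simplicial pooling step under simplifying assumptions (a single feature channel, one diffusion step for local aggregation); the function and variable names are illustrative and do not reproduce the authors' implementation.

```python
import numpy as np

def simplicial_topk_pool(x, L, keep_ratio=0.5):
    # i) local aggregation: one diffusion step over the simplicial neighbourhoods
    agg = x + L @ x
    # ii) principled selection: keep the simplices with the largest aggregated magnitude
    k = max(1, int(keep_ratio * x.shape[0]))
    idx = np.sort(np.argsort(-np.abs(agg))[:k])
    # iii) downsampling and topology adaptation: restrict signal and Hodge Laplacian
    return agg[idx], L[np.ix_(idx, idx)], idx

# toy usage on a random edge-flow signal
rng = np.random.default_rng(0)
n_edges = 10
B1 = rng.integers(-1, 2, size=(6, n_edges)).astype(float)  # mock node-edge incidence
L1 = B1.T @ B1                                              # lower Hodge Laplacian of edges
x = rng.normal(size=n_edges)
x_p, L_p, idx = simplicial_topk_pool(x, L1, keep_ratio=0.4)
print(idx, x_p.shape, L_p.shape)   # 4 kept edges, (4,), (4, 4)
```

The same skeleton accommodates the other strategies by swapping the selection rule in step ii), e.g., a max criterion or learned attention scores.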
Related papers
- What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding [67.59552859593985]
Graph Transformers, which incorporate self-attention and positional encoding, have emerged as a powerful architecture for various graph learning tasks.
This paper introduces the first theoretical investigation of a shallow Graph Transformer for semi-supervised classification.
arXiv Detail & Related papers (2024-06-04T05:30:16Z) - On Characterizing the Evolution of Embedding Space of Neural Networks using Algebraic Topology [9.537910170141467]
Using Betti numbers, we study how the topology of the feature embedding space changes as data passes through the layers of a well-trained deep neural network (DNN).
We demonstrate that as depth increases, a topologically complicated dataset is transformed into a simple one, resulting in Betti numbers attaining their lowest possible value.
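For intuition about what Betti numbers measure, here is a minimal NumPy sketch that computes them from boundary matrices via rank-nullity; the `betti_numbers` helper and the toy hollow-triangle complex are illustrative, not the paper's pipeline.

```python
import numpy as np

def betti_numbers(num_simplices, boundary_maps):
    # beta_k = dim ker(d_k) - rank(d_{k+1}); boundary_maps[k] is d_k (d_0 is empty)
    ranks = [0] * (len(num_simplices) + 1)
    for k in range(1, len(num_simplices)):
        ranks[k] = np.linalg.matrix_rank(boundary_maps[k])
    return [num_simplices[k] - ranks[k] - ranks[k + 1]
            for k in range(len(num_simplices))]

# hollow triangle: 3 vertices, 3 edges, no filled face -> beta_0 = 1, beta_1 = 1
d1 = np.array([[-1.,  0.,  1.],
               [ 1., -1.,  0.],
               [ 0.,  1., -1.]])
print(betti_numbers([3, 3], [None, d1]))   # [1, 1]
```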
arXiv Detail & Related papers (2023-11-08T10:45:12Z) - Structural Entropy Guided Graph Hierarchical Pooling [8.080910755718511]
We propose a hierarchical pooling approach, SEP, to tackle two issues of existing methods: local structure damage and suboptimal cluster assignments.
SEP outperforms state-of-the-art graph pooling methods on graph classification benchmarks and obtains superior performance on node classification.
arXiv Detail & Related papers (2022-06-26T06:30:54Z) - Simplicial Attention Networks [0.0]
We introduce a proper self-attention mechanism able to process data components at different layers.
We learn how to weight both upper and lower neighborhoods of the given topological domain in a totally task-oriented fashion.
The proposed approach compares favorably with other methods when applied to different (inductive and transductive) tasks.
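A rough sketch of weighting the lower and upper neighbourhoods of an edge signal with data-driven attention scores is given below; the single-scalar scoring and all names are simplifying assumptions, not the paper's exact mechanism.

```python
import numpy as np

def softmax(v):
    v = v - v.max()
    e = np.exp(v)
    return e / e.sum()

def simplicial_attention_layer(x, L_low, L_up, W, a):
    # aggregate over lower (shared node) and upper (shared triangle) neighbourhoods
    h_low, h_up = L_low @ x @ W, L_up @ x @ W
    # one learned score per neighbourhood, computed from the aggregated signals
    alpha = softmax(np.array([np.mean(h_low @ a), np.mean(h_up @ a)]))
    return np.tanh(alpha[0] * h_low + alpha[1] * h_up)

rng = np.random.default_rng(1)
n_edges, d = 6, 3
B1 = rng.integers(-1, 2, size=(4, n_edges)).astype(float)   # mock node-edge incidence
B2 = rng.integers(-1, 2, size=(n_edges, 2)).astype(float)   # mock edge-triangle incidence
L_low, L_up = B1.T @ B1, B2 @ B2.T
X = rng.normal(size=(n_edges, d))
out = simplicial_attention_layer(X, L_low, L_up,
                                 rng.normal(size=(d, d)), rng.normal(size=d))
print(out.shape)   # (6, 3)
```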
arXiv Detail & Related papers (2022-03-14T20:47:31Z) - Subquadratic Overparameterization for Shallow Neural Networks [60.721751363271146]
We provide an analytical framework that allows us to adopt standard neural training strategies.
We achieve the desiderata via the Polyak-Lojasiewicz condition, smoothness, and standard assumptions.
arXiv Detail & Related papers (2021-11-02T20:24:01Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - Structure-Aware Hierarchical Graph Pooling using Information Bottleneck [2.7088996845250897]
Graph pooling is an essential ingredient of Graph Neural Networks (GNNs) in graph classification and regression tasks.
We propose a novel pooling method, HIBPool, in which we leverage the Information Bottleneck (IB) principle.
We also introduce a novel structure-aware Discriminative Pooling Readout (DiP-Readout) function to capture the informative local subgraph structures in the graph.
arXiv Detail & Related papers (2021-04-27T07:27:43Z) - Progressive Spatio-Temporal Graph Convolutional Network for Skeleton-Based Human Action Recognition [97.14064057840089]
We propose a method to automatically find a compact and problem-specific network for graph convolutional networks in a progressive manner.
Experimental results on two datasets for skeleton-based human action recognition indicate that the proposed method has competitive or even better classification performance.
arXiv Detail & Related papers (2020-11-11T09:57:49Z) - SimPool: Towards Topology Based Graph Pooling with Structural Similarity Features [0.0]
This paper makes two main contributions: the first is a differential module that calculates structural similarity features based on the adjacency matrix.
The second is the integration of these features with a revisited DiffPool pooling layer (arXiv:1806.08804) to obtain a pooling layer referred to as SimPool.
Experimental results demonstrate that, as part of an end-to-end Graph Neural Network architecture, SimPool computes node cluster assignments that better reflect graph locality.
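Since SimPool builds on DiffPool's soft cluster assignments, a minimal NumPy sketch of one DiffPool-style coarsening step may help; the single linear "GNN" layers and all parameter names are simplifying assumptions, not SimPool itself.

```python
import numpy as np

def row_softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def diffpool_step(A, X, W_embed, W_assign):
    Z = np.tanh(A @ X @ W_embed)        # node embeddings
    S = row_softmax(A @ X @ W_assign)   # soft assignment of nodes to clusters
    return S.T @ A @ S, S.T @ Z, S      # coarsened adjacency, features, assignments

rng = np.random.default_rng(0)
n, d, c = 8, 4, 3                        # nodes, feature dim, clusters
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T)                   # symmetric toy adjacency
X = rng.normal(size=(n, d))
A_c, X_c, S = diffpool_step(A, X, rng.normal(size=(d, d)), rng.normal(size=(d, c)))
print(A_c.shape, X_c.shape)              # (3, 3) (3, 4)
```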
arXiv Detail & Related papers (2020-06-03T12:51:57Z) - Ensemble Wrapper Subsampling for Deep Modulation Classification [70.91089216571035]
Subsampling of received wireless signals is important for relaxing hardware requirements as well as the computational cost of signal processing algorithms.
We propose a subsampling technique to facilitate the use of deep learning for automatic modulation classification in wireless communication systems.
arXiv Detail & Related papers (2020-05-10T06:11:13Z) - Progressive Graph Convolutional Networks for Semi-Supervised Node Classification [97.14064057840089]
Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
We propose a method to automatically build compact and task-specific graph convolutional networks.
arXiv Detail & Related papers (2020-03-27T08:32:16Z)