Topological Deep Learning: Going Beyond Graph Data
- URL: http://arxiv.org/abs/2206.00606v3
- Date: Fri, 19 May 2023 22:13:16 GMT
- Title: Topological Deep Learning: Going Beyond Graph Data
- Authors: Mustafa Hajij, Ghada Zamzmi, Theodore Papamarkou, Nina Miolane, Aldo
Guzmán-Sáenz, Karthikeyan Natesan Ramamurthy, Tolga Birdal, Tamal K. Dey,
Soham Mukherjee, Shreyas N. Samaga, Neal Livesay, Robin Walters, Paul Rosen,
Michael T. Schaub
- Abstract summary: We present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains.
Specifically, we first introduce combinatorial complexes, a novel type of topological domain.
We develop a class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Topological deep learning is a rapidly growing field that pertains to the
development of deep learning models for data supported on topological domains
such as simplicial complexes, cell complexes, and hypergraphs, which generalize
many domains encountered in scientific computations. In this paper, we present
a unifying deep learning framework built upon a richer data structure that
includes widely adopted topological domains.
Specifically, we first introduce combinatorial complexes, a novel type of
topological domain. Combinatorial complexes can be seen as generalizations of
graphs that maintain certain desirable properties. Similar to hypergraphs,
combinatorial complexes impose no constraints on the set of relations. In
addition, combinatorial complexes permit the construction of hierarchical
higher-order relations, analogous to those found in simplicial and cell
complexes. Thus, combinatorial complexes generalize and combine useful traits
of both hypergraphs and cell complexes, which have emerged as two promising
abstractions that facilitate the generalization of graph neural networks to
topological spaces.
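To make the definition concrete, the following is a minimal Python sketch of a combinatorial complex as a data structure, under the assumption that its defining constraint is a rank function on cells that is monotone with respect to inclusion; the class name `CombinatorialComplex` and its methods are illustrative, not the authors' reference implementation.

```python
class CombinatorialComplex:
    """Minimal illustrative combinatorial complex (not the paper's reference code).

    Cells are arbitrary frozensets of vertices, each carrying an integer rank.
    Unlike in a simplicial complex, the subsets of a cell need not themselves
    be cells; the only constraint enforced here is that the rank function is
    monotone with respect to inclusion.
    """

    def __init__(self):
        self.ranks = {}  # frozenset of vertices -> int rank

    def add_cell(self, vertices, rank):
        cell = frozenset(vertices)
        for other, r in self.ranks.items():
            if (cell < other and rank > r) or (other < cell and r > rank):
                raise ValueError("rank function must be monotone under inclusion")
        self.ranks[cell] = rank

    def skeleton(self, k):
        """All cells of rank k."""
        return [cell for cell, r in self.ranks.items() if r == k]


# Usage: a small complex mixing graph edges with one higher-order cell.
cc = CombinatorialComplex()
for v in "abcd":
    cc.add_cell([v], rank=0)
cc.add_cell(["a", "b"], rank=1)             # an ordinary edge
cc.add_cell(["a", "b", "c", "d"], rank=2)   # hyperedge-like cell; {a, c} need not exist
print(cc.skeleton(2))                       # -> [frozenset({'a', 'b', 'c', 'd'})]
```

The sketch combines the hypergraph trait (cells are arbitrary vertex subsets) with the hierarchical trait of cell complexes (ranks order the cells), mirroring the combination described above.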
Second, building upon combinatorial complexes and their rich combinatorial
and algebraic structure, we develop a general class of message-passing
combinatorial complex neural networks (CCNNs), focusing primarily on
attention-based CCNNs. We characterize permutation and orientation
equivariances of CCNNs, and discuss pooling and unpooling operations within
CCNNs in detail.
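To illustrate what one step of such a layer could look like, the sketch below passes attention-weighted messages from cells of one rank to cells of another, routed through a binary neighborhood matrix `B` (for example, a vertex-to-edge incidence). The function name `ccnn_attention_step`, the update rule, and the parameter shapes are assumptions for exposition, not the paper's definitive CCNN layer.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def ccnn_attention_step(H_src, H_dst, B, W, a):
    """One illustrative attention-weighted message-passing step.

    H_src: (n, d) features on source cells (e.g. rank-0 cells)
    H_dst: (m, d) features on destination cells (e.g. rank-1 cells)
    B:     (m, n) binary neighborhood matrix between the two ranks
    W:     (d, d') learnable weight matrix
    a:     (2*d',) learnable attention vector
    Returns updated destination features of shape (m, d').
    """
    Hs, Hd = H_src @ W, H_dst @ W
    m, n = B.shape
    # Attention logit for every (destination, source) pair.
    pairs = np.concatenate([np.repeat(Hd[:, None, :], n, axis=1),
                            np.repeat(Hs[None, :, :], m, axis=0)], axis=-1)
    logits = np.where(B > 0, np.tanh(pairs @ a), -np.inf)  # attend only over neighbors
    att = np.nan_to_num(softmax(logits, axis=1))           # rows with no neighbors -> 0
    return np.maximum(att @ Hs, 0.0)                       # aggregate, then ReLU

# Usage: pass messages from 4 rank-0 cells (vertices) to 2 rank-1 cells (edges).
rng = np.random.default_rng(0)
H0, H1 = rng.normal(size=(4, 8)), rng.normal(size=(2, 8))
B10 = np.array([[1, 1, 0, 0],
                [0, 1, 1, 1]], dtype=float)  # which vertices touch each edge
W, a = rng.normal(size=(8, 16)), rng.normal(size=(32,))
H1_new = ccnn_attention_step(H0, H1, B10, W, a)  # shape (2, 16)
```

Stacking such steps over several neighborhood matrices (vertex-to-edge, edge-to-face, and so on) gives the flavor of the higher-order message passing the paper formalizes; pooling and unpooling can then be phrased, roughly, as messages sent to and from higher-rank cells.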
Third, we evaluate the performance of CCNNs on tasks related to mesh shape
analysis and graph learning. Our experiments demonstrate that CCNNs perform
competitively with state-of-the-art deep learning models specifically
tailored to the same tasks. Our findings highlight the advantages of
incorporating higher-order relations into deep learning models across
different applications.
Related papers
- Defining Neural Network Architecture through Polytope Structures of Dataset
This paper defines upper and lower bounds for neural network widths, which are informed by the polytope structure of the dataset in question.
We develop an algorithm to investigate a converse situation where the polytope structure of a dataset can be inferred from its corresponding trained neural networks.
It is established that popular datasets such as MNIST, Fashion-MNIST, and CIFAR10 can be efficiently encapsulated using no more than two polytopes with a small number of faces.
arXiv Detail & Related papers (2024-02-04T08:57:42Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Inferring community structure in attributed hypergraphs using stochastic block models
We develop a statistical framework that incorporates node attribute data into the learning of community structure in a hypergraph.
We demonstrate that our model, which we refer to as HyperNEO, enhances the learning of community structure in synthetic and empirical hypergraphs.
We expect that our framework will broaden the investigation and understanding of higher-order community structure in real-world complex systems.
arXiv Detail & Related papers (2024-01-01T07:31:32Z)
- Learning Hierarchical Relational Representations through Relational Convolutions
We introduce "relational convolutional networks", a neural architecture equipped with computational mechanisms that capture progressively more complex relational features.
A key component of this framework is a novel operation that captures relational patterns in groups of objects by convolving graphlet filters.
We present the motivation and details of the architecture, together with a set of experiments to demonstrate how relational convolutional networks can provide an effective framework for modeling relational tasks that have hierarchical structure.
arXiv Detail & Related papers (2023-10-05T01:22:50Z)
- Data Topology-Dependent Upper Bounds of Neural Network Widths
We first show that a three-layer neural network can be designed to approximate an indicator function over a compact set.
This is then extended to a simplicial complex, deriving width upper bounds based on its topological structure.
We prove the universal approximation property of three-layer ReLU networks using our topological approach.
arXiv Detail & Related papers (2023-05-25T14:17:15Z)
- Topological Learning in Multi-Class Data Sets
We study the impact of topological complexity on learning in feedforward deep neural networks (DNNs).
We evaluate our topological classification algorithm on multiple constructed and open source data sets.
arXiv Detail & Related papers (2023-01-23T21:54:25Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Universal approximation property of invertible neural networks
Invertible neural networks (INNs) are neural network architectures with invertibility by design.
Thanks to their invertibility and the tractability of their Jacobians, INNs have various machine learning applications such as probabilistic modeling, generative modeling, and representation learning.
arXiv Detail & Related papers (2022-04-15T10:45:26Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization
Simplicial complexes can be viewed as high-dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Simplicial Neural Networks
We present simplicial neural networks (SNNs).
SNNs are a generalization of graph neural networks to data that live on a class of topological spaces called simplicial complexes.
We test the SNNs on the task of imputing missing data on coauthorship complexes. (A minimal sketch of a simplicial convolution in this spirit appears after this list.)
arXiv Detail & Related papers (2020-10-07T20:15:01Z)
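To ground the Simplicial Neural Networks entry above, here is a minimal sketch of a simplicial convolution that filters features on k-cells with a polynomial in the k-th Hodge Laplacian assembled from boundary matrices. The function names and the exact polynomial-filter form are illustrative assumptions in the spirit of SNNs, not that paper's precise layer.

```python
import numpy as np

def hodge_laplacian(B_k, B_kp1):
    """k-th Hodge Laplacian L_k = B_k^T B_k + B_{k+1} B_{k+1}^T, where B_j is
    the signed boundary matrix mapping j-cells to (j-1)-cells."""
    return B_k.T @ B_k + B_kp1 @ B_kp1.T

def simplicial_conv(X, L, weights):
    """Illustrative polynomial filter: ReLU(sum_t L^t X W_t).

    X: (n_k, d) features on the k-cells; L: (n_k, n_k) Hodge Laplacian;
    weights: list of (d, d') matrices, one per power of L.
    """
    out, P = 0.0, np.eye(L.shape[0])
    for W_t in weights:
        out = out + P @ X @ W_t
        P = P @ L
    return np.maximum(out, 0.0)

# Usage: edge (1-cell) features on a filled triangle {a, b, c}.
B1 = np.array([[-1, -1,  0],                 # vertex a
               [ 1,  0, -1],                 # vertex b
               [ 0,  1,  1]], dtype=float)   # vertex c; columns: edges ab, ac, bc
B2 = np.array([[1], [-1], [1]], dtype=float) # boundary of the 2-cell abc
L1 = hodge_laplacian(B1, B2)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))                  # a 4-dimensional feature per edge
Y = simplicial_conv(X, L1, [rng.normal(size=(4, 4)) for _ in range(3)])
```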
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.