Topological Neural Networks go Persistent, Equivariant, and Continuous
- URL: http://arxiv.org/abs/2406.03164v1
- Date: Wed, 5 Jun 2024 11:56:54 GMT
- Title: Topological Neural Networks go Persistent, Equivariant, and Continuous
- Authors: Yogesh Verma, Amauri H Souza, Vikas Garg
- Abstract summary: We introduce TopNets as a broad framework that subsumes and unifies various methods in the intersection of GNNs/TNNs and PH.
TopNets achieve strong performance across diverse tasks, including antibody design, molecular dynamics simulation, and drug property prediction.
- Score: 6.314000948709255
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Topological Neural Networks (TNNs) incorporate higher-order relational information beyond pairwise interactions, enabling richer representations than Graph Neural Networks (GNNs). Concurrently, topological descriptors based on persistent homology (PH) are being increasingly employed to augment the GNNs. We investigate the benefits of integrating these two paradigms. Specifically, we introduce TopNets as a broad framework that subsumes and unifies various methods in the intersection of GNNs/TNNs and PH such as (generalizations of) RePHINE and TOGL. TopNets can also be readily adapted to handle (symmetries in) geometric complexes, extending the scope of TNNs and PH to spatial settings. Theoretically, we show that PH descriptors can provably enhance the expressivity of simplicial message-passing networks. Empirically, (continuous and E(n)-equivariant extensions of) TopNets achieve strong performance across diverse tasks, including antibody design, molecular dynamics simulation, and drug property prediction.
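To make the combination concrete, here is a minimal sketch (not the authors' RePHINE/TOGL layers): it computes 0-dimensional persistent homology of a graph under an edge filtration with a small union-find, then appends a fixed-size vectorization of the persistence pairs to node embeddings. The filtration values, the vectorization scheme, and all names are illustrative assumptions.

```python
# 0-dim PH via union-find; components that never die are ignored for brevity.
import numpy as np

def zeroth_persistence(num_nodes, edges, edge_filtration):
    """Birth-death pairs of connected components; all vertices are born at 0."""
    parent = list(range(num_nodes))

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u

    pairs = []
    # Sweep edges in increasing filtration order (a sublevel-set filtration).
    for (u, v), t in sorted(zip(edges, edge_filtration), key=lambda p: p[1]):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv           # two components merge: one of them dies
            pairs.append((0.0, t))    # (birth, death) of the killed component
    return np.array(pairs) if pairs else np.zeros((0, 2))

def ph_descriptor(pairs):
    """Fixed-size summary of a diagram (an assumed, illustrative vectorization)."""
    if len(pairs) == 0:
        return np.zeros(4)
    pers = pairs[:, 1] - pairs[:, 0]
    return np.array([pers.sum(), pers.mean(), pers.max(), float(len(pers))])

# Toy 4-cycle with edge filtration values (e.g., learned edge weights).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
filt = [0.1, 0.4, 0.2, 0.9]
topo = ph_descriptor(zeroth_persistence(4, edges, filt))

h = np.random.randn(4, 8)                      # node embeddings from a GNN/TNN
h_aug = np.hstack([h, np.tile(topo, (4, 1))])  # PH-augmented representations
print(h_aug.shape)  # (4, 12)
```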
Related papers
- E(n) Equivariant Topological Neural Networks [10.603892843083173]
Graph neural networks excel at modeling pairwise interactions, but they cannot flexibly accommodate higher-order interactions and features.
Topological deep learning (TDL) has recently emerged as a promising tool for addressing this issue.
This paper introduces E(n)-Equivariant Topological Neural Networks (ETNNs).
ETNNs incorporate geometric node features while respecting rotation, reflection, and translation symmetries; a minimal sketch of such an equivariant update follows.
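The sketch below assumes an EGNN-style layer (a standard E(n)-equivariant message-passing scheme, not the actual ETNN architecture): messages are built from invariant squared distances, and coordinates are updated along relative vectors, so rotating, reflecting, or translating the input transforms the output coordinates identically while leaving the features unchanged.

```python
# E(n)-equivariant message passing on a point cloud; the "MLPs" are fixed
# random linear maps, purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, d_h = 5, 8
h = rng.normal(size=(n, d_h))                    # invariant node features
x = rng.normal(size=(n, 3))                      # coordinates in R^3
W_m = rng.normal(size=(2 * d_h + 1, d_h)) * 0.1  # stand-in "message MLP"
w_x = rng.normal(size=(d_h, 1)) * 0.1            # stand-in "coordinate MLP"

h_new, x_new = h.copy(), x.copy()
for i in range(n):
    msg_sum, coord_sum = np.zeros(d_h), np.zeros(3)
    for j in range(n):
        if i == j:
            continue
        diff = x[i] - x[j]
        dist2 = np.array([diff @ diff])                # E(n)-invariant input
        m_ij = np.tanh(np.concatenate([h[i], h[j], dist2]) @ W_m)
        msg_sum += m_ij
        coord_sum += diff * (m_ij @ w_x)               # scalar times direction
    h_new[i] = h[i] + msg_sum                          # invariant feature update
    x_new[i] = x[i] + coord_sum / (n - 1)              # equivariant coordinate update
```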
arXiv Detail & Related papers (2024-05-24T10:55:38Z)
- Deep Neural Networks via Complex Network Theory: a Perspective [3.1023851130450684]
Deep Neural Networks (DNNs) can be represented as graphs whose links and vertices iteratively process data and solve tasks sub-optimally. Complex Network Theory (CNT), merging statistical physics with graph theory, provides a method for interpreting neural networks by analysing their weights and neuron structures.
In this work, we extend the existing CNT metrics with measures that sample from the DNNs' training distribution, shifting from a purely topological analysis to one that connects with the interpretability of deep learning.
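A hedged sketch of that shift, assuming a toy one-hidden-layer ReLU network (the metric name and architecture are illustrative, not the paper's definitions): instead of a quantity computed from the weights alone, sample a batch standing in for the training distribution and score each neuron by its mean absolute activation.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(16, 32))              # first-layer weights of a toy MLP

X = rng.normal(size=(256, 16))              # samples standing in for the training distribution
A1 = np.maximum(X @ W1, 0.0)                # hidden ReLU activations
neuron_activity = np.abs(A1).mean(axis=0)   # input-dependent score per hidden neuron

# Contrast: a purely topological score uses the weights alone, no data.
topological_strength = np.abs(W1).sum(axis=0)
print(neuron_activity.shape, topological_strength.shape)  # (32,) (32,)
```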
arXiv Detail & Related papers (2024-04-17T08:42:42Z)
- Understanding and Improving Deep Graph Neural Networks: A Probabilistic Graphical Model Perspective [22.82625446308785]
In this work, we focus on deep GNNs and propose a novel view for understanding them from a probabilistic graphical model perspective.
We design a more powerful GNN: the coupling graph neural network (CoGNet).
arXiv Detail & Related papers (2023-01-25T12:02:12Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge-type relations and self-loop connections; this mechanism is sketched below.
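A minimal sketch of the stated mechanism, assuming a tiny heterogeneous graph and a degree-normalized aggregator (both illustrative choices): each message is scaled by a single learnable scalar for its relation, plus one scalar for self-loops, on top of an otherwise homogeneous GNN layer.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, num_relations = 4, 6, 2
h = rng.normal(size=(n, d))
W = rng.normal(size=(d, d)) * 0.1            # shared homogeneous-GNN weight
rel_weight = rng.normal(size=num_relations)  # one learnable scalar per relation
self_weight = rng.normal()                   # one learnable scalar for self-loops

# Edges as (src, dst, relation_id) on a tiny heterogeneous graph.
edges = [(0, 1, 0), (2, 1, 1), (3, 1, 0), (1, 2, 1)]

h_new = self_weight * (h @ W)                # scaled self-loop messages
deg = np.ones(n)                             # degree count starts at the self-loop
for u, v, r in edges:
    h_new[v] += rel_weight[r] * (h[u] @ W)   # relation-scaled message
    deg[v] += 1.0
h_new = np.tanh(h_new / deg[:, None])        # normalized update
print(h_new.shape)  # (4, 6)
```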
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [55.12108376616355]
Work on the NTK has focused on typical neural network architectures and remains incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of the NTK.
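For reference, the kernel regression predictor in question has the closed form f(x*) = k(x*, X) K^{-1} y. The sketch below shows that form with an RBF stand-in kernel, since deriving an actual NTK for NNs-Hp takes more than a few lines; only the predictor's shape, not the kernel, reflects the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Stand-in kernel; an NTK would replace this function."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(3)
X = rng.normal(size=(20, 2))                     # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=20)  # noisy targets
X_test = rng.normal(size=(5, 2))

K = rbf_kernel(X, X) + 1e-6 * np.eye(len(X))     # Gram matrix plus jitter
alpha = np.linalg.solve(K, y)
y_pred = rbf_kernel(X_test, X) @ alpha           # kernel regression predictor
print(y_pred.shape)  # (5,)
```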
arXiv Detail & Related papers (2022-09-16T06:36:06Z)
- Deep Neural Networks as Complex Networks [1.704936863091649]
We use Complex Network Theory to represent Deep Neural Networks (DNNs) as directed weighted graphs.
We introduce metrics to study DNNs as dynamical systems, with a granularity that spans from individual weights through neurons to whole layers.
We show that our metrics discriminate low vs. high performing networks.
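A minimal sketch of the representation, assuming a feed-forward network viewed as a directed weighted graph: each hidden neuron gets a node strength (summed absolute in- and out-weights), one classic Complex Network Theory metric, and comparing such metric distributions across trained models is how networks would be discriminated. The two models below are random stand-ins, not trained networks.

```python
import numpy as np

def node_strength(W_in, W_out):
    """Strength of each hidden neuron: total absolute in- plus out-weight."""
    return np.abs(W_in).sum(axis=0) + np.abs(W_out).sum(axis=1)

rng = np.random.default_rng(4)
# Two toy 8-16-4 MLPs standing in for differently performing networks.
net_a = [rng.normal(size=(8, 16)), rng.normal(size=(16, 4))]
net_b = [0.1 * rng.normal(size=(8, 16)), 0.1 * rng.normal(size=(16, 4))]

s_a, s_b = node_strength(*net_a), node_strength(*net_b)
print(s_a.mean().round(2), s_b.mean().round(2))  # compare summary statistics
```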
arXiv Detail & Related papers (2022-09-12T16:26:04Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) can embed the rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
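A toy sketch of such trimming, assuming a cheap proxy score (here the number of input-to-output paths in each candidate DAG, which is not the paper's theoretical criterion): score every candidate connectivity pattern and keep only those at or above the median.

```python
import numpy as np
from itertools import combinations

def num_paths(adj):
    """Count input-to-output paths in a DAG with nodes in topological order."""
    n = len(adj)
    paths = np.zeros(n)
    paths[0] = 1
    for v in range(1, n):
        paths[v] = sum(paths[u] for u in range(v) if adj[u][v])
    return paths[-1]

rng = np.random.default_rng(5)
n_nodes = 5
forward_edges = list(combinations(range(n_nodes), 2))  # guarantees a DAG

candidates = []
for _ in range(50):                                    # random candidate patterns
    adj = np.zeros((n_nodes, n_nodes), dtype=int)
    for u, v in forward_edges:
        adj[u][v] = rng.integers(0, 2)
    candidates.append(adj)

scores = [num_paths(a) for a in candidates]
keep = [a for a, s in zip(candidates, scores) if s >= np.median(scores)]
print(f"trimmed {len(candidates)} candidates down to {len(keep)}")
```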
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Universal approximation property of invertible neural networks [76.95927093274392]
Invertible neural networks (INNs) are neural network architectures with invertibility by design.
Thanks to their invertibility and the tractability of their Jacobians, INNs have various machine learning applications such as probabilistic modeling, generative modeling, and representation learning.
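A minimal sketch of that design principle, assuming a RealNVP-style affine coupling layer (one common INN building block, not the only one): the forward map inverts in closed form, and the Jacobian is triangular, so its log-determinant is a simple sum. The scale and shift networks are fixed random linear maps here, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
d = 4                                          # even input dimension, split in half
Ws = rng.normal(size=(d // 2, d // 2)) * 0.1   # stand-in scale network
Wt = rng.normal(size=(d // 2, d // 2)) * 0.1   # stand-in shift network

def forward(x):
    x1, x2 = x[: d // 2], x[d // 2:]
    s, t = np.tanh(x1 @ Ws), x1 @ Wt           # depend only on the untouched half
    y2 = x2 * np.exp(s) + t
    log_det = s.sum()                          # tractable log|det J| (triangular)
    return np.concatenate([x1, y2]), log_det

def inverse(y):
    y1, y2 = y[: d // 2], y[d // 2:]
    s, t = np.tanh(y1 @ Ws), y1 @ Wt           # recomputable because y1 == x1
    return np.concatenate([y1, (y2 - t) * np.exp(-s)])

x = rng.normal(size=d)
y, log_det = forward(x)
assert np.allclose(inverse(y), x)              # exact inversion by construction
```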
arXiv Detail & Related papers (2022-04-15T10:45:26Z)
- AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose an adaptive multi-channel graph convolutional network for semi-supervised classification (AM-GCN).
Our experiments show that AM-GCN extracts the most correlated information from both node features and topological structures, improving classification accuracy by a clear margin.
arXiv Detail & Related papers (2020-07-05T08:16:03Z)
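A hedged sketch of the multi-channel idea, with a single propagation step, random weights, and a simplified per-node attention rule (all assumptions, not AM-GCN's exact architecture): one channel propagates over the given topology, another over a kNN graph built from node features, and softmax attention fuses them.

```python
import numpy as np

def normalize(A):
    """Symmetrically normalized adjacency with self-loops (GCN-style)."""
    A = A + np.eye(len(A))
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(1)))
    return D_inv_sqrt @ A @ D_inv_sqrt

rng = np.random.default_rng(7)
n, d_in, d_out, k = 6, 5, 4, 2
X = rng.normal(size=(n, d_in))
A_topo = (rng.random((n, n)) < 0.3).astype(float)
A_topo = np.maximum(A_topo, A_topo.T)          # given graph topology

# kNN feature graph: connect each node to its k nearest neighbors in X.
dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(dists, np.inf)
A_feat = np.zeros((n, n))
for i in range(n):
    A_feat[i, np.argsort(dists[i])[:k]] = 1.0
A_feat = np.maximum(A_feat, A_feat.T)

W = rng.normal(size=(d_in, d_out)) * 0.1       # shared GCN weight (simplified)
Z_t = np.tanh(normalize(A_topo) @ X @ W)       # topology channel
Z_f = np.tanh(normalize(A_feat) @ X @ W)       # feature channel

q = rng.normal(size=d_out)                     # attention vector (assumed form)
att = np.exp(np.stack([Z_t @ q, Z_f @ q]))
att /= att.sum(0)                              # softmax over the two channels
Z = att[0, :, None] * Z_t + att[1, :, None] * Z_f   # adaptively fused embedding
print(Z.shape)  # (6, 4)
```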
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.