Architectures of Topological Deep Learning: A Survey of Message-Passing
Topological Neural Networks
- URL: http://arxiv.org/abs/2304.10031v3
- Date: Wed, 21 Feb 2024 23:27:08 GMT
- Authors: Mathilde Papillon, Sophia Sanborn, Mustafa Hajij, Nina Miolane
- Abstract summary: Topological Deep Learning (TDL) provides a framework to process and extract knowledge from data associated with complex systems.
TDL has demonstrated theoretical and practical advantages that hold the promise of breaking ground in the applied sciences and beyond.
The rapid growth of the TDL literature for relational systems has led to a lack of unification in notation and language across message-passing Topological Neural Network (TNN) architectures.
This presents a real obstacle for building upon existing works and for deploying message-passing TNNs to new real-world problems.
- Score: 5.324479750432588
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The natural world is full of complex systems characterized by intricate
relations between their components: from social interactions between
individuals in a social network to electrostatic interactions between atoms in
a protein. Topological Deep Learning (TDL) provides a comprehensive framework
to process and extract knowledge from data associated with these systems, such
as predicting the social community to which an individual belongs or predicting
whether a protein can be a reasonable target for drug development. TDL has
demonstrated theoretical and practical advantages that hold the promise of
breaking ground in the applied sciences and beyond. However, the rapid growth
of the TDL literature for relational systems has also led to a lack of
unification in notation and language across message-passing Topological Neural
Network (TNN) architectures. This presents a real obstacle for building upon
existing works and for deploying message-passing TNNs to new real-world
problems. To address this issue, we provide an accessible introduction to TDL
for relational systems, and compare the recently published message-passing TNNs
using a unified mathematical and graphical notation. Through an intuitive and
critical review of the emerging field of TDL, we extract valuable insights into
current challenges and exciting opportunities for future development.
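The survey's central object is the message-passing TNN, in which features attached to cells of different ranks (nodes, edges, faces, and higher-order cells) are updated by exchanging messages along the domain's neighborhood structures. Below is a minimal PyTorch sketch of one such step on a toy simplicial complex; the incidence matrix, feature sizes, and variable names are illustrative assumptions, not the survey's unified notation.

```python
# A minimal sketch of one message-passing step on a toy simplicial complex,
# in the spirit of the scheme this survey unifies. All names and shapes are
# illustrative, not the survey's notation.
import torch

torch.manual_seed(0)

# Toy complex: 4 nodes (rank 0) and 4 edges (rank 1) forming a cycle.
# B1[i, j] = 1 if node i lies on the boundary of edge j.
B1 = torch.tensor([[1., 0., 0., 1.],
                   [1., 1., 0., 0.],
                   [0., 1., 1., 0.],
                   [0., 0., 1., 1.]])

x0 = torch.randn(4, 8)  # node features
x1 = torch.randn(4, 8)  # edge features

# One learnable transform per neighborhood structure used.
W_node_to_edge = torch.nn.Linear(8, 8)  # rank 0 -> rank 1 messages, along B1^T
W_edge_to_node = torch.nn.Linear(8, 8)  # rank 1 -> rank 0 messages, along B1

# Transform, then aggregate over neighbors, then update each cell's feature.
m_to_edges = B1.T @ W_node_to_edge(x0)  # edges receive from their boundary nodes
m_to_nodes = B1 @ W_edge_to_node(x1)    # nodes receive from their incident edges

x0_new = torch.relu(x0 + m_to_nodes)
x1_new = torch.relu(x1 + m_to_edges)
print(x0_new.shape, x1_new.shape)  # torch.Size([4, 8]) torch.Size([4, 8])
```

In the general scheme, each neighborhood structure (boundary, coboundary, adjacency, coadjacency) contributes its own message with its own learnable weights, and a cell's update aggregates across all of them.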
Related papers
- Neurosymbolic AI approach to Attribution in Large Language Models [5.3454230926797734]
Neurosymbolic AI (NesyAI) combines the strengths of neural networks with structured symbolic reasoning.
This paper explores how NesyAI frameworks can enhance existing attribution models, offering more reliable, interpretable, and adaptable systems.
arXiv Detail & Related papers (2024-09-30T02:20:36Z)
- Position: Topological Deep Learning is the New Frontier for Relational Learning [51.05869778335334]
Topological deep learning (TDL) is a rapidly evolving field that uses topological features to understand and design deep learning models.
This paper posits that TDL is the new frontier for relational learning.
arXiv Detail & Related papers (2024-02-14T00:35:10Z)
- Graph Neural Networks for Tabular Data Learning: A Survey with Taxonomy and Directions [10.753191494611892]
We dive into Tabular Data Learning using Graph Neural Networks (GNNs).
GNNs have garnered significant interest and application across various Tabular Data Learning domains.
This survey serves as a resource for researchers and practitioners, offering a thorough understanding of GNNs' role in revolutionizing tabular data learning.
arXiv Detail & Related papers (2024-01-04T08:49:10Z)
- Persistence-based operators in machine learning [62.997667081978825]
We introduce a class of persistence-based neural network layers.
Persistence-based layers allow users to inject knowledge about symmetries respected by the data, are equipped with learnable weights, and can be composed with state-of-the-art neural architectures (a generic layer of this kind is sketched after this list).
arXiv Detail & Related papers (2022-12-28T18:03:41Z)
- The Neural Race Reduction: Dynamics of Abstraction in Gated Networks [12.130628846129973]
We introduce the Gated Deep Linear Network framework that schematizes how pathways of information flow impact learning dynamics.
We derive an exact reduction and, for certain cases, exact solutions to the dynamics of learning.
Our work gives rise to general hypotheses relating neural architecture to learning and provides a mathematical approach towards understanding the design of more complex architectures (a toy gated computation is sketched after this list).
arXiv Detail & Related papers (2022-07-21T12:01:03Z)
- Knowledge Enhanced Neural Networks for relational domains [83.9217787335878]
We focus on a specific method, KENN, a Neural-Symbolic architecture that injects prior logical knowledge into a neural network.
In this paper, we propose an extension of KENN for relational data.
arXiv Detail & Related papers (2022-05-31T13:00:34Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Inter-layer Information Similarity Assessment of Deep Neural Networks Via Topological Similarity and Persistence Analysis of Data Neighbour Dynamics [93.4221402881609]
The quantitative analysis of information structure through a deep neural network (DNN) can unveil new insights into the theoretical performance of DNN architectures.
Inspired by both LS and ID strategies for quantitative information structure analysis, we introduce two novel complementary methods for inter-layer information similarity assessment.
We demonstrate their efficacy by analyzing a deep convolutional neural network architecture on image data.
arXiv Detail & Related papers (2020-12-07T15:34:58Z)
- On Computability, Learnability and Extractability of Finite State Machines from Recurrent Neural Networks [0.0]
This work aims at shedding light on the connections between finite state machines (FSMs) and recurrent neural networks (RNNs).
The connections examined in this master's thesis are threefold: the extractability of finite state machines from recurrent neural networks, learnability aspects, and computational links.
arXiv Detail & Related papers (2020-09-10T15:55:30Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks (a toy version of this differentiable connectivity is sketched after this list).
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
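For the persistence-based operators entry above, a common way to build a layer over persistence diagrams is a DeepSets-style map: a pointwise network followed by permutation-invariant pooling, which respects the unordered nature of diagram points while keeping learnable weights. This is a hedged, generic construction, not the paper's specific operators.

```python
# A DeepSets-style layer over a persistence diagram: permutation-invariant
# in the diagram's points and equipped with learnable weights. Generic
# illustration only, not the exact operators of the paper above.
import torch

class PersistenceLayer(torch.nn.Module):
    def __init__(self, hidden=16, out=8):
        super().__init__()
        self.phi = torch.nn.Sequential(          # applied to each (birth, death) point
            torch.nn.Linear(2, hidden), torch.nn.ReLU())
        self.rho = torch.nn.Linear(hidden, out)  # applied after pooling

    def forward(self, diagram):                  # diagram: (n_points, 2)
        pointwise = self.phi(diagram)            # (n_points, hidden)
        pooled = pointwise.sum(dim=0)            # sum pooling -> permutation invariance
        return self.rho(pooled)                  # (out,)

diagram = torch.tensor([[0.0, 1.2], [0.3, 0.9], [0.1, 2.0]])  # toy (birth, death) pairs
layer = PersistenceLayer()
print(layer(diagram).shape)  # torch.Size([8])
```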
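For the Gated Deep Linear Network entry, the core idea is that gates multiply pathways of linear maps on or off, so different inputs flow through different linear subnetworks. A toy illustration under that reading, not the paper's exact framework or its learning-dynamics reduction:

```python
# A toy gated linear computation: the output sums over pathways, each a
# product of linear maps scaled by a gate. Illustrative assumption only.
import torch

torch.manual_seed(0)
d = 4
x = torch.randn(d)

# Two candidate first-layer pathways and one shared second layer.
W1a, W1b = torch.randn(d, d), torch.randn(d, d)
W2 = torch.randn(d, d)

# Gates route information; here only pathway a is active.
g = torch.tensor([1.0, 0.0])

h = g[0] * (W1a @ x) + g[1] * (W1b @ x)
y = W2 @ h                   # linear in x for a fixed gate pattern
print(y.shape)               # torch.Size([4])
```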
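For the topological-connectivity entry, representing a network as a complete graph with a learnable magnitude per edge makes the connectivity itself differentiable and trainable by gradient descent. A minimal sketch of that idea; the sigmoid relaxation and all names are assumptions:

```python
# Differentiable connectivity: candidate connections among n computational
# nodes form a complete graph whose edge magnitudes are learnable
# parameters. Illustrative only, not the paper's exact formulation.
import torch

n, d = 5, 8
edge_logits = torch.nn.Parameter(torch.zeros(n, n))  # one weight per directed edge
x = torch.randn(n, d)                                 # per-node features

# Soft adjacency: sigmoid keeps edge magnitudes in (0, 1) and differentiable.
adj = torch.sigmoid(edge_logits)
out = adj @ x                                         # aggregate along learned edges

# Edge magnitudes receive gradients like any other parameter.
loss = out.pow(2).mean()
loss.backward()
print(edge_logits.grad.abs().sum() > 0)  # tensor(True)
```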
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.