Classification of network topology and dynamics via sequence characterization
- URL: http://arxiv.org/abs/2206.15190v1
- Date: Thu, 30 Jun 2022 11:05:39 GMT
- Title: Classification of network topology and dynamics via sequence characterization
- Authors: Lucas Guerreiro, Filipi N. Silva and Diego R. Amancio
- Abstract summary: We investigate whether reconstructing the network via the co-occurrence method is useful for recovering both the network topology and the agent dynamics generating the sequences.
We found that the characterization of reconstructed networks provides valuable information regarding the process and topology used to create the sequences.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sequences arise in many real-world scenarios; thus, identifying the
mechanisms behind symbol generation is essential to understanding many complex
systems. This paper analyzes sequences generated by agents walking on a
networked topology. Given that in many real scenarios the underlying process
generating the sequence is hidden, we investigate whether reconstructing the
network via the co-occurrence method can recover both the network topology and
the agent dynamics that generated the sequences. We found that the
characterization of reconstructed networks provides valuable information
regarding the process and topology used to create the sequences. In a machine
learning approach considering 16 combinations of network topology and agent
dynamics as classes, we obtained an accuracy of 87% with sequences generated
with fewer than 40% of nodes visited. Longer sequences yielded improved
machine learning models. Our findings suggest that the proposed
methodology could be extended to classify sequences and understand the
mechanisms behind sequence generation.
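The core pipeline described above (an agent walks a network, emitting the sequence of visited nodes; a co-occurrence network is then rebuilt by linking symbols that appear close together) can be illustrated with a minimal sketch. The ring graph, window size, and function names below are illustrative assumptions, not the paper's actual implementation:

```python
import random


def random_walk(adj, start, length, rng):
    """Generate a symbol sequence by an unbiased random walk on an adjacency list."""
    seq = [start]
    node = start
    for _ in range(length - 1):
        node = rng.choice(adj[node])
        seq.append(node)
    return seq


def cooccurrence_network(seq, window=2):
    """Reconstruct a network by linking symbols that co-occur within a window."""
    edges = set()
    for i, u in enumerate(seq):
        for v in seq[i + 1:i + window]:
            if u != v:
                edges.add(frozenset((u, v)))
    return edges


rng = random.Random(42)
# Toy topology: a 5-node ring graph 0-1-2-3-4-0, stored as an adjacency list.
adj = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}

seq = random_walk(adj, start=0, length=200, rng=rng)
recovered = cooccurrence_network(seq, window=2)
true_edges = {frozenset((i, (i + 1) % 5)) for i in range(5)}
print(recovered == true_edges)  # → True: the walk visits every edge of this small ring
```

With `window=2`, co-occurring pairs are exactly the edges the walk traverses, so the reconstruction converges to the true topology once the walk has covered the graph; larger windows and other agent dynamics (e.g. biased or self-avoiding walks) change the reconstructed network's statistics, which is what makes classification of the generating process possible.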
Related papers
- Coding schemes in neural networks learning classification tasks [52.22978725954347] (2024-06-24)
  We investigate fully-connected, wide neural networks learning classification tasks.
  We show that the networks acquire strong, data-dependent features.
  Surprisingly, the nature of the internal representations depends crucially on the neuronal nonlinearity.
- Symbolic Regression of Dynamic Network Models [0.0] (2023-12-15)
  We introduce a novel formulation of a network generator and a parameter-free fitness function to evaluate the generated network.
  We extend this approach by modifying generator semantics to create and retrieve rules for time-varying networks.
  The framework was then applied to three empirical datasets: subway networks of major cities, regions of street networks, and semantic co-occurrence networks of Artificial Intelligence literature.
- How neural networks learn to classify chaotic time series [77.34726150561087] (2023-06-04)
  We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
  We find that the relation between input periodicity and activation periodicity is key to the performance of LKCNN models.
- Seq-HyGAN: Sequence Classification via Hypergraph Attention Network [0.0] (2023-03-04)
  Sequence classification has a wide range of real-world applications across domains, such as genome classification in health and anomaly detection in business.
  The lack of explicit features in sequence data makes it difficult for machine learning models.
  We propose a novel Hypergraph Attention Network model, namely Seq-HyGAN.
- Entangled Residual Mappings [59.02488598557491] (2022-06-02)
  We introduce entangled residual mappings to generalize the structure of residual connections.
  An entangled residual mapping replaces the identity skip connections with specialized entangled mappings.
  We show that while entangled mappings can preserve the iterative refinement of features across various deep models, they influence the representation learning process in convolutional networks.
- A scalable multi-step least squares method for network identification with unknown disturbance topology [0.0] (2021-06-14)
  We present an identification method for dynamic networks with known network topology.
  We use a multi-step Sequential and Null Space Fitting method to deal with reduced-rank noise.
  We provide a consistency proof that includes explicit Box model structure informativity conditions.
- Phase Retrieval using Expectation Consistent Signal Recovery Algorithm based on Hypernetwork [73.94896986868146] (2021-01-12)
  Phase retrieval is an important component in modern computational imaging systems.
  Recent advances in deep learning have opened up a new possibility for robust and fast phase retrieval.
  We develop a novel framework for deep unfolding to overcome the existing limitations.
- Estimating Linear Dynamical Networks of Cyclostationary Processes [0.0] (2020-09-26)
  We present a novel algorithm for guaranteed topology learning in networks excited by cyclostationary processes.
  Unlike prior work, the framework applies to linear dynamic systems with complex-valued dependencies.
  In the second part of the article, we analyze conditions for consistent topology learning in bidirected radial networks when a subset of the network is unobserved.
- Neural networks adapting to datasets: learning network size and topology [77.34726150561087] (2020-06-22)
  We introduce a flexible setup allowing a neural network to learn both its size and topology during gradient-based training.
  The resulting network has the structure of a graph tailored to the particular learning task and dataset.
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084] (2020-04-01)
  We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
  It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
  We benchmark our method on four real-world datasets for the task of link prediction.
- Progressive Graph Convolutional Networks for Semi-Supervised Node Classification [97.14064057840089] (2020-03-27)
  Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
  We propose a method to automatically build compact and task-specific graph convolutional networks.
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.