Symbolic Regression of Dynamic Network Models
- URL: http://arxiv.org/abs/2401.05369v1
- Date: Fri, 15 Dec 2023 00:34:45 GMT
- Title: Symbolic Regression of Dynamic Network Models
- Authors: Govind Gandhi
- Abstract summary: We introduce a novel formulation of a network generator and a parameter-free fitness function to evaluate the generated network.
We extend this approach by modifying generator semantics to create and retrieve rules for time-varying networks.
The framework was then used on three empirical datasets - subway networks of major cities, regions of street networks and semantic co-occurrence networks of literature in Artificial Intelligence.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Growing interest in modelling complex systems from brains to societies to
cities using networks has led to increased efforts to describe generative
processes that explain those networks. Recent successes in machine learning
have prompted the usage of evolutionary computation, especially genetic
programming to evolve computer programs that effectively forage a
multidimensional search space to iteratively find better solutions that explain
network structure. Symbolic regression contributes to these approaches by
replicating network morphologies using both structure and processes, all while
not relying on the scientist's intuition or expertise. It distinguishes itself
by introducing a novel formulation of a network generator and a parameter-free
fitness function to evaluate the generated network and is found to consistently
retrieve synthetically generated growth processes as well as simple,
interpretable rules for a range of empirical networks. We extend this approach
by modifying generator semantics to create and retrieve rules for time-varying
networks. A lexicon to study networks created dynamically in multiple stages is
introduced. The framework was improved with methods from the genetic
programming toolkit (recombination) and computational optimisations (heuristic
distance measures), and synthetically generated networks were used to test the
consistency and robustness of the upgraded semantics. Recombination was found
to improve both the retrieval rate and the fitness of the solutions.
The framework was then used on three empirical datasets - subway networks of
major cities, regions of street networks and semantic co-occurrence networks of
literature in Artificial Intelligence to illustrate the possibility of
obtaining interpretable, decentralised growth processes from complex networks.
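The pipeline the abstract describes (evolve generator rules by genetic programming, score each rule by how closely its grown network matches a target, and use recombination to improve retrieval) can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the tiny expression grammar over node degree `k` and age `a`, the single-attachment growth rule, and the L1 gap between degree histograms standing in for the paper's parameter-free fitness function and heuristic distance measures.

```python
import random

# Illustrative rule grammar (NOT the paper's semantics): an expression tree
# over a candidate node's degree k and age a scores attachment targets.
OPS = {"add": lambda x, y: x + y, "mul": lambda x, y: x * y,
       "max": max, "min": min}
TERMS = ["k", "a", "1"]

def random_tree(depth=2):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, k, a):
    if tree == "k": return float(k)
    if tree == "a": return float(a)
    if tree == "1": return 1.0
    op, left, right = tree
    return OPS[op](evaluate(left, k, a), evaluate(right, k, a))

def grow(tree, n=60):
    """Grow a network: each new node attaches to the existing node with the
    highest rule score (ties broken at random). Returns the degree sequence."""
    degree = {0: 1, 1: 1}            # start from a single edge (0, 1)
    for new in range(2, n):
        degree[new] = 0
        _, _, best = max((evaluate(tree, degree[v], new - v), random.random(), v)
                         for v in range(new))
        degree[new] += 1
        degree[best] += 1
    return degree

def degree_hist(degree, bins=8):
    h = [0] * bins
    for k in degree.values():
        h[min(k, bins - 1)] += 1
    return [c / len(degree) for c in h]

def fitness(tree, target_hist):
    # Heuristic stand-in for the paper's fitness: negative L1 distance
    # between the generated and target degree histograms.
    return -sum(abs(x - y) for x, y in zip(degree_hist(grow(tree)), target_hist))

def crossover(a, b):
    # Subtree recombination: splice material from parent b into parent a.
    if isinstance(a, str) or isinstance(b, str) or random.random() < 0.5:
        return b if random.random() < 0.5 else a
    op, left, right = a
    return (op, crossover(left, b), right)

def evolve(target_hist, pop_size=20, generations=15):
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=lambda t: fitness(t, target_hist),
                       reverse=True)[: pop_size // 2]
        pop = elite + [crossover(random.choice(elite), random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=lambda t: fitness(t, target_hist))

random.seed(7)
# Target: a preferential-attachment-like network grown by the rule "k".
target = degree_hist(grow("k"))
best = evolve(target)
print("best rule:", best, "fitness:", round(fitness(best, target), 3))
```

The elitist half-replacement scheme mirrors how recombination improves retrieval rate in the paper's experiments; a real implementation would use the paper's generator semantics and fitness function rather than these toy substitutes.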
Related papers
- Leveraging advances in machine learning for the robust classification and interpretation of networks [0.0]
Simulation approaches involve selecting a suitable network generative model such as Erdős–Rényi or small-world.
We utilize advances in interpretable machine learning to classify simulated networks by our generative models based on various network attributes.
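The two generative models named above can be sketched from scratch, along with one interpretable network attribute (average clustering coefficient) of the kind such a classifier would use; the function names and parameter choices here are illustrative, not from the paper.

```python
import random

def erdos_renyi(n, p, rng):
    # G(n, p): each of the n*(n-1)/2 possible edges appears independently.
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v); adj[v].add(u)
    return adj

def small_world(n, k, beta, rng):
    # Watts-Strogatz: ring lattice with k neighbours per side, then each
    # edge is rewired to a random endpoint with probability beta.
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for d in range(1, k + 1):
            v = (u + d) % n
            adj[u].add(v); adj[v].add(u)
    for u in range(n):
        for v in list(adj[u]):
            if v > u and rng.random() < beta:
                w = rng.randrange(n)
                if w != u and w not in adj[u]:
                    adj[u].discard(v); adj[v].discard(u)
                    adj[u].add(w); adj[w].add(u)
    return adj

def avg_clustering(adj):
    # Mean fraction of closed triangles around each node.
    total = 0.0
    for u, nbrs in adj.items():
        if len(nbrs) < 2:
            continue
        links = sum(1 for v in nbrs for w in nbrs if v < w and w in adj[v])
        total += 2.0 * links / (len(nbrs) * (len(nbrs) - 1))
    return total / len(adj)

rng = random.Random(1)
er = erdos_renyi(200, 0.04, rng)       # mean degree ~8
sw = small_world(200, 4, 0.05, rng)    # degree 8 before rewiring
# At similar density, small-world graphs retain far higher clustering,
# which is exactly the kind of attribute that separates the two classes.
print("ER clustering:", round(avg_clustering(er), 3),
      "SW clustering:", round(avg_clustering(sw), 3))
```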
arXiv Detail & Related papers (2024-03-20T00:24:23Z) - Riemannian Residual Neural Networks [58.925132597945634]
We show how to extend the residual neural network (ResNet) to general Riemannian manifolds.
ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks.
arXiv Detail & Related papers (2023-10-16T02:12:32Z) - Generalization and Estimation Error Bounds for Model-based Neural
Networks [78.88759757988761]
We show that the generalization abilities of model-based networks for sparse recovery outperform those of regular ReLU networks.
We derive practical design rules for constructing model-based networks with guaranteed high generalization.
arXiv Detail & Related papers (2023-04-19T16:39:44Z) - Multi-agent Reinforcement Learning with Graph Q-Networks for Antenna
Tuning [60.94661435297309]
The scale of mobile networks makes it challenging to optimize antenna parameters using manual intervention or hand-engineered strategies.
We propose a new multi-agent reinforcement learning algorithm to optimize mobile network configurations globally.
We empirically demonstrate the performance of the algorithm on an antenna tilt tuning problem and a joint tilt and power control problem in a simulated environment.
arXiv Detail & Related papers (2023-01-20T17:06:34Z) - Recursive Construction of Stable Assemblies of Recurrent Neural Networks [0.0]
Advanced applications of machine learning will likely involve combinations of trained networks.
This paper takes a step in this direction by establishing contraction properties of broad classes of nonlinear recurrent networks and neural ODEs.
Results can be used to combine recurrent networks and physical systems with quantified contraction properties.
arXiv Detail & Related papers (2021-06-16T16:35:50Z) - Anomaly Detection on Attributed Networks via Contrastive Self-Supervised
Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z) - Firefly Neural Architecture Descent: a General Approach for Growing
Neural Networks [50.684661759340145]
Firefly neural architecture descent is a general framework for progressively and dynamically growing neural networks.
We show that firefly descent can flexibly grow networks both wider and deeper, and can be applied to learn accurate but resource-efficient neural architectures.
In particular, it learns networks that are smaller in size but have higher average accuracy than those learned by the state-of-the-art methods.
arXiv Detail & Related papers (2021-02-17T04:47:18Z) - Learning low-rank latent mesoscale structures in networks [1.1470070927586016]
We present a new approach for describing low-rank mesoscale structures in networks.
We use several synthetic network models and empirical friendship, collaboration, and protein--protein interaction (PPI) networks.
We show how to denoise a corrupted network by using only the latent motifs that one learns directly from the corrupted network.
arXiv Detail & Related papers (2021-02-13T18:54:49Z) - Online Estimation and Community Detection of Network Point Processes for
Event Streams [12.211623200731788]
A common goal in network modeling is to uncover the latent community structure present among nodes.
We propose a fast online variational inference algorithm for estimating the latent structure underlying dynamic event arrivals on a network.
We demonstrate that online inference can obtain comparable performance, in terms of community recovery, to non-online variants.
arXiv Detail & Related papers (2020-09-03T15:39:55Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - Modeling Dynamic Heterogeneous Network for Link Prediction using
Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed as DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.