MorphoNAS: Embryogenic Neural Architecture Search Through Morphogen-Guided Development
- URL: http://arxiv.org/abs/2507.13785v1
- Date: Fri, 18 Jul 2025 09:57:49 GMT
- Title: MorphoNAS: Embryogenic Neural Architecture Search Through Morphogen-Guided Development
- Authors: Mykola Glybovets, Sergii Medvid
- Abstract summary: We introduce MorphoNAS (Morphogenetic Neural Architecture Search), a system able to deterministically grow neural networks. In MorphoNAS, simple genomes encode only morphogen dynamics and threshold-based rules of cellular development.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While biological neural networks develop from compact genomes using relatively simple rules, modern neural architecture search methods mostly rely on explicit and routine manual work. In this paper, we introduce MorphoNAS (Morphogenetic Neural Architecture Search), a system able to deterministically grow neural networks through morphogenetic self-organization inspired by the Free Energy Principle, reaction-diffusion systems, and gene regulatory networks. In MorphoNAS, simple genomes encode only morphogen dynamics and threshold-based rules of cellular development. Nevertheless, this suffices for a single progenitor cell to self-organize into complex neural networks, with the entire process driven by local chemical interactions. Our evolutionary experiments focused on two different domains: structural targeting, in which MorphoNAS found fully successful genomes that generate predefined random graph configurations (8-31 nodes); and functional performance on the CartPole control task, achieving low-complexity 6-7 neuron solutions when evolutionary pressure toward network size minimization was applied. The evolutionary process successfully balanced the quality of the final solutions against neural architecture search effectiveness. Overall, our findings suggest that the proposed MorphoNAS method can grow complex, specific neural architectures using simple developmental rules, which suggests a feasible biological route to adaptive and efficient neural architecture search.
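The growth mechanism described in the abstract (morphogen diffusion plus threshold-triggered cell differentiation from a single progenitor) can be illustrated with a deliberately simplified one-dimensional sketch. All names, parameter values, and the chain-wiring rule below are illustrative assumptions, not the paper's actual implementation:

```python
def grow_network(steps=50, threshold=0.2, diffusion=0.1,
                 production=0.05, max_cells=8):
    """Toy sketch: grow a chain network from one progenitor cell
    via local morphogen diffusion and a threshold rule."""
    cells = [0]            # lattice sites occupied by living cells
    edges = []             # (parent, child) connections
    conc = {0: 1.0}        # morphogen concentration per lattice site
    for _ in range(steps):
        new_conc = dict(conc)
        # Diffusion + production: each cell leaks a fraction of its
        # morphogen to the site on its right and produces a constant amount.
        for c in cells:    # cells are appended left-to-right, so order is safe
            leak = diffusion * conc[c]
            new_conc[c] = new_conc.get(c, 0.0) - leak + production
            new_conc[c + 1] = new_conc.get(c + 1, 0.0) + leak
        conc = new_conc
        # Threshold rule: an empty site whose concentration crosses the
        # genome-encoded threshold differentiates into a new cell,
        # wired here (as an assumption) to its left-hand parent.
        for site, level in sorted(conc.items()):
            if site not in cells and level > threshold and len(cells) < max_cells:
                cells.append(site)
                edges.append((site - 1, site))
    return cells, edges
```

Run deterministically, the same genome-like parameter set always yields the same topology, which mirrors the deterministic development the paper emphasizes; the real system replaces this hand-wired chain rule with evolved morphogen dynamics and richer cellular rules.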
Related papers
- Evolution imposes an inductive bias that alters and accelerates learning dynamics [49.1574468325115]
We investigate the effect of evolutionary optimization on the learning dynamics of neural networks. We combined natural selection and online learning algorithms to produce a method for evolutionarily conditioning artificial neural networks. Results suggest evolution constitutes an inductive bias that tunes neural systems to enable rapid learning.
arXiv Detail & Related papers (2025-05-15T18:50:57Z) - Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z) - Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - Wet TinyML: Chemical Neural Network Using Gene Regulation and Cell Plasticity [5.9659016963634315]
Wet TinyML is a form of chemical-based neural network built on gene regulatory networks.
GRNNs can be used for conventional computing by employing an application-based search process.
We show that, through directed cell plasticity, the underlying mathematical regression can evolve, enabling it to match dynamic system applications.
arXiv Detail & Related papers (2024-03-13T14:00:18Z) - Brain-Inspired Machine Intelligence: A Survey of Neurobiologically-Plausible Credit Assignment [65.268245109828]
We examine algorithms for conducting credit assignment in artificial neural networks that are inspired or motivated by neurobiology.
We organize the ever-growing set of brain-inspired learning schemes into six general families and consider these in the context of backpropagation of errors.
The results of this review are meant to encourage future developments in neuro-mimetic systems and their constituent learning processes.
arXiv Detail & Related papers (2023-12-01T05:20:57Z) - Brain-inspired Evolutionary Architectures for Spiking Neural Networks [6.607406750195899]
We explore efficient architectural optimization for Spiking Neural Networks (SNNs).
This paper evolves SNN architectures by incorporating brain-inspired local modular structure and global cross-module connectivity.
We introduce an efficient multi-objective evolutionary algorithm based on a few-shot performance predictor, endowing SNNs with high performance, efficiency and low energy consumption.
arXiv Detail & Related papers (2023-09-11T06:39:11Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded in homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - An Artificial Neural Network Functionalized by Evolution [2.0625936401496237]
We propose a hybrid model which combines the tensor calculus of feed-forward neural networks with Pseudo-Darwinian mechanisms.
This allows for finding topologies that are well adapted to strategy elaboration, control problems, or pattern recognition tasks.
In particular, the model can provide adapted topologies at early evolutionary stages, as well as 'structural convergence', which can find applications in robotics, big data, and artificial life.
arXiv Detail & Related papers (2022-05-16T14:49:58Z) - On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future [60.99717891994599]
We propose an approach that extracts information from neuroevolutionary runs and uses it to build a metamodel.
We inspect the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics.
arXiv Detail & Related papers (2021-05-26T20:55:29Z) - Neural Architecture Search based on Cartesian Genetic Programming Coding Method [6.519170476143571]
We propose an evolutionary approach to NAS based on Cartesian Genetic Programming (CGP), called CGPNAS, to solve sentence classification tasks.
The experimental results show that the searched architectures are comparable in performance to human-designed architectures.
arXiv Detail & Related papers (2021-03-12T09:51:03Z) - A multi-agent model for growing spiking neural networks [0.0]
This project has explored rules for growing the connections between the neurons in Spiking Neural Networks as a learning mechanism.
Results in a simulation environment showed that for a given set of parameters it is possible to reach topologies that reproduce the tested functions.
This project also opens the door to using techniques such as genetic algorithms to obtain the best-suited values for the model parameters.
arXiv Detail & Related papers (2020-09-21T15:11:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.