Deep Learning-Based Operators for Evolutionary Algorithms
- URL: http://arxiv.org/abs/2407.10477v1
- Date: Mon, 15 Jul 2024 07:05:34 GMT
- Title: Deep Learning-Based Operators for Evolutionary Algorithms
- Authors: Eliad Shem-Tov, Moshe Sipper, Achiya Elyasaf
- Abstract summary: We present two novel domain-independent genetic operators that harness the capabilities of deep learning: a crossover operator for genetic algorithms and a mutation operator for genetic programming.
- Score: 1.7751300245073598
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present two novel domain-independent genetic operators that harness the capabilities of deep learning: a crossover operator for genetic algorithms and a mutation operator for genetic programming. Deep Neural Crossover leverages the capabilities of deep reinforcement learning and an encoder-decoder architecture to select offspring genes. BERT mutation masks multiple GP-tree nodes and then tries to replace these masks with nodes that will most likely improve the individual's fitness. We show the efficacy of both operators through experimentation.
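The BERT-mutation idea from the abstract can be sketched as follows. Everything below is an illustrative stand-in: the tree encoding, the primitive set, and especially the `score` function, which the paper implements as a trained masked-language model over GP trees rather than the fixed preference table used here.

```python
import random

# Toy GP expression tree: ("op", child, child) tuples; terminals are strings.
PRIMITIVES = ["add", "sub", "mul"]

def collect_paths(tree, path=()):
    """Return the paths of all internal (operator) nodes in the tree."""
    if isinstance(tree, tuple):
        paths = [path]
        for i, child in enumerate(tree[1:], start=1):
            paths += collect_paths(child, path + (i,))
        return paths
    return []

def replace_at(tree, path, op):
    """Return a copy of the tree with the operator at `path` replaced by `op`."""
    if not path:
        return (op,) + tree[1:]
    children = list(tree)
    children[path[0]] = replace_at(children[path[0]], path[1:], op)
    return tuple(children)

def score(op, context):
    """Stand-in for the trained model's prediction of which node is most
    likely to improve fitness; here just a fixed preference table."""
    return {"add": 0.5, "sub": 0.2, "mul": 0.3}[op]

def bert_style_mutation(tree, n_masks, rng):
    """Mask several operator nodes, then fill each mask with the
    highest-scoring primitive (greedy decoding of the stand-in model)."""
    paths = collect_paths(tree)
    for path in rng.sample(paths, min(n_masks, len(paths))):
        best = max(PRIMITIVES, key=lambda op: score(op, (tree, path)))
        tree = replace_at(tree, path, best)
    return tree

rng = random.Random(0)
parent = ("sub", ("mul", "x", "y"), ("sub", "x", "1"))
print(bert_style_mutation(parent, n_masks=2, rng=rng))
```

Because the stand-in scorer always prefers `add`, every masked node is rewritten to `add`; with a real model, each mask would be filled by the node the model predicts is most likely to improve fitness.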
Related papers
- MeGA: Merging Multiple Independently Trained Neural Networks Based on Genetic Algorithm [0.0]
We introduce a novel method for merging the weights of multiple pre-trained neural networks using a genetic algorithm called MeGA.
Our approach leverages a genetic algorithm with tournament selection, crossover, and mutation to optimize weight combinations, creating a more effective fusion.
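The search loop described above can be sketched as below. The toy weight vectors, the fitness function, and all hyperparameters are illustrative, not MeGA's: the actual method would evaluate each merged network's weights on validation data rather than against a known target vector.

```python
import random

rng = random.Random(1)

# Toy stand-ins for two pre-trained networks' flattened weight vectors.
PARENT_A = [0.9, 0.1, 0.8, 0.2]
PARENT_B = [0.1, 0.9, 0.2, 0.8]
TARGET   = [0.9, 0.9, 0.8, 0.8]  # pretend "ideal merge" for the toy fitness

def fitness(w):
    """Toy fitness: negative squared error to TARGET (higher is better)."""
    return -sum((wi - ti) ** 2 for wi, ti in zip(w, TARGET))

def tournament(pop, k=3):
    """Tournament selection: best of k randomly drawn individuals."""
    return max(rng.sample(pop, k), key=fitness)

def crossover(a, b):
    """Per-weight uniform crossover between two weight vectors."""
    return [ai if rng.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(w, rate=0.1, sigma=0.05):
    """Gaussian perturbation of each weight with probability `rate`."""
    return [wi + rng.gauss(0, sigma) if rng.random() < rate else wi for wi in w]

# Initialize the population as random per-weight mixes of the two parents.
pop = [crossover(PARENT_A, PARENT_B) for _ in range(20)]
for _ in range(50):
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in pop]

best = max(pop, key=fitness)
print(round(fitness(best), 3))
```

The evolved merge combines each parent's stronger weights, so its fitness exceeds either parent's alone.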
arXiv Detail & Related papers (2024-06-07T03:31:58Z) - Deep Neural Crossover [1.9950682531209156]
We present a novel multi-parent crossover operator in genetic algorithms (GAs) called "Deep Neural Crossover" (DNC).
Unlike conventional GA crossover operators that rely on a random selection of parental genes, DNC leverages the capabilities of deep reinforcement learning (DRL) and an encoder-decoder architecture to select the genes.
DNC is domain-independent and can be easily applied to other problem domains.
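The contrast with conventional crossover can be sketched as below. The `gene_policy` function is a hand-written stand-in for DNC's DRL-trained encoder-decoder (here it simply prefers larger gene values); all names are illustrative.

```python
import random

rng = random.Random(0)

def uniform_crossover(parents):
    """Conventional operator: pick each offspring gene uniformly at
    random from the parents' genes at that locus."""
    n = len(parents[0])
    return [rng.choice([p[i] for p in parents]) for i in range(n)]

def gene_policy(locus, candidates):
    """Stand-in for the trained encoder-decoder: chooses among the
    candidate genes at a locus; here it just prefers larger values."""
    return max(candidates)

def learned_selection_crossover(parents):
    """DNC-style operator: at each locus, let the policy choose among
    the parents' genes instead of sampling uniformly."""
    n = len(parents[0])
    return [gene_policy(i, [p[i] for p in parents]) for i in range(n)]

parents = [[3, 1, 4], [1, 5, 9], [2, 6, 5]]   # multi-parent: three parents
print(learned_selection_crossover(parents))    # → [3, 6, 9]
```

Note the operator is naturally multi-parent: the policy ranks one candidate gene per parent at each locus, whereas uniform crossover just samples one.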
arXiv Detail & Related papers (2024-03-17T09:50:20Z) - Genetic heterogeneity analysis using genetic algorithm and network science [2.6166087473624318]
Genome-wide association studies (GWAS) can identify disease susceptible genetic variables.
Genetic variables intertwined with genetic effects often exhibit lower effect sizes.
This paper introduces a novel feature selection mechanism for GWAS, named Feature Co-selection Network (FCSNet).
arXiv Detail & Related papers (2023-08-12T01:28:26Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
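The permutation symmetry mentioned above is easy to verify directly: permuting the hidden neurons of a two-layer network (reordering the rows of the first weight matrix together with the columns of the second) leaves the network's function unchanged. A minimal pure-Python check with arbitrary illustrative weights:

```python
import math

def mlp(x, W1, W2):
    """Two-layer MLP: hidden = tanh(W1 @ x), output = W2 @ hidden."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    return [sum(w * h for w, h in zip(row, hidden)) for row in W2]

def permute_hidden(W1, W2, perm):
    """Apply a permutation to the hidden neurons: reorder the rows of W1
    and, consistently, the columns of W2."""
    W1p = [W1[p] for p in perm]
    W2p = [[row[p] for p in perm] for row in W2]
    return W1p, W2p

W1 = [[0.5, -1.0], [2.0, 0.3], [-0.7, 1.2]]   # 3 hidden neurons, 2 inputs
W2 = [[1.0, -0.5, 0.25]]                      # 1 output neuron
x = [0.4, -0.9]

y_original = mlp(x, W1, W2)
W1p, W2p = permute_hidden(W1, W2, perm=[2, 0, 1])
y_permuted = mlp(x, W1p, W2p)
print(abs(y_original[0] - y_permuted[0]) < 1e-12)  # → True
```

Permutation equivariant neural functionals are architectures whose outputs respect exactly this symmetry of their weight-space inputs.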
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - DeepProphet2 -- A Deep Learning Gene Recommendation Engine [0.0]
The paper discusses the potential advantages of gene recommendation performed by artificial intelligence (AI).
A transformer-based model has been trained on a well-curated freely available paper corpus, PubMed.
A set of use cases illustrates the algorithm's potential applications in a real-world setting.
arXiv Detail & Related papers (2022-08-03T08:54:13Z) - Improving RNA Secondary Structure Design using Deep Reinforcement Learning [69.63971634605797]
We propose a new benchmark of applying reinforcement learning to RNA sequence design, in which the objective function is defined to be the free energy in the sequence's secondary structure.
We show results of the ablation analysis that we do for these algorithms, as well as graphs indicating the algorithm's performance across batches.
arXiv Detail & Related papers (2021-11-05T02:54:06Z) - Epigenetic evolution of deep convolutional models [81.21462458089142]
We build upon a previously proposed neuroevolution framework to evolve deep convolutional models.
We propose a convolutional layer layout which allows kernels of different shapes and sizes to coexist within the same layer.
The proposed layout enables the size and shape of individual kernels within a convolutional layer to be evolved with a corresponding new mutation operator.
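The kernel-shape mutation described above can be sketched as follows; the genome encoding (a list of per-kernel height/width pairs) and the mutation bounds are illustrative, not the paper's exact representation.

```python
import random

# Illustrative genome for one convolutional layer: each kernel is (height, width),
# and kernels of different shapes and sizes may coexist within the layer.
layer = [(3, 3), (3, 3), (5, 5), (1, 7)]

ALLOWED_SIZES = [1, 3, 5, 7]

def mutate_kernel_shapes(layer, rate, rng):
    """Mutation operator over kernel shapes: with probability `rate`,
    resample a kernel's height and width independently."""
    out = []
    for h, w in layer:
        if rng.random() < rate:
            h = rng.choice(ALLOWED_SIZES)
            w = rng.choice(ALLOWED_SIZES)
        out.append((h, w))
    return out

rng = random.Random(42)
child = mutate_kernel_shapes(layer, rate=0.5, rng=rng)
print(child)
```

In the neuroevolution framework itself, such a mutation would be followed by re-instantiating and training the mutated model to evaluate its fitness.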
arXiv Detail & Related papers (2021-04-12T12:45:16Z) - Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed network model, Operational Neural Networks (ONNs), generalizes conventional Convolutional Neural Networks (CNNs).
This study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm that constitutes the essential learning theory in biological neurons.
Experimental results over highly challenging problems demonstrate that the elite ONNs, even with few neurons and layers, achieve superior learning performance compared to GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z) - Self-Organized Operational Neural Networks with Generative Neurons [87.32169414230822]
ONNs are heterogeneous networks with a generalized neuron model that can encapsulate any set of non-linear operators.
We propose Self-organized ONNs (Self-ONNs) with generative neurons that have the ability to adapt (optimize) the nodal operator of each connection.
arXiv Detail & Related papers (2020-04-24T14:37:56Z) - New mechanism of combination crossover operators in genetic algorithm for solving the traveling salesman problem [2.578242050187029]
We propose two new crossover operators and a new mechanism for combining crossover operators in a genetic algorithm for solving the TSP.
Experimental results show that our proposed algorithm outperforms the genetic algorithm (GA) using MSCX in terms of minimum and mean cost values.
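For context, crossover on TSP tours must preserve the permutation property. The classic order crossover (OX) below, in a common left-to-right fill variant, illustrates the kind of operator being combined; it is not one of the paper's proposed operators, and MSCX itself is not reproduced here.

```python
import random

def order_crossover(p1, p2, rng):
    """Order crossover (OX) variant: copy a random slice from parent 1,
    then fill the remaining positions with parent 2's cities in their
    relative order, so the child is always a valid tour (a permutation)."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = p1[i:j + 1]
    used = set(child[i:j + 1])
    fill = [city for city in p2 if city not in used]
    k = 0
    for pos in range(n):
        if child[pos] is None:
            child[pos] = fill[k]
            k += 1
    return child

rng = random.Random(0)
tour_a = [0, 1, 2, 3, 4, 5]
tour_b = [3, 5, 0, 2, 1, 4]
print(order_crossover(tour_a, tour_b, rng))
```

A combination mechanism in this setting chooses among several such operators (each with different locality/ordering trade-offs) when producing each offspring.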
arXiv Detail & Related papers (2020-01-14T13:20:44Z) - Alpha Discovery Neural Network based on Prior Knowledge [55.65102700986668]
Genetic programming (GP) is the state of the art in the automated construction of financial features.
This paper proposes Alpha Discovery Neural Network (ADNN), a tailored neural network structure which can automatically construct diversified financial technical indicators.
arXiv Detail & Related papers (2019-12-26T03:10:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.