Obtaining Basic Algebra Formulas with Genetic Programming and Functional
Rewriting
- URL: http://arxiv.org/abs/2005.01207v1
- Date: Sun, 3 May 2020 23:32:36 GMT
- Title: Obtaining Basic Algebra Formulas with Genetic Programming and Functional
Rewriting
- Authors: Edwin Camilo Cubides and Jonatan Gomez
- Abstract summary: We use functional programming rewriting to boost inductive genetic programming.
Parents are selected following a tournament selection mechanism and the next population is obtained following a steady-state strategy.
We compare the performance of our technique in a set of hard problems (for classical genetic programming)
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we develop a set of genetic programming operators and an
initialization population process based on concepts of functional programming
rewriting for boosting inductive genetic programming. Such genetic operators
are used within a hybrid adaptive evolutionary algorithm that evolves operator
rates at the same time it evolves the solution. Solutions are represented using
recursive functions where the genome is encoded as an ordered list of trees and
the phenotype is written in a simple functional programming language that uses
rewriting as its operational semantics (computational model). The fitness is the
number of examples successfully deduced divided by the cardinality of the set of
examples. Parents are selected following a tournament selection mechanism, and
the next population is obtained following a steady-state strategy. The
evolutionary process can use some previous functions (programs) induced as
background knowledge. We compare the performance of our technique on a set of
problems that are hard for classical genetic programming. In particular, we take
as test-bed the problem of obtaining equivalent algebraic expressions for some
notable products (such as the square and the cube of a binomial), and the
recursive formulas for the sum of the first n natural numbers and the sum of the
first n squares.
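The evolutionary loop the abstract describes (example-based fitness, tournament selection, steady-state replacement) can be sketched as below. This is a minimal illustration, not the paper's implementation: the genome here is a toy pair of coefficients, and `run`, `crossover`, and `mutate` are hypothetical stand-ins for the paper's tree-based, rewriting-driven representation and operators.

```python
import random

def fitness(run, genome, examples):
    # The paper's fitness: examples successfully deduced / |examples|.
    ok = 0
    for x, y in examples:
        try:
            if run(genome, x) == y:
                ok += 1
        except Exception:
            pass  # a broken candidate simply scores nothing on this example
    return ok / len(examples)

def tournament(fits, k=3):
    # Index of the fittest among k randomly drawn individuals.
    contenders = random.sample(range(len(fits)), k)
    return max(contenders, key=lambda i: fits[i])

def steady_state_step(pop, fits, examples, run, crossover, mutate):
    # One steady-state iteration: breed a single child from two
    # tournament winners; it replaces the current worst individual
    # if it is at least as fit.
    child = mutate(crossover(pop[tournament(fits)], pop[tournament(fits)]))
    f = fitness(run, child, examples)
    worst = min(range(len(pop)), key=lambda i: fits[i])
    if f >= fits[worst]:
        pop[worst], fits[worst] = child, f

# --- Toy demo (hypothetical genomes: (a, b) pairs meaning a*x + b) ---
run = lambda g, x: g[0] * x + g[1]
examples = [(x, 2 * x + 1) for x in range(6)]    # target: 2*x + 1
pop = [(random.randint(0, 5), random.randint(0, 5)) for _ in range(20)]
fits = [fitness(run, g, examples) for g in pop]
crossover = lambda a, b: (a[0], b[1])            # recombine coefficient parts
mutate = lambda g: (g[0] + random.choice((-1, 0, 1)),
                    g[1] + random.choice((-1, 0, 1)))
for _ in range(300):
    steady_state_step(pop, fits, examples, run, crossover, mutate)
print(max(fits))  # best fraction of examples solved so far
```

In the paper the candidate programs are recursive functions evaluated by rewriting, so `run` would invoke the rewriting interpreter (possibly with previously induced functions available as background knowledge) rather than a fixed arithmetic expression.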
Related papers
- Guiding Genetic Programming with Graph Neural Networks [0.20718016474717196]
We propose EvoNUDGE, which uses a graph neural network to elicit additional knowledge from symbolic regression problems.
In an extensive experiment on a large number of problem instances, EvoNUDGE is shown to significantly outperform multiple baselines.
arXiv Detail & Related papers (2024-11-03T20:43:31Z)
- Analysing the Influence of Reorder Strategies for Cartesian Genetic Programming [0.0]
We introduce three novel operators which reorder the genotype of a graph defined by CGP.
We show that using CGP with a reorder method reduces the number of iterations until a solution is found and/or improves the fitness value.
However, there is no consistently best performing reorder operator.
arXiv Detail & Related papers (2024-10-01T08:59:01Z)
- Benchmarking Differential Evolution on a Quantum Simulator [0.0]
Differential Evolution (DE) can be used to compute the minima of functions such as the Rastrigin and Rosenbrock functions.
This work studies the result of applying the DE method to these functions, with candidate individuals generated by classical, Turing-modeled computation.
arXiv Detail & Related papers (2023-11-06T14:27:00Z)
- Symbolic Regression via Neural-Guided Genetic Programming Population Seeding [6.9501458586819505]
Symbolic regression is a discrete optimization problem generally believed to be NP-hard.
Prior approaches to solving the problem include neural-guided search and genetic programming.
We propose a neural-guided component used to seed the starting population of a random restart genetic programming component.
arXiv Detail & Related papers (2021-10-29T19:26:41Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- Recognizing and Verifying Mathematical Equations using Multiplicative Differential Neural Units [86.9207811656179]
We show that memory-augmented neural networks (NNs) can achieve higher-order, memory-augmented extrapolation, stable performance, and faster convergence.
Our models achieve a 1.53% average improvement over current state-of-the-art methods in equation verification and achieve a 2.22% Top-1 average accuracy and 2.96% Top-5 average accuracy for equation completion.
arXiv Detail & Related papers (2021-04-07T03:50:11Z)
- NaturalProofs: Mathematical Theorem Proving in Natural Language [132.99913141409968]
We develop NaturalProofs, a multi-domain corpus of mathematical statements and their proofs.
NaturalProofs unifies broad coverage, deep coverage, and low-resource mathematical sources.
We benchmark strong neural methods on mathematical reference retrieval and generation tasks.
arXiv Detail & Related papers (2021-03-24T03:14:48Z)
- Abelian Neural Networks [48.52497085313911]
We first construct a neural network architecture for Abelian group operations and derive a universal approximation property.
We extend it to Abelian semigroup operations using the characterization of associative symmetric polynomials.
We train our models over fixed word embeddings and demonstrate improved performance over the original word2vec.
arXiv Detail & Related papers (2021-02-24T11:52:21Z)
- Genetic optimization algorithms applied toward mission computability models [0.3655021726150368]
Genetic algorithms are computation-based and inexpensive to compute.
We apply our genetic optimization algorithms to a mission-critical and constraints-aware problem.
arXiv Detail & Related papers (2020-05-27T00:45:24Z)
- Automatic Differentiation in ROOT [62.997667081978825]
In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to evaluate the derivative of a function specified by a computer program.
This paper presents AD techniques available in ROOT, supported by Cling, to produce derivatives of arbitrary C/C++ functions.
arXiv Detail & Related papers (2020-04-09T09:18:50Z)
- The data-driven physical-based equations discovery using evolutionary approach [77.34726150561087]
We describe an algorithm for discovering mathematical equations from given observational data.
The algorithm combines genetic programming with sparse regression.
It can be used for governing analytical equation discovery as well as for partial differential equation (PDE) discovery.
arXiv Detail & Related papers (2020-04-03T17:21:57Z)
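The sparse-regression half of the genetic-programming-plus-sparse-regression combination in the last entry can be sketched as below. This is a minimal, self-contained illustration in the style of sequential thresholded least squares (STLSQ, as in SINDy-like methods), not the cited paper's algorithm: the evolutionary part that proposes candidate term libraries is omitted, and the term library, data, and threshold are illustrative assumptions.

```python
def library(x):
    # Hypothetical candidate terms for one scalar variable: 1, x, x^2, x^3.
    # In the full method, genetic programming would evolve this set.
    return [1.0, x, x * x, x ** 3]

def lstsq(A, b):
    # Least squares via the normal equations A^T A w = A^T b,
    # solved by Gaussian elimination with partial pivoting.
    n = len(A[0])
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
         for i in range(n)]
    v = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(n)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
            v[r] -= f * v[i]
    w = [0.0] * n
    for i in reversed(range(n)):
        w[i] = (v[i] - sum(M[i][j] * w[j] for j in range(i + 1, n))) / M[i][i]
    return w

def stlsq(xs, ys, threshold=0.1, iters=5):
    # Sequential thresholded least squares: fit, zero out small
    # coefficients, refit on the surviving terms.
    n = len(library(xs[0]))
    active = list(range(n))
    coeffs = [0.0] * n
    for _ in range(iters):
        A = [[library(x)[j] for j in active] for x in xs]
        w = lstsq(A, ys)
        coeffs = [0.0] * n
        for j, wj in zip(active, w):
            coeffs[j] = wj
        active = [j for j in active if abs(coeffs[j]) > threshold]
        if not active:
            break
    return coeffs

# Synthetic data generated from y = 3*x^2 + 0.5; the fit should keep
# only the constant and x^2 terms.
xs = [i / 10 for i in range(-20, 21)]
ys = [3 * x * x + 0.5 for x in xs]
c = stlsq(xs, ys)
print(c)
```

The discovered coefficient vector corresponds to the equation y = 0.5 + 3*x^2, with the spurious x and x^3 terms pruned by the threshold; in the paper's setting the same pruning idea selects which evolved terms survive in the governing equation or PDE.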
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.