Co-evolutionary Probabilistic Structured Grammatical Evolution
- URL: http://arxiv.org/abs/2204.08985v1
- Date: Tue, 19 Apr 2022 16:35:22 GMT
- Title: Co-evolutionary Probabilistic Structured Grammatical Evolution
- Authors: Jessica Mégane, Nuno Lourenço, and Penousal Machado
- Abstract summary: This work proposes an extension to Structured Grammatical Evolution (SGE) called Co-evolutionary Probabilistic Structured Grammatical Evolution (Co-PSGE).
In Co-PSGE, each individual in the population is composed of a grammar and a genotype, which is a list of dynamic lists.
The performance of the proposed approach is compared to three methods, namely Grammatical Evolution (GE), Probabilistic Grammatical Evolution (PGE), and SGE, on four benchmark problems.
- Score: 0.5156484100374059
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work proposes an extension to Structured Grammatical Evolution (SGE)
called Co-evolutionary Probabilistic Structured Grammatical Evolution
(Co-PSGE). In Co-PSGE, each individual in the population is composed of a
grammar and a genotype, which is a list of dynamic lists, each corresponding to
a non-terminal of the grammar and containing real numbers that correspond to
the probability of choosing a derivation rule. Each individual uses its own
grammar to map the genotype into a program. During the evolutionary process,
both the grammar and the genotype are subject to variation operators. The
performance of the proposed approach is compared to three methods, namely
Grammatical Evolution (GE), Probabilistic Grammatical Evolution (PGE), and SGE,
on four benchmark problems. The results show the effectiveness of the approach,
as Co-PSGE outperforms all the other methods with statistically significant
differences on the majority of the problems.
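The representation described in the abstract can be sketched in code. The following is a minimal, hedged illustration only: the toy grammar, the roulette-wheel interpretation of codons, and the on-demand growth of the dynamic lists are assumptions of this sketch, not the authors' implementation.

```python
import random

# Hypothetical toy grammar: non-terminal -> list of derivation rules.
GRAMMAR = {
    "expr": [["expr", "op", "expr"], ["var"]],
    "op": [["+"], ["-"]],
    "var": [["x"], ["1"]],
}

def make_individual():
    # Each individual carries its OWN grammar probabilities (uniform at
    # start) and a genotype: one dynamic list of codons per non-terminal.
    probs = {nt: [1.0 / len(rules)] * len(rules) for nt, rules in GRAMMAR.items()}
    genotype = {nt: [] for nt in GRAMMAR}
    return probs, genotype

def choose_rule(nt, codon, probs):
    # Interpret a real-valued codon in [0, 1) as a rule choice under the
    # non-terminal's probability distribution (roulette-wheel style).
    cumulative = 0.0
    for i, p in enumerate(probs[nt]):
        cumulative += p
        if codon <= cumulative:
            return i
    return len(probs[nt]) - 1

def map_genotype(probs, genotype, symbol="expr", depth=0, max_depth=4):
    # Recursively expand non-terminals into a program string; codons are
    # generated on demand, which is what makes the per-non-terminal lists
    # "dynamic".
    if symbol not in GRAMMAR:       # terminal symbol: emit as-is
        return symbol
    genotype[symbol].append(random.random())
    codon = genotype[symbol][-1]
    options = GRAMMAR[symbol]
    if depth >= max_depth:          # force the terminal-leaning last rule
        idx = len(options) - 1
    else:
        idx = choose_rule(symbol, codon, probs)
    return "".join(map_genotype(probs, genotype, s, depth + 1, max_depth)
                   for s in options[idx])

probs, genotype = make_individual()
program = map_genotype(probs, genotype)
```

In the full approach, variation operators would also perturb `probs` (the grammar) alongside the genotype, so each individual co-evolves its own rule probabilities.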
Related papers
- VQDNA: Unleashing the Power of Vector Quantization for Multi-Species Genomic Sequence Modeling [60.91599380893732]
VQDNA is a general-purpose framework that renovates genome tokenization from the perspective of genome vocabulary learning.
By leveraging vector-quantized codebooks as learnable vocabulary, VQDNA can adaptively tokenize genomes into pattern-aware embeddings.
arXiv Detail & Related papers (2024-05-13T20:15:03Z) - Efficient and Scalable Fine-Tune of Language Models for Genome Understanding [49.606093223945734]
We present Lingo: Language prefix fIne-tuning for GenOmes.
Unlike DNA foundation models, Lingo strategically leverages natural language foundation models' contextual cues.
Lingo further accommodates numerous downstream fine-tuning tasks via an adaptive rank sampling method.
arXiv Detail & Related papers (2024-02-12T21:40:45Z) - Context Matters: Adaptive Mutation for Grammars [2.3577368017815705]
This work proposes Adaptive Facilitated Mutation, a self-adaptive mutation method for Structured Grammatical Evolution (SGE).
In our proposed mutation, each individual contains an array with a different, self-adaptive mutation rate for each non-terminal.
Experiments were conducted on three symbolic regression benchmarks using Probabilistic Structured Grammatical Evolution (PSGE), a variant of SGE.
arXiv Detail & Related papers (2023-03-25T17:26:20Z) - Equivariant Transduction through Invariant Alignment [71.45263447328374]
We introduce a novel group-equivariant architecture that incorporates a group-invariant hard alignment mechanism.
We find that our network's structure allows it to develop stronger equivariant properties than existing group-equivariant approaches.
We additionally find that it outperforms previous group-equivariant networks empirically on the SCAN task.
arXiv Detail & Related papers (2022-09-22T11:19:45Z) - Probabilistic Structured Grammatical Evolution [0.5156484100374059]
We propose Probabilistic Structured Grammatical Evolution (PSGE).
PSGE combines the Structured Grammatical Evolution (SGE) and Probabilistic Grammatical Evolution (PGE) representation variants and mapping mechanisms.
PSGE statistically outperformed Grammatical Evolution (GE) on all six benchmark problems studied.
arXiv Detail & Related papers (2022-05-21T22:29:10Z) - Graph Adaptive Semantic Transfer for Cross-domain Sentiment Classification [68.06496970320595]
Cross-domain sentiment classification (CDSC) aims to use the transferable semantics learned from the source domain to predict the sentiment of reviews in the unlabeled target domain.
We present Graph Adaptive Semantic Transfer (GAST) model, an adaptive syntactic graph embedding method that is able to learn domain-invariant semantics from both word sequences and syntactic graphs.
arXiv Detail & Related papers (2022-05-18T07:47:01Z) - Top-N: Equivariant set and graph generation without exchangeability [61.24699600833916]
We consider one-shot probabilistic decoders that map a vector-shaped prior to a distribution over sets or graphs.
These functions can be integrated into variational autoencoders (VAE), generative adversarial networks (GAN) or normalizing flows.
Top-n is a deterministic, non-exchangeable set creation mechanism which learns to select the most relevant points from a trainable reference set.
arXiv Detail & Related papers (2021-10-05T14:51:19Z) - Commutative Lie Group VAE for Disentanglement Learning [96.32813624341833]
We view disentanglement learning as discovering an underlying structure that equivariantly reflects the factorized variations shown in data.
A simple model named Commutative Lie Group VAE is introduced to realize the group-based disentanglement learning.
Experiments show that our model can effectively learn disentangled representations without supervision, and can achieve state-of-the-art performance without extra constraints.
arXiv Detail & Related papers (2021-06-07T07:03:14Z) - Probabilistic Grammatical Evolution [0.6445605125467573]
We propose Probabilistic Grammatical Evolution (PGE) to address some of Grammatical Evolution's (GE) main issues and improve its performance.
We resort to a Probabilistic Context-Free Grammar (PCFG) whose probabilities are adapted during the evolutionary process.
We evaluate the performance of PGE on two regression problems and compare it with GE and Structured Grammatical Evolution (SGE).
arXiv Detail & Related papers (2021-03-15T13:54:26Z) - On the Performance of Metaheuristics: A Different Perspective [0.0]
We study some basic evolutionary and swarm-intelligence metaheuristics, i.e., Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), Teaching-Learning-Based Optimization (TLBO), and Cuckoo Optimization Algorithm (COA).
A large number of experiments were conducted on 20 optimization benchmark functions with different characteristics, and the results reveal some fundamental conclusions, as well as a ranking order among these metaheuristics.
arXiv Detail & Related papers (2020-01-24T09:34:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.