Probabilistic Structured Grammatical Evolution
- URL: http://arxiv.org/abs/2205.10685v1
- Date: Sat, 21 May 2022 22:29:10 GMT
- Title: Probabilistic Structured Grammatical Evolution
- Authors: Jessica Mégane, Nuno Lourenço, and Penousal Machado
- Abstract summary: We propose Probabilistic Structured Grammatical Evolution (PSGE).
PSGE combines the Structured Grammatical Evolution (SGE) and Probabilistic Grammatical Evolution (PGE) representation variants and mapping mechanisms.
PSGE statistically outperformed Grammatical Evolution (GE) on all six benchmark problems studied.
- Score: 0.5156484100374059
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The grammars used in grammar-based Genetic Programming (GP) methods have a
significant impact on the quality of the solutions generated since they define
the search space by restricting the solutions to their syntax. In this work, we
propose Probabilistic Structured Grammatical Evolution (PSGE), a new approach
that combines the Structured Grammatical Evolution (SGE) and Probabilistic
Grammatical Evolution (PGE) representation variants and mapping mechanisms. The
genotype is a set of dynamic lists, one for each non-terminal in the grammar,
with each element of the list representing a probability used to select the
next Probabilistic Context-Free Grammar (PCFG) derivation rule. PSGE
statistically outperformed Grammatical Evolution (GE) on all six benchmark
problems studied. Compared to PGE, PSGE performed better on 4 of the 6 problems
analyzed.
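Below is a minimal Python sketch of the mapping idea described in the abstract: one dynamic list of codons per non-terminal, where each codon selects a PCFG derivation rule by falling into that rule's cumulative-probability interval. The toy grammar, function names, and the depth-limit fallback are illustrative assumptions, not the authors' implementation.

```python
import random

# Hypothetical toy PCFG: each non-terminal maps to (expansion, probability)
# pairs whose probabilities sum to 1 (an assumption made for this example).
PCFG = {
    "<expr>": [(["<expr>", "<op>", "<expr>"], 0.5), (["<var>"], 0.5)],
    "<op>":   [(["+"], 0.5), (["-"], 0.5)],
    "<var>":  [(["x"], 1.0)],
}

def map_genotype(genotype, grammar, start="<expr>", max_depth=8):
    """Expand `start` using one dynamic list of codons per non-terminal.
    Each codon (a float in [0, 1)) picks the rule whose cumulative
    probability interval contains it."""
    used = {nt: 0 for nt in grammar}              # next codon index per non-terminal

    def expand(symbol, depth):
        if symbol not in grammar:                 # terminal symbol
            return symbol
        codons = genotype.setdefault(symbol, [])
        if used[symbol] >= len(codons):           # "dynamic": grow the list on demand
            codons.append(random.random())
        codon = codons[used[symbol]]
        used[symbol] += 1

        rules = grammar[symbol]
        if depth >= max_depth:                    # simplified termination guard
            rules = [min(rules, key=lambda r: len(r[0]))]
        chosen = rules[-1][0]                     # fallback guards float rounding
        cumulative = 0.0
        for expansion, prob in rules:
            cumulative += prob
            if codon <= cumulative:
                chosen = expansion
                break
        return "".join(expand(s, depth + 1) for s in chosen)

    return expand(start, 0)

genotype = {"<expr>": [0.2, 0.9, 0.9], "<op>": [0.1], "<var>": []}
print(map_genotype(genotype, PCFG))               # e.g. "x+x"
```

In SGE-style representations, variation operators act on these per-non-terminal lists rather than on a single linear genome, which is the structural property the combined PSGE representation builds on.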
Related papers
- VQDNA: Unleashing the Power of Vector Quantization for Multi-Species Genomic Sequence Modeling [60.91599380893732]
VQDNA is a general-purpose framework that renovates genome tokenization from the perspective of genome vocabulary learning.
By leveraging vector-quantized codebooks as learnable vocabulary, VQDNA can adaptively tokenize genomes into pattern-aware embeddings.
arXiv Detail & Related papers (2024-05-13T20:15:03Z) - Efficient and Scalable Fine-Tune of Language Models for Genome
Understanding [49.606093223945734]
We present Lingo: Language prefix fIne-tuning for GenOmes.
Unlike DNA foundation models, Lingo strategically leverages natural language foundation models' contextual cues.
Lingo further accommodates numerous downstream fine-tuning tasks via an adaptive rank sampling method.
arXiv Detail & Related papers (2024-02-12T21:40:45Z) - Compositional Program Generation for Few-Shot Systematic Generalization [59.57656559816271]
This study presents a neuro-symbolic architecture called the Compositional Program Generator (CPG).
CPG has three key features: modularity, composition, and abstraction, in the form of grammar rules.
It achieves perfect generalization on both the SCAN and COGS benchmarks using just 14 examples for SCAN and 22 examples for COGS.
arXiv Detail & Related papers (2023-09-28T14:33:20Z) - From the One, Judge of the Whole: Typed Entailment Graph Construction
with Predicate Generation [69.91691115264132]
Entailment Graphs (EGs) are constructed to indicate context-independent entailment relations in natural languages.
In this paper, we propose a multi-stage method, Typed Predicate-Entailment Graph Generator (TP-EGG) to tackle this problem.
Experiments on benchmark datasets show that TP-EGG can generate high-quality and scale-controllable entailment graphs.
arXiv Detail & Related papers (2023-06-07T05:46:19Z) - Context Matters: Adaptive Mutation for Grammars [2.3577368017815705]
This work proposes Adaptive Facilitated Mutation, a self-adaptive mutation method for Structured Grammatical Evolution (SGE).
In our proposed mutation, each individual contains an array with a different, self-adaptive mutation rate for each non-terminal.
Experiments were conducted on three symbolic regression benchmarks using Probabilistic Structured Grammatical Evolution (PSGE), a variant of SGE.
arXiv Detail & Related papers (2023-03-25T17:26:20Z) - Co-evolutionary Probabilistic Structured Grammatical Evolution [0.5156484100374059]
This work proposes an extension to Structured Grammatical Evolution (SGE) called Co-evolutionary Probabilistic Structured Grammatical Evolution (Co-PSGE).
In Co-PSGE, each individual in the population is composed of a grammar and a genotype, which is a list of dynamic lists.
The performance of the proposed approach is compared with three other methods, namely Grammatical Evolution (GE), Probabilistic Grammatical Evolution (PGE), and SGE, on four different benchmark problems.
arXiv Detail & Related papers (2022-04-19T16:35:22Z) - Initialisation and Grammar Design in Grammar-Guided Evolutionary
Computation [0.0]
We show that context-free grammar genetic programming (CFG-GP) is less sensitive to initialisation and grammar design than random search and GE.
We also demonstrate that the observed cases of poor performance by CFG-GP can be managed through simple adjustment of tuning parameters.
arXiv Detail & Related papers (2022-04-15T10:15:40Z) - Learning of Structurally Unambiguous Probabilistic Grammars [7.347989843033034]
We show that a CMTA can be converted into a probabilistic grammar.
This conversion yields a complete algorithm for learning a structurally unambiguous probabilistic context-free grammar.
arXiv Detail & Related papers (2022-03-17T17:01:51Z) - Grounded Graph Decoding Improves Compositional Generalization in
Question Answering [68.72605660152101]
Question answering models struggle to generalize to novel compositions of training patterns, such as longer sequences or more complex test structures.
We propose Grounded Graph Decoding, a method to improve compositional generalization of language representations by grounding structured predictions with an attention mechanism.
Our model significantly outperforms state-of-the-art baselines on the Compositional Freebase Questions (CFQ) dataset, a challenging benchmark for compositional generalization in question answering.
arXiv Detail & Related papers (2021-11-05T17:50:14Z) - A Syntax-Guided Grammatical Error Correction Model with Dependency Tree
Correction [83.14159143179269]
Grammatical Error Correction (GEC) is the task of detecting and correcting grammatical errors in sentences.
We propose a syntax-guided GEC model (SG-GEC) which adopts the graph attention mechanism to utilize the syntactic knowledge of dependency trees.
We evaluate our model on public GEC benchmarks, where it achieves competitive results.
arXiv Detail & Related papers (2021-11-05T07:07:48Z) - Probabilistic Grammatical Evolution [0.6445605125467573]
We propose Probabilistic Grammatical Evolution (PGE) to address some of the main issues of Grammatical Evolution (GE) and improve its performance.
We resort to a Probabilistic Context-Free Grammar (PCFG) whose probabilities are adapted during the evolutionary process (a generic sketch of this idea follows this entry).
We evaluate the performance of PGE on two regression problems and compare it with GE and Structured Grammatical Evolution (SGE).
arXiv Detail & Related papers (2021-03-15T13:54:26Z)
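As a companion to the PGE entry above, the following is a generic sketch of adapting PCFG rule probabilities toward the rules used by a well-performing individual and re-normalising them. The exact update rule and learning factor used in PGE differ in detail; the function below is an assumption made for illustration.

```python
def adapt_probabilities(grammar, best_rule_counts, learning_factor=0.01):
    """Nudge each non-terminal's rule probabilities toward the usage
    frequencies observed in the best individual, then re-normalize.
    `grammar` maps non-terminals to lists of (expansion, probability);
    `best_rule_counts` maps non-terminals to per-rule usage counts.
    Illustrative only: not the update rule from the PGE paper."""
    for nt, rules in grammar.items():
        counts = best_rule_counts.get(nt, [0] * len(rules))
        total = sum(counts)
        if total == 0:                     # non-terminal unused by the best individual
            continue
        updated = []
        for (expansion, prob), count in zip(rules, counts):
            target = count / total         # empirical frequency in the best individual
            updated.append((expansion, prob + learning_factor * (target - prob)))
        norm = sum(p for _, p in updated)
        grammar[nt] = [(exp, p / norm) for exp, p in updated]

# Example usage with the toy PCFG sketched earlier:
# adapt_probabilities(PCFG, {"<expr>": [1, 2], "<op>": [1, 0]})
```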
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.