Combining Neuroevolution with the Search for Novelty to Improve the Generation of Test Inputs for Games
- URL: http://arxiv.org/abs/2407.04985v1
- Date: Sat, 6 Jul 2024 07:36:44 GMT
- Title: Combining Neuroevolution with the Search for Novelty to Improve the Generation of Test Inputs for Games
- Authors: Patric Feldmeier, Gordon Fraser
- Abstract summary: As games challenge traditional automated white-box test generators, Neatest generates test suites consisting of neural networks that exercise the source code by playing the games.
We investigate whether the issue of challenging fitness landscapes can be addressed by promoting novel behaviours during the search.
- Score: 9.465831527115235
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: As games challenge traditional automated white-box test generators, the Neatest approach generates test suites consisting of neural networks that exercise the source code by playing the games. Neatest generates these neural networks using an evolutionary algorithm that is guided by an objective function targeting individual source code statements. This approach works well if the objective function provides sufficient guidance, but deceiving or complex fitness landscapes may inhibit the search. In this paper, we investigate whether the issue of challenging fitness landscapes can be addressed by promoting novel behaviours during the search. Our case study on two Scratch games demonstrates that rewarding novel behaviours is a promising approach for overcoming challenging fitness landscapes, thus enabling future research on how to adapt the search algorithms to best use this information.
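The mechanism behind "promoting novel behaviours" is well captured by the classic novelty metric of Lehman and Stanley: score each candidate by how far its behaviour lies from behaviours seen earlier in the search. A minimal sketch, assuming behaviours are summarised as fixed-length NumPy vectors (the paper's actual behaviour characterisation for Scratch games, and how novelty is combined with the statement-distance fitness, may differ):

```python
import numpy as np

def novelty_score(behaviour, archive, k=15):
    """Mean Euclidean distance from `behaviour` (a NumPy vector) to its
    k nearest neighbours among previously observed behaviour vectors."""
    if not archive:
        return float("inf")  # the first behaviour is maximally novel
    dists = np.sort([np.linalg.norm(behaviour - b) for b in archive])
    return float(dists[:k].mean())
```

How this novelty reward is traded off against the coverage objective, for instance as a weighted sum or an alternating objective, is exactly the design space the paper's case study opens up.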
Related papers
- Many-Objective Neuroevolution for Testing Games [8.422309223970302]
The test generator NEATEST tackles these challenges by combining search-based software testing principles with neuroevolution.
We transform NEATEST into a many-objective search algorithm that targets several program states simultaneously.
Our experiments show that extending NEATEST to target several objectives simultaneously increases the average branch coverage from 75.88% to 81.33%.
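One hedged reading of "targeting several program states simultaneously": keep one selection slot per still-uncovered objective, in the spirit of many-objective test generation algorithms such as MOSA (the transformed NEATEST's actual selection scheme may differ). A sketch:

```python
def select_parents(population, objectives):
    """population: candidate networks.
    objectives: functions mapping a network to a fitness distance,
    where 0.0 means the target statement is covered.
    Returns one parent per objective that is not yet covered."""
    parents = []
    for distance in objectives:
        best = min(population, key=distance)
        if distance(best) > 0.0:  # objective still uncovered
            parents.append(best)
    return parents
```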
arXiv Detail & Related papers (2025-01-14T09:18:34Z)
- Life, uh, Finds a Way: Systematic Neural Search [2.163881720692685]
We tackle the challenge of rapidly adapting an agent's behavior to solve continuous problems in novel settings.
Instead of focusing on deep reinforcement learning, we propose viewing behavior as the physical manifestation of a search procedure.
We describe an algorithm that implicitly enumerates behaviors by regulating the tight feedback loop between execution of behaviors and mutation of the graph.
arXiv Detail & Related papers (2024-10-02T09:06:54Z)
- Assaying on the Robustness of Zero-Shot Machine-Generated Text Detectors [57.7003399760813]
We explore advanced Large Language Models (LLMs) and their specialized variants, contributing to this field in several ways.
We uncover a significant correlation between topics and detection performance.
These investigations shed light on the adaptability and robustness of these detection methods across diverse topics.
arXiv Detail & Related papers (2023-12-20T10:53:53Z)
- The Clock and the Pizza: Two Stories in Mechanistic Explanation of Neural Networks [59.26515696183751]
We show that algorithm discovery in neural networks is sometimes more complex than expected.
We show that even simple learning problems can admit a surprising diversity of solutions.
arXiv Detail & Related papers (2023-06-30T17:59:13Z)
- CorpusBrain: Pre-train a Generative Retrieval Model for Knowledge-Intensive Language Tasks [62.22920673080208]
A single-step generative model can dramatically simplify the search process and be optimized in an end-to-end manner.
We name the pre-trained generative retrieval model CorpusBrain, as all information about the corpus is encoded in its parameters without the need to construct an additional index.
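Index-free retrieval reduces to ranking document identifiers by the model's own score for generating them from the query. A toy sketch with a stand-in `log_likelihood(query, docid)` function (a hypothetical interface; CorpusBrain itself decodes identifiers with constrained generation):

```python
def retrieve(query, docids, log_likelihood, k=5):
    """All corpus knowledge lives in the model's parameters, so retrieval
    is just ranking candidate identifiers by the model's conditional
    score for emitting them given the query; no inverted index is built."""
    return sorted(docids, key=lambda d: log_likelihood(query, d),
                  reverse=True)[:k]
```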
arXiv Detail & Related papers (2022-08-16T10:22:49Z)
- Improving exploration in policy gradient search: Application to symbolic optimization [6.344988093245026]
Many machine learning strategies leverage neural networks to search large spaces of mathematical symbols.
In contrast to traditional evolutionary approaches, using a neural network at the core of the search allows learning higher-level symbolic patterns.
We show that these techniques can improve the performance, increase sample efficiency, and lower the complexity of solutions for the task of symbolic regression.
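A staple of this line of work (introduced by deep symbolic regression, on which such methods build) is the risk-seeking policy gradient: update the policy only on the top-epsilon fraction of sampled expressions, chasing best-case rather than average reward. A hedged sketch of the sample weighting, not necessarily this paper's exact estimator:

```python
import numpy as np

def risk_seeking_weights(rewards, epsilon=0.05):
    """Zero out every sample below the (1 - epsilon) reward quantile;
    surviving samples are weighted by their margin above the threshold."""
    rewards = np.asarray(rewards, dtype=float)
    threshold = np.quantile(rewards, 1.0 - epsilon)
    return np.where(rewards >= threshold, rewards - threshold, 0.0)
```

Each sampled expression's log-probability gradient is then scaled by its weight before the policy update.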
arXiv Detail & Related papers (2021-07-19T21:11:07Z)
- AdaLead: A simple and robust adaptive greedy search algorithm for sequence design [55.41644538483948]
We develop an easy-to-use, scalable, and robust evolutionary greedy algorithm (AdaLead).
AdaLead is a remarkably strong benchmark that out-competes more complex state-of-the-art approaches in a variety of biologically motivated sequence design challenges.
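In outline, each AdaLead round seeds from measured sequences close to the current best, then greedily mutates them while a surrogate model predicts improvement. A simplified sketch (recombination and multi-step rollouts omitted; the thresholding and parameter names are illustrative):

```python
def adalead_round(measured, mutate, surrogate, kappa=0.5, batch_size=10):
    """measured: dict mapping sequence -> measured fitness.
    mutate: returns a mutated copy of a sequence.
    surrogate: model predicting fitness of unmeasured sequences."""
    best = max(measured.values())
    seeds = [s for s, f in measured.items() if f >= kappa * best]
    children = []
    for seed in seeds:
        child = mutate(seed)
        if surrogate(child) > surrogate(seed):  # greedy hill-climbing step
            children.append(child)
    # propose the children the surrogate scores highest for the next round
    return sorted(children, key=surrogate, reverse=True)[:batch_size]
```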
arXiv Detail & Related papers (2020-10-05T16:40:38Z)
- Binary Neural Networks: A Survey [126.67799882857656]
The binary neural network serves as a promising technique for deploying deep models on resource-limited devices.
Binarization inevitably causes severe information loss and, even worse, its discontinuity makes the deep network difficult to optimize.
We present a survey of these algorithms, categorized into native solutions that conduct binarization directly, and optimized ones that use techniques such as minimizing the quantization error, improving the network loss function, and reducing the gradient error.
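The two pain points named above have standard remedies that the survey catalogues: sign binarization in the forward pass, and a straight-through estimator to push gradients through the non-differentiable sign. A minimal NumPy sketch:

```python
import numpy as np

def binarize(w):
    """Forward pass: deterministic sign binarization of real-valued weights."""
    return np.where(w >= 0.0, 1.0, -1.0)

def ste_backward(grad_out, w, clip=1.0):
    """Backward pass: sign() has zero gradient almost everywhere, so the
    straight-through estimator copies the incoming gradient, zeroed where
    |w| exceeds the clip range (the common hard-tanh variant)."""
    return grad_out * (np.abs(w) <= clip)
```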
arXiv Detail & Related papers (2020-03-31T16:47:20Z)
- Meta-learning curiosity algorithms [26.186627089223624]
We formulate the problem of generating curious behavior as one of meta-learning.
Our rich language of programs combines neural networks with other building blocks such as buffers, nearest-neighbor modules and custom loss functions.
We find two novel curiosity algorithms that perform on par or better than human-designed published curiosity algorithms in domains as disparate as grid navigation with image inputs, acrobot, lunar lander, ant and hopper.
arXiv Detail & Related papers (2020-03-11T14:25:43Z)
- AutoML-Zero: Evolving Machine Learning Algorithms From Scratch [76.83052807776276]
We show that it is possible to automatically discover complete machine learning algorithms just using basic mathematical operations as building blocks.
We demonstrate this by introducing a novel framework that significantly reduces human bias through a generic search space.
We believe these preliminary successes in discovering machine learning algorithms from scratch indicate a promising new direction in the field.
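To give a flavour of the search space: a candidate "learning algorithm" is a list of instructions over a small register file of basic math operations, and evolution applies random point mutations. A toy sketch (the real system evolves separate setup/predict/learn functions over scalars, vectors, and matrices):

```python
import random

OPS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}

def random_instruction(n_regs):
    """(op, src_a, src_b, dst): write op(reg[src_a], reg[src_b]) to reg[dst]."""
    return (random.choice(list(OPS)), random.randrange(n_regs),
            random.randrange(n_regs), random.randrange(n_regs))

def execute(program, regs):
    """Run a linear program over a mutable list of register values."""
    for op, a, b, dst in program:
        regs[dst] = OPS[op](regs[a], regs[b])
    return regs

def mutate(program, n_regs):
    """Point mutation: replace one randomly chosen instruction."""
    child = list(program)
    child[random.randrange(len(child))] = random_instruction(n_regs)
    return child
```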
arXiv Detail & Related papers (2020-03-06T19:00:04Z)
- Neuroevolution of Neural Network Architectures Using CoDeepNEAT and Keras [0.0]
A large portion of the work in a machine learning project is determining the best type of algorithm to solve a given problem.
Finding the optimal network topology and configurations for a given problem is a challenge that requires domain knowledge and testing efforts.
arXiv Detail & Related papers (2020-02-11T19:03:34Z)