A First Step Towards Runtime Analysis of Evolutionary Neural Architecture Search
- URL: http://arxiv.org/abs/2401.11712v2
- Date: Mon, 8 Apr 2024 12:36:54 GMT
- Title: A First Step Towards Runtime Analysis of Evolutionary Neural Architecture Search
- Authors: Zeqiong Lv, Chao Qian, Yanan Sun
- Abstract summary: This work takes preliminary steps toward the mathematical runtime analysis of ENAS.
We define a binary classification problem $\textsc{UNIFORM}$, and formulate an explicit fitness function to represent the relationship between neural architecture and classification accuracy.
The theoretical results show that the local and global mutations achieve nearly the same performance on $\textsc{UNIFORM}$.
- Score: 24.056523078277053
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Evolutionary neural architecture search (ENAS) employs evolutionary algorithms to find high-performing neural architectures automatically, and has achieved great success. However, compared to its empirical success, its rigorous theoretical analysis has yet to be touched. This work takes preliminary steps toward the mathematical runtime analysis of ENAS. In particular, we define a binary classification problem $\textsc{UNIFORM}$, and formulate an explicit fitness function to represent the relationship between neural architecture and classification accuracy. Furthermore, we consider the (1+1)-ENAS algorithm with mutation to optimize the neural architecture, and obtain the following runtime bounds: both the local and global mutations find the optimum in an expected runtime of $\Theta(n)$, where $n$ is the problem size. The theoretical results show that the local and global mutations achieve nearly the same performance on $\textsc{UNIFORM}$. Empirical results also verify the equivalence of these two mutation operators.
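The (1+1)-style search with local versus global mutation described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: neural architectures are abstracted as bit strings, and the $\textsc{UNIFORM}$ fitness is replaced by a simple bit-counting stand-in so the example is runnable.

```python
import random

def bit_count_fitness(x):
    # Stand-in fitness (counts ones); the paper's UNIFORM fitness is not reproduced here.
    return sum(x)

def local_mutation(x):
    # Local mutation: flip exactly one uniformly chosen bit.
    y = x[:]
    i = random.randrange(len(y))
    y[i] ^= 1
    return y

def global_mutation(x):
    # Global (standard bit-wise) mutation: flip each bit independently with prob 1/n.
    n = len(x)
    return [b ^ (random.random() < 1.0 / n) for b in x]

def one_plus_one_ea(n, mutate, max_evals=100_000):
    # (1+1) scheme: one parent, one offspring per step, elitist acceptance of ties.
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 1
    while bit_count_fitness(x) < n and evals < max_evals:
        y = mutate(x)
        if bit_count_fitness(y) >= bit_count_fitness(x):
            x = y
        evals += 1
    return x, evals
```

Running `one_plus_one_ea(20, local_mutation)` and `one_plus_one_ea(20, global_mutation)` on this stand-in fitness shows both operators reaching the optimum in a comparable number of evaluations, mirroring the near-equivalence the paper proves for $\textsc{UNIFORM}$.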
Related papers
- Ecological Neural Architecture Search [0.0]
This paper introduces Neuvo Ecological Neural Architecture Search (ENAS)
ENAS incorporates evolutionary parameters directly into the candidate solutions' phenotypes, allowing them to evolve dynamically alongside architecture specifications.
arXiv Detail & Related papers (2025-03-13T21:40:25Z)
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- First Steps Towards a Runtime Analysis of Neuroevolution [2.07180164747172]
We consider a simple setting in neuroevolution where an evolutionary algorithm optimizes the weights and activation functions of a simple artificial neural network.
We then define simple example functions to be learned by the network and conduct rigorous runtime analyses for networks with a single neuron and for a more advanced structure with several neurons and two layers.
Our results show that the proposed algorithm is generally efficient on two example problems designed for one neuron and efficient with at least constant probability on the example problem for a two-layer network.
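The single-neuron setting summarized above can be sketched with a hedged toy example. This is an assumption-laden stand-in, not the paper's construction: it evolves only the weights and bias of one sign-activated neuron with Gaussian perturbations on a small hand-made sample set, whereas the paper also evolves activation functions and analyzes specific example problems.

```python
import random

def neuron(w, b, x):
    # Single neuron with a sign-style activation (an assumption; the paper
    # considers evolving the activation function as well).
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s >= 0 else -1

def fitness(w, b, samples):
    # Number of correctly classified points (higher is better).
    return sum(1 for x, y in samples if neuron(w, b, x) == y)

def evolve_neuron(samples, dim, steps=5000, sigma=0.5):
    # (1+1)-style neuroevolution: Gaussian mutation, elitist acceptance of ties.
    w, b = [0.0] * dim, 0.0
    best = fitness(w, b, samples)
    for _ in range(steps):
        w2 = [wi + random.gauss(0, sigma) for wi in w]
        b2 = b + random.gauss(0, sigma)
        f2 = fitness(w2, b2, samples)
        if f2 >= best:
            w, b, best = w2, b2, f2
    return w, b, best
```

On a linearly separable sample set such as `[((-2,), -1), ((-1,), -1), ((1,), 1), ((2,), 1)]`, the sketch quickly reaches a perfect classification, illustrating why runtime analyses of such single-neuron settings are tractable.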
arXiv Detail & Related papers (2023-07-03T07:30:58Z)
- DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit CNNs [53.82853297675979]
1-bit convolutional neural networks (CNNs) with binary weights and activations show their potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
arXiv Detail & Related papers (2023-06-27T11:28:29Z)
- Continuous Cartesian Genetic Programming based representation for Multi-Objective Neural Architecture Search [12.545742558041583]
We propose a novel approach for designing less complex yet highly effective convolutional neural networks (CNNs).
Our approach combines real-based and block-chained CNN representations based on Cartesian genetic programming (CGP) for neural architecture search (NAS).
Two variants are introduced that differ in the granularity of the search space they consider.
arXiv Detail & Related papers (2023-06-05T07:32:47Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded by the 'complexity' of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- Neural Architecture Search based on Cartesian Genetic Programming Coding Method [6.519170476143571]
We propose an evolutionary approach to NAS based on CGP, called CGPNAS, to solve the sentence classification task.
The experimental results show that the searched architectures are comparable with the performance of human-designed architectures.
arXiv Detail & Related papers (2021-03-12T09:51:03Z)
- Fast Learning of Graph Neural Networks with Guaranteed Generalizability: One-hidden-layer Case [93.37576644429578]
Graph neural networks (GNNs) have made great progress recently on learning from graph-structured data in practice.
We provide a theoretically-grounded generalizability analysis of GNNs with one hidden layer for both regression and binary classification problems.
arXiv Detail & Related papers (2020-06-25T00:45:52Z)
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a $75.1\%$ top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
- A Generic Graph-based Neural Architecture Encoding Scheme for Predictor-based NAS [18.409809742204896]
This work proposes a novel Graph-based neural ArchiTecture Encoding Scheme, a.k.a. GATES, to improve predictor-based neural architecture search.
GATES models operations as transformations of the propagating information, which mimics the actual data processing of a neural architecture.
arXiv Detail & Related papers (2020-04-04T09:54:49Z)
- NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search [9.038625856798227]
We propose a neural predictor guided evolutionary algorithm to enhance the exploration ability of EAs for neural architecture search (NAS).
NPENAS-BO and NPENAS-NP outperform most existing NAS algorithms.
arXiv Detail & Related papers (2020-03-28T17:56:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.