Redefining Neural Architecture Search of Heterogeneous Multi-Network
Models by Characterizing Variation Operators and Model Components
- URL: http://arxiv.org/abs/2106.08972v2
- Date: Wed, 17 Aug 2022 17:38:04 GMT
- Title: Redefining Neural Architecture Search of Heterogeneous Multi-Network
Models by Characterizing Variation Operators and Model Components
- Authors: Unai Garciarena, Roberto Santana, Alexander Mendiburu
- Abstract summary: We investigate the effect of different variation operators in a complex domain, that of multi-network heterogeneous neural models.
We characterize both the variation operators, according to their effect on the complexity and performance of the model, and the models, relying on diverse metrics which estimate the quality of their different component parts.
- Score: 71.03032589756434
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: With neural architecture search methods gaining ground on manually designed
deep neural networks (even more rapidly as model sophistication escalates), the
research trend is shifting towards arranging different and often increasingly
complex neural architecture search spaces. At this juncture, designing
algorithms which can efficiently explore these search spaces can yield a
significant improvement over currently used methods, which, in general,
randomly select the structural variation operator in the hope of a performance
gain. In this paper, we investigate the effect of different variation operators
in a complex domain: that of multi-network heterogeneous neural models. These
models have an extensive and complex search space of structures, as they require
multiple sub-networks within the general model in order to serve different
output types. From that investigation, we extract a set of general guidelines
which are not limited to that particular type of model and which are useful for
determining the direction in which an architecture optimization method could
find the largest improvement. To derive these guidelines, we characterize both
the variation operators, according to their effect on the complexity and
performance of the model, and the models themselves, relying on diverse
metrics which estimate the quality of their different component parts.
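To make the characterization idea concrete, here is a minimal, hypothetical sketch of the kind of analysis the abstract describes: apply each structural variation operator to sampled models and record the change in complexity and in a performance proxy. The toy model encoding, the three operators, and the performance probe are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: rank variation operators by the change in model
# complexity and (proxy) performance they induce, averaged over many models.
import random
import statistics

def complexity(widths):
    # proxy for the parameter count of a dense feed-forward stack
    return sum(a * b for a, b in zip(widths, widths[1:]))

def proxy_performance(widths):
    # placeholder probe; a real study would train and evaluate each model
    return 1.0 / (1.0 + abs(complexity(widths) - 5000) / 5000)

def add_layer(w):
    return w[:-1] + [random.choice([16, 32, 64])] + w[-1:]

def remove_layer(w):
    # keep input/output layers; no-op when no hidden layer can be dropped
    return w[:1] + w[2:-1] + w[-1:] if len(w) > 3 else w

def widen_layer(w):
    i = random.randrange(1, len(w) - 1)
    return w[:i] + [w[i] * 2] + w[i + 1:]

OPERATORS = {"add": add_layer, "remove": remove_layer, "widen": widen_layer}

def characterize(n_trials=200):
    stats = {name: {"d_cx": [], "d_perf": []} for name in OPERATORS}
    for _ in range(n_trials):
        hidden = [random.choice([16, 32, 64]) for _ in range(random.randint(1, 4))]
        base = [8] + hidden + [4]
        for name, op in OPERATORS.items():
            child = op(list(base))
            stats[name]["d_cx"].append(complexity(child) - complexity(base))
            stats[name]["d_perf"].append(proxy_performance(child) - proxy_performance(base))
    for name, s in stats.items():
        print(f"{name:>6}: mean d_complexity={statistics.mean(s['d_cx']):9.1f}  "
              f"mean d_performance={statistics.mean(s['d_perf']):+.4f}")

characterize()
```

Aggregated per-operator statistics of this kind are what would back guidelines such as "operator X tends to raise complexity with little performance gain."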
Related papers
- Multiobjective Evolutionary Pruning of Deep Neural Networks with
Transfer Learning for improving their Performance and Robustness [15.29595828816055]
This work proposes MO-EvoPruneDeepTL, a multi-objective evolutionary pruning algorithm.
We use Transfer Learning to adapt the last layers of Deep Neural Networks, replacing them with sparse layers evolved by a genetic algorithm.
Experiments show that our proposal achieves promising results on all the objectives, and direct relations between the objectives are presented.
arXiv Detail & Related papers (2023-02-20T19:33:38Z)
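The following toy sketch illustrates the general shape of the MO-EvoPruneDeepTL idea summarized above: a genetic algorithm evolves binary pruning masks over a final dense layer under two objectives, proxy accuracy and sparsity. The synthetic fitness, operator settings, and selection scheme are assumptions for illustration; the real method evaluates actual transfer-learned networks.

```python
# Toy multi-objective evolutionary pruning: evolve binary masks over a final
# dense layer, trading (proxy) accuracy against sparsity. Fitness is synthetic.
import numpy as np

rng = np.random.default_rng(0)
N_UNITS, POP, GENS = 64, 30, 40
importance = rng.random(N_UNITS)  # hypothetical per-unit usefulness

def objectives(mask):
    acc = importance[mask.astype(bool)].sum() / importance.sum()  # proxy accuracy
    sparsity = 1.0 - mask.mean()
    return acc, sparsity  # both to be maximized

def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

pop = rng.integers(0, 2, size=(POP, N_UNITS))
for _ in range(GENS):
    # one-point crossover + bit-flip mutation
    parents = pop[rng.integers(0, POP, size=(POP, 2))]
    cut = rng.integers(1, N_UNITS, size=POP)
    children = np.where(np.arange(N_UNITS) < cut[:, None], parents[:, 0], parents[:, 1])
    children ^= (rng.random(children.shape) < 0.02).astype(children.dtype)
    merged = np.vstack([pop, children])
    scores = [objectives(m) for m in merged]
    # keep the non-dominated set first, fill the rest by proxy accuracy
    front = [i for i, s in enumerate(scores) if not any(dominates(t, s) for t in scores)]
    rest = sorted(set(range(len(merged))) - set(front), key=lambda i: -scores[i][0])
    pop = merged[(front + rest)[:POP]]

for m in pop[:5]:
    acc, sp = objectives(m)
    print(f"proxy_acc={acc:.3f}  sparsity={sp:.3f}")
```

The non-dominated sorting here is deliberately crude; a real implementation would use an NSGA-II-style ranking with crowding distance.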
- Amortized Inference for Causal Structure Learning [72.84105256353801]
Learning causal structure poses a search problem that typically involves evaluating structures using a score or independence test.
We train a variational inference model to predict the causal structure from observational/interventional data.
Our models exhibit robust generalization capabilities under substantial distribution shift.
arXiv Detail & Related papers (2022-05-25T17:37:08Z)
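A minimal sketch of the amortized idea summarized above, assuming linear SEMs and simple correlation features: a predictor is trained once on many synthetic (dataset, graph) pairs and then reads a causal structure off new data in a single pass, with no per-dataset search. The logistic edge model is an illustrative stand-in for the paper's variational inference network.

```python
# Amortized structure prediction: train once on synthetic (data, graph) pairs,
# then predict edge probabilities for a fresh dataset directly.
import numpy as np

rng = np.random.default_rng(1)
D, N, TRAIN_SETS = 4, 200, 300

def sample_task():
    # random upper-triangular DAG with linear mechanisms and Gaussian noise
    adj = np.triu((rng.random((D, D)) < 0.4).astype(float), k=1)
    w = adj * rng.normal(0, 1.5, (D, D))
    x = np.zeros((N, D))
    for j in range(D):
        x[:, j] = x @ w[:, j] + rng.normal(0, 1, N)
    return x, adj

iu = np.triu_indices(D, k=1)

def features(x):
    c = np.corrcoef(x.T)
    return np.concatenate([c[iu], np.ones(1)])  # pairwise correlations + bias

# one logistic model per potential edge (i < j), trained by simple SGD
W = np.zeros((len(iu[0]), len(iu[0]) + 1))
for _ in range(TRAIN_SETS):
    x, adj = sample_task()
    f, y = features(x), adj[iu]
    p = 1 / (1 + np.exp(-W @ f))
    W -= 0.1 * np.outer(p - y, f)  # gradient step on cross-entropy

# amortized prediction on a new dataset
x_new, adj_true = sample_task()
p_edge = 1 / (1 + np.exp(-W @ features(x_new)))
print("true edges:     ", adj_true[iu].astype(int))
print("predicted probs:", np.round(p_edge, 2))
```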
- Learning Interpretable Models Through Multi-Objective Neural Architecture Search [0.9990687944474739]
We propose a framework to optimize for both task performance and "introspectability," a surrogate metric for aspects of interpretability.
We demonstrate that jointly optimizing for task error and introspectability leads to more disentangled and debuggable architectures that perform within error.
arXiv Detail & Related papers (2021-12-16T05:50:55Z)
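As a hedged illustration of jointly scoring task error and an interpretability surrogate, the sketch below approximates "introspectability" as the mean pairwise cosine distance between class-conditional activation patterns (more separated per-class activations being easier to inspect); this proxy and the scalarized trade-off are assumptions, not the paper's exact metric or optimizer.

```python
# Joint scoring of a candidate network: task error traded off against an
# introspectability-style proxy computed from hidden activations.
import numpy as np

rng = np.random.default_rng(2)

def introspectability(acts, labels):
    classes = np.unique(labels)
    means = np.stack([acts[labels == c].mean(axis=0) for c in classes])
    means /= np.linalg.norm(means, axis=1, keepdims=True)
    cos = means @ means.T
    iu = np.triu_indices(len(classes), k=1)
    return (1.0 - cos[iu]).mean()  # mean pairwise cosine distance

def joint_score(error, acts, labels, alpha=0.5):
    # lower is better: penalize error, reward class-separated activations
    return error - alpha * introspectability(acts, labels)

# toy usage: fake hidden activations for two candidate architectures
labels = rng.integers(0, 3, 500)
acts_entangled = rng.random((500, 32))
acts_disentangled = acts_entangled + np.eye(3)[labels] @ rng.random((3, 32)) * 3
print(joint_score(0.10, acts_entangled, labels))
print(joint_score(0.10, acts_disentangled, labels))
```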
- Pareto-wise Ranking Classifier for Multi-objective Evolutionary Neural Architecture Search [15.454709248397208]
This study focuses on how to find feasible deep models under diverse design objectives.
We propose a classification-wise Pareto evolution approach for one-shot NAS, where an online classifier is trained to predict the dominance relationship between the candidate and constructed reference architectures.
We find a number of neural architectures with different model sizes, ranging from 2M to 6M parameters, under diverse objectives and constraints.
arXiv Detail & Related papers (2021-09-14T13:28:07Z)
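The sketch below captures the classifier-guided idea in the summary above: learn to predict whether a candidate architecture dominates a reference from cheap descriptors, so that most candidates need not be fully evaluated. The descriptors, synthetic objectives, and logistic model are illustrative assumptions, not the paper's implementation.

```python
# An online classifier predicts dominance between candidate and reference
# architectures from cheap descriptors (depth, width); all names are toy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

def sample_arch():
    return np.array([rng.integers(2, 12), rng.integers(16, 256)])  # depth, width

def true_objectives(a):
    # hypothetical (error, size), both to be minimized
    size = a[0] * a[1] ** 2
    error = 1.0 / (1 + 0.001 * size) + 0.03 * abs(a[0] - 6) + rng.normal(0, 0.01)
    return np.array([error, size])

def dominates(oa, ob):
    return bool(np.all(oa <= ob) and np.any(oa < ob))

# training pairs come from architectures that were evaluated fully
archs = [sample_arch() for _ in range(200)]
objs = [true_objectives(a) for a in archs]
X, y = [], []
for _ in range(2000):
    i, j = rng.integers(0, 200, size=2)
    X.append(np.concatenate([archs[i], archs[j]]))
    y.append(int(dominates(objs[i], objs[j])))

clf = LogisticRegression(max_iter=1000).fit(X, y)

# screen fresh candidates against a reference without evaluating them
ref = archs[0]
cands = [sample_arch() for _ in range(5)]
probs = clf.predict_proba([np.concatenate([c, ref]) for c in cands])[:, 1]
print(np.round(probs, 2))
```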
- Rethinking Architecture Selection in Differentiable NAS [74.61723678821049]
Differentiable Neural Architecture Search is one of the most popular NAS methods for its search efficiency and simplicity.
We propose an alternative perturbation-based architecture selection that directly measures each operation's influence on the supernet.
We find that several failure modes of DARTS can be greatly alleviated with the proposed selection method.
arXiv Detail & Related papers (2021-08-10T00:53:39Z)
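A toy sketch of perturbation-based selection as summarized above: rather than trusting the largest architecture weight (alpha), remove each candidate operation from a mixture and keep the one whose removal degrades validation performance the most. The one-edge "supernet", operations, and targets below are fabricated to show the mechanism only.

```python
# Perturbation-based operation selection on a single mixed edge.
import numpy as np

rng = np.random.default_rng(4)
x_val = rng.normal(size=(256, 8))
y_val = np.tanh(x_val.sum(axis=1))  # hypothetical validation target

w_lin = rng.normal(size=8)
OPS = {
    "skip":     lambda x: x.sum(axis=1),
    "linear":   lambda x: x @ w_lin,
    "tanh_mix": lambda x: np.tanh(x.sum(axis=1)),
}
alpha = {"skip": 0.5, "linear": 0.3, "tanh_mix": 0.2}  # architecture weights

def val_loss(active_ops):
    w = np.array([alpha[o] for o in active_ops])
    w = w / w.sum()
    pred = sum(wi * OPS[o](x_val) for wi, o in zip(w, active_ops))
    return np.mean((pred - y_val) ** 2)

base = val_loss(list(OPS))
# influence = how much validation loss rises when the operation is removed
influence = {o: val_loss([p for p in OPS if p != o]) - base for o in OPS}
print("alpha argmax:       ", max(alpha, key=alpha.get))
print("perturbation argmax:", max(influence, key=influence.get))
```

In this toy setup the two criteria disagree: alpha favors the skip connection, while the perturbation test credits the operation the target actually depends on, which is the failure mode of magnitude-based selection the summary alludes to.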
- Model-agnostic multi-objective approach for the evolutionary discovery of mathematical models [55.41644538483948]
In modern data science, it is often more valuable to understand the properties of a model and to identify which parts could be replaced to obtain better results.
We use multi-objective evolutionary optimization for composite data-driven model learning to obtain the algorithm's desired properties.
arXiv Detail & Related papers (2021-07-07T11:17:09Z)
- AutoAdapt: Automated Segmentation Network Search for Unsupervised Domain Adaptation [4.793219747021116]
We perform neural architecture search (NAS) to provide an architecture-level perspective and analysis for domain adaptation.
Because target-domain labels are unavailable for evaluating candidate architectures, we propose bridging this gap by using maximum mean discrepancy and regional weighted entropy to estimate the accuracy metric.
arXiv Detail & Related papers (2021-06-24T17:59:02Z)
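The sketch below illustrates label-free proxy scoring in the spirit of the summary above: with no target-domain labels, candidate networks are ranked by the maximum mean discrepancy between source and target features plus the entropy of target predictions. The RBF kernel, weighting, and synthetic features are assumptions, not the paper's exact formulation.

```python
# Label-free proxy score: feature-distribution alignment (MMD) plus
# prediction entropy on the unlabeled target domain.
import numpy as np

def rbf_mmd2(x, y, gamma=0.5):
    # simple (biased, V-statistic) estimate of MMD^2 with an RBF kernel
    def k(a, b):
        d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d)
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

def mean_entropy(probs, eps=1e-12):
    return -(probs * np.log(probs + eps)).sum(axis=1).mean()

def proxy_score(src_feats, tgt_feats, tgt_probs, lam=1.0):
    # lower = better adapted: aligned features, confident target predictions
    return rbf_mmd2(src_feats, tgt_feats) + lam * mean_entropy(tgt_probs)

rng = np.random.default_rng(5)
src = rng.normal(0, 1, (100, 16))
tgt_good = rng.normal(0.1, 1, (100, 16))  # nearly aligned candidate
tgt_bad = rng.normal(2.0, 1, (100, 16))   # poorly aligned candidate
probs = rng.dirichlet(np.ones(5), 100)
print(proxy_score(src, tgt_good, probs), proxy_score(src, tgt_bad, probs))
```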
- Differentiable Neural Architecture Search with Morphism-based Transformable Backbone Architectures [35.652234989200956]
This study aims at making the architecture search process more adaptive for one-shot or online training.
It introduces a growing mechanism for differentiable neural architecture search based on network morphism.
We also implement a recently proposed two-input backbone architecture for recurrent neural networks.
arXiv Detail & Related papers (2021-06-14T07:56:33Z)
- On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future [60.99717891994599]
We propose an approach that extracts information from neuroevolutionary runs and uses it to build a metamodel.
We inspect the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics.
arXiv Detail & Related papers (2021-05-26T20:55:29Z)
- Automated Search for Resource-Efficient Branched Multi-Task Networks [81.48051635183916]
We propose a principled approach, rooted in differentiable neural architecture search, to automatically define branching structures in a multi-task neural network.
We show that our approach consistently finds high-performing branching structures within limited resource budgets.
arXiv Detail & Related papers (2020-08-24T09:49:19Z)
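To ground the branched multi-task idea summarized above, here is a hedged sketch of differentiable branching: each task holds a softmax over candidate branch points on a shared trunk, turning "where to branch" into a continuous choice that can be traded off against an expected resource cost. The trunk, tasks, and cost model are illustrative assumptions, not the paper's method.

```python
# Differentiable branching: per-task soft selection over trunk blocks.
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=(32, 8))

# a shared trunk of blocks; each task may branch off after any of them
W = [rng.normal(size=(8, 8)) * 0.5 for _ in range(3)]

def trunk_outputs(x):
    outs, h = [], x
    for w in W:
        h = np.tanh(h @ w)
        outs.append(h)
    return outs  # one candidate branch point per block

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# per-task branch logits (learned jointly in a real system; fixed here)
branch_logits = {"task_a": np.array([2.0, 0.1, 0.1]),
                 "task_b": np.array([0.1, 0.1, 2.0])}

outs = trunk_outputs(x)
for task, logits in branch_logits.items():
    p = softmax(logits)
    feat = sum(pi * o for pi, o in zip(p, outs))  # soft branch selection
    resource_cost = (p * np.arange(1, 4)).sum()   # deeper branch = costlier
    print(task, "expected branch depth:", round(resource_cost, 2),
          "feat shape:", feat.shape)
```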
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.