Conceptual Expansion Neural Architecture Search (CENAS)
- URL: http://arxiv.org/abs/2110.03144v1
- Date: Thu, 7 Oct 2021 02:29:26 GMT
- Title: Conceptual Expansion Neural Architecture Search (CENAS)
- Authors: Mohan Singamsetti, Anmol Mahajan and Matthew Guzdial
- Abstract summary: We present an approach called Conceptual Expansion Neural Architecture Search (CENAS).
It combines a sample-efficient, computational creativity-inspired transfer learning approach with neural architecture search.
It finds models faster than naive architecture search by transferring existing weights to approximate the parameters of the new model.
- Score: 1.3464152928754485
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Architecture search optimizes the structure of a neural network for some task instead of relying on manual authoring. However, it is slow, as each potential architecture is typically trained from scratch. In this paper we present an approach called Conceptual Expansion Neural Architecture Search (CENAS) that combines a sample-efficient, computational creativity-inspired transfer learning approach with neural architecture search. This approach finds models faster than naive architecture search by transferring existing weights to approximate the parameters of the new model. It outperforms standard transfer learning by allowing for the addition of features instead of only modifying existing features. We demonstrate that our approach outperforms standard neural architecture search and transfer learning methods in terms of efficiency, performance, and parameter counts on a variety of transfer learning tasks.
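As a rough illustration of the weight-transfer idea described above, the sketch below warm-starts a new layer's parameters as an alpha-weighted combination of weights taken from trained donor layers. The elementwise mixing scheme and the `conceptual_expansion_init` helper are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch: approximate a new layer's weights as a filtered
# combination of trained donor weights, instead of random initialization.
# The combination scheme here is an assumption for illustration.
import torch

def conceptual_expansion_init(donor_weights, alphas):
    """Mix donor weight tensors (all the same shape) with mixing filters.

    donor_weights: list of torch.Tensor taken from trained donor layers.
    alphas:        list of torch.Tensor (same shapes), the mixing filters.
    """
    mixed = torch.zeros_like(donor_weights[0])
    for w, a in zip(donor_weights, alphas):
        mixed = mixed + a * w          # elementwise "filtered" combination
    return mixed

# Usage: warm-start a new 3x3 conv kernel from two trained donor kernels.
w1 = torch.randn(16, 3, 3, 3)          # stand-ins for trained donor kernels
w2 = torch.randn(16, 3, 3, 3)
alphas = [torch.full_like(w1, 0.5), torch.full_like(w2, 0.5)]
new_kernel = conceptual_expansion_init([w1, w2], alphas)
# A search procedure would then mutate the architecture and the alphas
# rather than retraining every candidate from scratch.
```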
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
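For context, differentiable NAS methods in the DARTS family (which EM-DARTS extends) relax the discrete choice among candidate operations into a softmax-weighted mixture. Below is a minimal generic sketch of that building block, not EM-DARTS's hierarchical search itself; the candidate op set is an assumption.

```python
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    """Softmax-weighted mixture over candidate operations (generic DARTS)."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),                       # skip connection
        ])
        self.logits = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = torch.softmax(self.logits, dim=0)
        # Gradients flow into `logits`, so the op choice is learned
        # jointly with the network weights.
        return sum(w * op(x) for w, op in zip(weights, self.ops))

y = MixedOp(8)(torch.randn(1, 8, 16, 16))        # usage on a dummy batch
```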
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- Design Principle Transfer in Neural Architecture Search via Large Language Models [37.004026595537006]
Transferable neural architecture search (TNAS) has been introduced to design efficient neural architectures for multiple tasks.
In TNAS, architectural knowledge accumulated in previous search processes is reused to warm up the architecture search for new tasks.
This work proposes a novel transfer paradigm, i.e., design principle transfer.
arXiv Detail & Related papers (2024-08-21T04:27:44Z)
- Training-free Neural Architecture Search for RNNs and Transformers [0.0]
We develop a new training-free metric, named hidden covariance, that predicts the trained performance of an RNN architecture.
We find that the current search space paradigm for transformer architectures is not optimized for training-free neural architecture search.
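The abstract does not define the metric precisely; one plausible reading, sketched below under that assumption, scores an *untrained* RNN by the covariance of its hidden states on a random batch, summarized here by a log-determinant (favoring diverse, non-degenerate representations). The paper's actual metric may differ.

```python
import torch
import torch.nn as nn

def hidden_covariance_score(rnn, batch):
    """Training-free proxy: log-det of the hidden-state covariance."""
    with torch.no_grad():
        outputs, _ = rnn(batch)            # (batch, seq, hidden)
        h = outputs[:, -1, :]              # final hidden state per sequence
        h = h - h.mean(dim=0, keepdim=True)
        cov = (h.T @ h) / (h.shape[0] - 1) # hidden-dim covariance matrix
        eye = torch.eye(cov.shape[0])      # small ridge for stability
        return torch.logdet(cov + 1e-4 * eye).item()

# Rank candidate architectures without any training:
rnn = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)
score = hidden_covariance_score(rnn, torch.randn(64, 20, 8))
```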
arXiv Detail & Related papers (2023-06-01T02:06:13Z)
- Proxyless Neural Architecture Adaptation for Supervised Learning and Self-Supervised Learning [3.766702945560518]
We propose proxyless neural architecture adaptation that is reproducible and efficient.
Our method can be applied to both supervised learning and self-supervised learning.
arXiv Detail & Related papers (2022-05-15T02:49:48Z)
- Neural Architecture Search for Speech Emotion Recognition [72.1966266171951]
We propose to apply neural architecture search (NAS) techniques to automatically configure the SER models.
We show that NAS can improve SER performance (from 54.89% to 56.28%) while maintaining model parameter sizes.
arXiv Detail & Related papers (2022-03-31T10:16:10Z)
- Network Graph Based Neural Architecture Search [57.78724765340237]
We search for neural networks by rewiring the corresponding graph and predict architecture performance from graph properties.
Because we do not perform machine learning over the entire graph space, the search process is remarkably efficient.
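A sketch of this style of predictor: encode a candidate network as a DAG, extract cheap structural features, and feed them to any regressor fit on previously evaluated architectures. The specific features chosen here are illustrative assumptions, not the paper's feature set.

```python
import networkx as nx

def graph_features(edges):
    """Cheap structural features of a network's wiring graph."""
    g = nx.DiGraph(edges)                  # nodes = layers, edges = wiring
    und = g.to_undirected()
    return [
        g.number_of_nodes(),
        g.number_of_edges(),
        nx.density(g),
        nx.average_clustering(und),
        nx.dag_longest_path_length(g),     # proxy for network depth
    ]

# Rewiring the graph changes the features; a regressor trained on
# (features, accuracy) pairs can then rank candidates without training them.
feats = graph_features([(0, 1), (1, 2), (0, 2), (2, 3)])
```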
arXiv Detail & Related papers (2021-12-15T00:12:03Z)
- Rethinking Architecture Selection in Differentiable NAS [74.61723678821049]
Differentiable neural architecture search (DARTS) is one of the most popular NAS methods thanks to its search efficiency and simplicity.
We propose an alternative perturbation-based architecture selection that directly measures each operation's influence on the supernet.
We find that several failure modes of DARTS can be greatly alleviated with the proposed selection method.
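A minimal sketch of the perturbation idea: disable each candidate operation on an edge in turn and keep the one whose removal hurts validation accuracy the most. `evaluate`, `mask_op`, and `unmask_op` are hypothetical stand-ins for whatever the supernet codebase provides.

```python
def select_op(supernet, edge, candidate_ops, evaluate):
    """Pick the op whose removal degrades the supernet the most."""
    baseline = evaluate(supernet)              # validation accuracy
    influence = {}
    for op in candidate_ops:
        supernet.mask_op(edge, op)             # hypothetical: disable one op
        influence[op] = baseline - evaluate(supernet)
        supernet.unmask_op(edge, op)           # restore the supernet
    return max(influence, key=influence.get)   # most influential operation
```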
arXiv Detail & Related papers (2021-08-10T00:53:39Z)
- Differentiable Neural Architecture Search with Morphism-based Transformable Backbone Architectures [35.652234989200956]
This study aims at making the architecture search process more adaptive for one-shot or online training.
It introduces a growing mechanism for differentiable neural architecture search based on network morphism.
We also implement a recently proposed two-input backbone architecture for recurrent neural networks.
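For context, a classic function-preserving morphism is Net2Net-style widening: duplicate a hidden unit and split its outgoing weights so the widened network computes exactly the same function. The sketch below shows this for a pair of linear layers; it is illustrative, not the paper's exact growing operator.

```python
import torch
import torch.nn as nn

def widen_linear(fc_in: nn.Linear, fc_out: nn.Linear, unit: int):
    """Duplicate hidden `unit`; halve its outgoing weights to preserve f."""
    with torch.no_grad():
        # Duplicate one hidden unit in the first layer...
        w_in = torch.cat([fc_in.weight, fc_in.weight[unit:unit+1]], dim=0)
        b_in = torch.cat([fc_in.bias, fc_in.bias[unit:unit+1]], dim=0)
        # ...and halve the corresponding outgoing weights so outputs match.
        w_out = torch.cat([fc_out.weight,
                           fc_out.weight[:, unit:unit+1]], dim=1)
        w_out[:, unit] *= 0.5
        w_out[:, -1] *= 0.5
    new_in = nn.Linear(fc_in.in_features, fc_in.out_features + 1)
    new_out = nn.Linear(fc_out.in_features + 1, fc_out.out_features)
    new_in.weight.data, new_in.bias.data = w_in, b_in
    new_out.weight.data, new_out.bias.data = w_out, fc_out.bias.data.clone()
    return new_in, new_out
```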
arXiv Detail & Related papers (2021-06-14T07:56:33Z)
- NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
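A tiny sketch of a GCN-based assessor in plain PyTorch: one normalized-adjacency message-passing step over the architecture graph, mean-pooled into a predicted accuracy. The auto-encoder pretraining and semi-supervised loss from the paper are omitted, and all layer sizes are assumptions.

```python
import torch
import torch.nn as nn

class TinyGCNAssessor(nn.Module):
    """One GCN step over an architecture DAG, pooled to a scalar score."""
    def __init__(self, feat_dim, hidden=32):
        super().__init__()
        self.w1 = nn.Linear(feat_dim, hidden)
        self.head = nn.Linear(hidden, 1)

    def forward(self, adj, x):
        # Symmetrically normalize A + I, the standard GCN propagation rule.
        a = adj + torch.eye(adj.shape[0])
        d = a.sum(dim=1).rsqrt().diag()
        a_hat = d @ a @ d
        h = torch.relu(self.w1(a_hat @ x))   # one message-passing step
        return torch.sigmoid(self.head(h.mean(dim=0)))  # predicted accuracy

adj = torch.tensor([[0., 1., 1.], [0., 0., 1.], [0., 0., 0.]])
x = torch.eye(3)                             # one-hot op features per node
pred = TinyGCNAssessor(feat_dim=3)(adj, x)
```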
arXiv Detail & Related papers (2020-05-14T09:02:33Z)