IDENAS: Internal Dependency Exploration for Neural Architecture Search
- URL: http://arxiv.org/abs/2310.17250v1
- Date: Thu, 26 Oct 2023 08:58:29 GMT
- Title: IDENAS: Internal Dependency Exploration for Neural Architecture Search
- Authors: Anh T. Hoang, Zsolt J. Viharos
- Abstract summary: Neural Architecture Search (NAS) and Feature Selection have emerged as promising solutions in such scenarios.
This research proposes IDENAS, an Internal Dependency-based Exploration for Neural Architecture Search, integrating NAS with feature selection.
The methodology explores internal dependencies in the complete parameter space for classification tasks involving both 1D sensor and 2D image data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Machine learning is a powerful tool for extracting valuable information and
making various predictions from diverse datasets. Traditional algorithms rely
on well-defined input and output variables; however, there are scenarios where
the distinction between the input and output variables, and the underlying,
associated (input and output) layers of the model, is unknown. Neural
Architecture Search (NAS) and Feature Selection have emerged as promising
solutions in such scenarios. This research proposes IDENAS, an Internal
Dependency-based Exploration for Neural Architecture Search, integrating NAS
with feature selection. The methodology explores internal dependencies in the
complete parameter space for classification tasks involving both 1D sensor and
2D image data. IDENAS employs a modified encoder-decoder model and the
Sequential Forward Search (SFS) algorithm, combining input-output configuration
search with embedded feature selection. Experimental results demonstrate
IDENAS's superior performance in comparison to other algorithms, showcasing its
effectiveness in model development pipelines and automated machine learning. On
average, IDENAS achieved significant modelling improvements, underscoring its
contribution to advancing the state of the art in neural architecture search
and feature selection integration.
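As a rough illustration of the search loop described above, the sketch below shows how a Sequential Forward Search (SFS) over input/output roles could be organized, with a user-supplied scorer standing in for the paper's modified encoder-decoder. All names (sequential_forward_search, score_configuration) are hypothetical and not taken from the IDENAS implementation; this is a minimal sketch under those assumptions, not the authors' code.

```python
# Minimal sketch of a Sequential Forward Search (SFS) over input/output roles.
# Assumes a user-supplied scorer that trains a small encoder-decoder on the
# chosen input channels and reports validation performance; names are
# illustrative, not taken from the IDENAS code base.

from typing import Callable, List, Set, Tuple


def sequential_forward_search(
    channels: List[int],
    score_configuration: Callable[[Set[int], Set[int]], float],
    max_inputs: int,
) -> Tuple[Set[int], float]:
    """Greedily add the channel whose inclusion as an input helps most.

    score_configuration(inputs, outputs) is expected to train/evaluate a
    candidate encoder-decoder and return a validation score (higher = better).
    """
    inputs: Set[int] = set()
    best_score = float("-inf")
    while len(inputs) < max_inputs:
        best_candidate, best_candidate_score = None, float("-inf")
        for ch in channels:
            if ch in inputs:
                continue
            trial_inputs = inputs | {ch}
            trial_outputs = set(channels) - trial_inputs  # remaining channels act as targets
            score = score_configuration(trial_inputs, trial_outputs)
            if score > best_candidate_score:
                best_candidate, best_candidate_score = ch, score
        if best_candidate is None or best_candidate_score <= best_score:
            break  # no further improvement: stop growing the input set
        inputs.add(best_candidate)
        best_score = best_candidate_score
    return inputs, best_score
```

In the paper the scorer role is played by the modified encoder-decoder model; here it is left abstract so the greedy search logic stays self-contained.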
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose the SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
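To make the evaluation bottleneck concrete, the sketch below shows one generic way a pairwise comparison relation can pre-rank candidates before costly training: a surrogate classifier learns which of two encoded architectures is likely better. The encoding, classifier, and class/function names are assumptions for illustration, not the SMEM-NAS method itself.

```python
# Sketch of a pairwise-comparison surrogate for pre-screening architectures.
# The feature encoding and the logistic-regression classifier are illustrative
# choices; SMEM-NAS couples its own comparison predictor with a
# multi-population evolutionary loop.

import numpy as np
from sklearn.linear_model import LogisticRegression


class PairwiseRanker:
    def __init__(self) -> None:
        self.clf = LogisticRegression(max_iter=1000)

    def fit(self, encodings: np.ndarray, scores: np.ndarray) -> None:
        """Learn P(arch_a is better than arch_b) from already-evaluated architectures."""
        pairs, labels = [], []
        n = len(encodings)
        for i in range(n):
            for j in range(n):
                if i != j:
                    pairs.append(np.concatenate([encodings[i], encodings[j]]))
                    labels.append(int(scores[i] > scores[j]))
        self.clf.fit(np.asarray(pairs), np.asarray(labels))

    def rank(self, candidates: np.ndarray) -> np.ndarray:
        """Rank unseen candidates by how often they are predicted to win a comparison."""
        n = len(candidates)
        wins = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if i != j:
                    pair = np.concatenate([candidates[i], candidates[j]])[None, :]
                    wins[i] += self.clf.predict_proba(pair)[0, 1]
        return np.argsort(-wins)  # indices of candidates, best first
```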
arXiv Detail & Related papers (2024-07-22T12:46:22Z) - DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - Multi-objective Differentiable Neural Architecture Search [58.67218773054753]
We propose a novel NAS algorithm that encodes user preferences for the trade-off between performance and hardware metrics.
Our method outperforms existing MOO NAS methods across a broad range of qualitatively different search spaces and datasets.
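One common way to encode a preference between performance and hardware metrics in a differentiable search is to scalarize the objectives with a user-supplied weight vector; the snippet below shows that generic idea only. The preference tuple, the softmax latency proxy, and the function names are assumptions on my part, not the mechanism proposed in the cited paper.

```python
# Generic preference-weighted scalarization of task loss and a hardware metric.
# arch_weights are the continuous architecture parameters of a differentiable
# NAS supernet; op_latencies are per-operator latency estimates (e.g. in ms).

import torch


def expected_latency(arch_weights: torch.Tensor, op_latencies: torch.Tensor) -> torch.Tensor:
    """Differentiable latency proxy: softmax-weighted sum of per-op latency estimates."""
    probs = torch.softmax(arch_weights, dim=-1)  # shape (edges, ops)
    return (probs * op_latencies).sum()


def scalarized_loss(task_loss: torch.Tensor,
                    arch_weights: torch.Tensor,
                    op_latencies: torch.Tensor,
                    preference: tuple = (0.8, 0.2)) -> torch.Tensor:
    """Combine task loss and expected latency according to a user preference vector."""
    w_acc, w_lat = preference
    return w_acc * task_loss + w_lat * expected_latency(arch_weights, op_latencies)
```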
arXiv Detail & Related papers (2024-02-28T10:09:04Z) - Evolution and Efficiency in Neural Architecture Search: Bridging the Gap Between Expert Design and Automated Optimization [1.7385545432331702]
The paper provides a comprehensive overview of Neural Architecture Search.
It emphasizes its evolution from manual design to automated, computationally-driven approaches.
It highlights its application across various domains, including medical imaging and natural language processing.
arXiv Detail & Related papers (2024-02-11T18:27:29Z) - POPNASv3: a Pareto-Optimal Neural Architecture Search Solution for Image
and Time Series Classification [8.190723030003804]
This article presents the third version of a sequential model-based NAS algorithm targeting different hardware environments and multiple classification tasks.
Our method is able to find competitive architectures within large search spaces, while keeping a flexible structure and data processing pipeline to adapt to different tasks.
The experiments performed on images and time series classification datasets provide evidence that POPNASv3 can explore a large set of assorted operators and converge to optimal architectures suited for the type of data provided under different scenarios.
arXiv Detail & Related papers (2022-12-13T17:14:14Z) - EmotionNAS: Two-stream Neural Architecture Search for Speech Emotion
Recognition [48.71010404625924]
We propose a two-stream neural architecture search framework, called EmotionNAS.
Specifically, we take two-stream features (i.e., handcrafted and deep features) as the inputs, followed by NAS to search for the optimal structure for each stream.
Experimental results demonstrate that our method outperforms existing manually-designed and NAS-based models.
arXiv Detail & Related papers (2022-03-25T12:35:44Z) - Towards Tailored Models on Private AIoT Devices: Federated Direct Neural
Architecture Search [22.69123714900226]
We propose a Federated Direct Neural Architecture Search (FDNAS) framework that allows for hardware-friendly NAS from non-IID data across devices.
Experiments on non-IID datasets have shown the state-of-the-art accuracy-efficiency trade-offs achieved by the proposed solution.
arXiv Detail & Related papers (2022-02-23T13:10:01Z) - Differentiable NAS Framework and Application to Ads CTR Prediction [30.74403362212425]
We implement an inference and modular framework for Differentiable Neural Architecture Search (DNAS).
We apply DNAS to the problem of ads click-through rate (CTR) prediction, arguably the highest-value and most worked-on AI problem at hyperscalers today.
We develop and tailor novel search spaces to a Deep Learning Recommendation Model (DLRM) backbone for CTR prediction, and report state-of-the-art results on the Criteo Kaggle CTR prediction dataset.
arXiv Detail & Related papers (2021-10-25T05:46:27Z) - DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weights as random variables, modeled by a Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based optimization.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
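A bare-bones version of the sampling step described in this entry could look like the sketch below: architecture mixing weights on one edge are drawn from a Dirichlet whose concentration parameters are trained through the pathwise (reparameterized) sample. The mixed-op module, the two candidate operations, and the omission of the progressive scheme are simplifications of mine, not the DrNAS implementation.

```python
# Sketch of Dirichlet-distributed architecture mixing weights (DrNAS-style idea).
# torch.distributions.Dirichlet supports pathwise gradients via rsample(), so
# the concentration parameters can be trained with ordinary optimizers.

import torch
import torch.nn as nn


class DirichletMixedOp(nn.Module):
    def __init__(self, ops: nn.ModuleList) -> None:
        super().__init__()
        self.ops = ops
        # Unconstrained parameters; softplus keeps the concentrations positive.
        self.log_concentration = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        concentration = nn.functional.softplus(self.log_concentration) + 1e-3
        weights = torch.distributions.Dirichlet(concentration).rsample()
        return sum(w * op(x) for w, op in zip(weights, self.ops))


# Usage: mix two candidate operations on one edge and backprop through the sample.
ops = nn.ModuleList([nn.Conv2d(8, 8, 3, padding=1), nn.Identity()])
edge = DirichletMixedOp(ops)
y = edge(torch.randn(2, 8, 16, 16))
y.mean().backward()  # gradients reach log_concentration via the pathwise derivative
```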
arXiv Detail & Related papers (2020-06-18T08:23:02Z) - Progressive Automatic Design of Search Space for One-Shot Neural
Architecture Search [15.017964136568061]
It has been observed that a model with higher one-shot model accuracy does not necessarily perform better when stand-alone trained.
We propose Progressive Automatic Design of search space, named PAD-NAS.
In this way, PAD-NAS can automatically design the operations for each layer and achieve a trade-off between search space quality and model diversity.
arXiv Detail & Related papers (2020-05-15T14:21:07Z) - NAS-Count: Counting-by-Density with Neural Architecture Search [74.92941571724525]
We automate the design of counting models with Neural Architecture Search (NAS).
We introduce an end-to-end searched encoder-decoder architecture, Automatic Multi-Scale Network (AMSNet).
arXiv Detail & Related papers (2020-02-29T09:18:17Z)