HHNAS-AM: Hierarchical Hybrid Neural Architecture Search using Adaptive Mutation Policies
- URL: http://arxiv.org/abs/2508.14946v1
- Date: Wed, 20 Aug 2025 09:56:32 GMT
- Title: HHNAS-AM: Hierarchical Hybrid Neural Architecture Search using Adaptive Mutation Policies
- Authors: Anurag Tripathi, Ajeet Kumar Singh, Rajsabi Surya, Aum Gupta, Sahiinii Lemaina Veikho, Dorien Herremans, Sudhir Bisane
- Abstract summary: We propose HHNAS-AM, a novel approach that efficiently explores diverse architectural configurations. Our method employs mutation strategies that dynamically adapt based on performance feedback from previous iterations. We evaluate our approach on the database id (db_id) prediction task, where it consistently discovers high-performing architectures.
- Score: 5.689917817957284
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural Architecture Search (NAS) has garnered significant research interest due to its capability to discover architectures superior to manually designed ones. Learning text representations is crucial for text classification and other language-related tasks. Existing NAS models for text classification lack a hybrid hierarchical structure and place no restrictions on the architecture, so the search space becomes very large and largely redundant, and existing RL models cannot navigate it effectively. Moreover, a flat architecture search yields an unorganised search space that is difficult to traverse. To address this, we propose HHNAS-AM (Hierarchical Hybrid Neural Architecture Search with Adaptive Mutation Policies), a novel approach that efficiently explores diverse architectural configurations. We introduce several architectural templates to search over, which organise the search spaces; the search spaces themselves are designed on the basis of domain-specific cues. Our method employs mutation strategies that dynamically adapt based on performance feedback from previous iterations using Q-learning, enabling a more effective and accelerated traversal of the search space. The proposed model is fully probabilistic, enabling effective exploration of the search space. We evaluate our approach on the database id (db_id) prediction task, where it consistently discovers high-performing architectures across multiple experiments. On the Spider dataset, our method achieves an 8% improvement in test accuracy over existing baselines.
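As a rough illustration of the adaptive-mutation idea, the sketch below shows a tabular Q-learning agent choosing among a handful of mutation operators and being rewarded by the change in validation accuracy of the mutated architecture. This is a minimal sketch, not the authors' implementation: the operator names, the epsilon-greedy schedule, and the single-state Q-table are all illustrative assumptions.

```python
import random

# Minimal sketch (not the paper's implementation): a tabular Q-learning
# agent that picks among a few hypothetical mutation operators and is
# rewarded by the change in validation accuracy of the mutated architecture.
MUTATIONS = ["swap_layer", "widen_layer", "change_activation", "add_skip"]

class AdaptiveMutationPolicy:
    def __init__(self, alpha=0.3, gamma=0.9, epsilon=0.2):
        self.q = {m: 0.0 for m in MUTATIONS}  # single state: one Q-value per operator
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def select(self):
        # epsilon-greedy: explore a random operator or exploit the best one
        if random.random() < self.epsilon:
            return random.choice(MUTATIONS)
        return max(self.q, key=self.q.get)

    def update(self, mutation, reward):
        # standard Q-learning update; with a single state the bootstrap
        # term is simply the max over all current Q-values
        best_next = max(self.q.values())
        self.q[mutation] += self.alpha * (
            reward + self.gamma * best_next - self.q[mutation]
        )

def search(evaluate, mutate, parent, iterations=50):
    """evaluate(arch) -> validation accuracy; mutate(arch, op) -> new arch."""
    policy = AdaptiveMutationPolicy()
    parent_acc = evaluate(parent)
    for _ in range(iterations):
        op = policy.select()
        child = mutate(parent, op)
        child_acc = evaluate(child)
        policy.update(op, reward=child_acc - parent_acc)  # performance feedback
        if child_acc > parent_acc:          # keep the better architecture
            parent, parent_acc = child, child_acc
    return parent, parent_acc
```

The epsilon-greedy selection keeps operator choice probabilistic, loosely mirroring the paper's fully probabilistic framing; a caller supplies its own `evaluate` and `mutate` functions, and a real system would condition the Q-table on richer state such as the current template or recent reward history.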
Related papers
- Evolutionary Architecture Search through Grammar-Based Sequence Alignment [8.631577300185961]
We introduce two adapted variants of the Smith-Waterman algorithm for local sequence alignment and use them to compute the edit distance in a grammar-based evolutionary architecture search.
We highlight how our method vastly improves computational complexity over previous work and enables us to efficiently compute shortest paths between architectures.
Future work can build upon this new tool, discovering novel components that can be used more broadly across neural architecture design, and broadening its applications beyond NAS.
arXiv Detail & Related papers (2025-12-04T16:57:49Z)
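For readers unfamiliar with the base algorithm, the sketch below is the textbook Smith-Waterman local alignment applied to two architectures encoded as operation sequences. It is a minimal illustration under assumed inputs: the scoring scheme and the operation vocabulary are invented here, and the paper's two adapted variants are not reproduced.

```python
# Illustrative only: textbook Smith-Waterman local alignment over two
# architecture token sequences. The paper adapts this algorithm for
# grammar-based search; the scoring values below are assumed defaults.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]   # DP matrix, zero-initialised
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            # local alignment: scores never drop below zero
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best  # highest local-alignment score between the two sequences

# Example: compare two small architectures encoded as operation sequences.
arch1 = ["conv3x3", "relu", "maxpool", "conv3x3", "dense"]
arch2 = ["conv3x3", "relu", "conv3x3", "dense"]
print(smith_waterman(arch1, arch2))
```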
- Transferrable Surrogates in Expressive Neural Architecture Search Spaces [20.539222762754054]
We investigate surrogate model training for improving search in expressive NAS search spaces based on context-free grammars.
We show that i) surrogate models trained either using zero-cost-proxy metrics and neural graph features (GRAF) or by fine-tuning an off-the-shelf LM have high predictive power for the performance of architectures both within and across datasets.
arXiv Detail & Related papers (2025-04-17T14:22:28Z)
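A minimal sketch of the surrogate idea, assuming scikit-learn is available: fit a cheap regressor on per-architecture features and use it to rank unevaluated candidates. The three features stand in for zero-cost proxies and graph features; the paper's GRAF features and LM fine-tuning are not reproduced, and the data here is synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Sketch: fit a surrogate on cheap per-architecture features so the search
# can rank candidates without fully training them. The three feature
# columns are hypothetical stand-ins for zero-cost proxies / graph features.
rng = np.random.default_rng(0)

# Pretend we measured (proxy_score, param_count, depth) for 200 architectures
# that we did train, along with their resulting validation accuracy.
X_train = rng.random((200, 3))
y_train = 0.5 * X_train[:, 0] + 0.1 * X_train[:, 2] + rng.normal(0, 0.02, 200)

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(X_train, y_train)

# Rank a fresh batch of unevaluated candidates by predicted accuracy and
# only train the most promising ones for real.
X_candidates = rng.random((1000, 3))
predicted = surrogate.predict(X_candidates)
top_k = np.argsort(predicted)[::-1][:10]
print("indices of the 10 most promising candidates:", top_k)
```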
- Federated Neural Architecture Search with Model-Agnostic Meta Learning [7.542593703407386]
Federated Neural Architecture Search (NAS) enables collaborative search for optimal model architectures tailored to heterogeneous data to achieve higher accuracy.
We introduce FedMetaNAS, a framework that integrates meta-learning with NAS within the Federated Learning context.
We show that FedMetaNAS significantly accelerates the search process by more than 50% with higher accuracy compared to FedNAS.
arXiv Detail & Related papers (2025-04-08T21:57:40Z)
- Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z)
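To make the grammar-based framing concrete, the sketch below samples architectures from a toy context-free grammar. The production rules are invented for illustration and are far simpler than the paper's search-space grammars.

```python
import random

# Toy context-free grammar over architectures (invented for illustration):
# nonterminals expand recursively until only terminal operation names remain.
GRAMMAR = {
    "ARCH":  [["BLOCK"], ["BLOCK", "ARCH"]],            # one or more blocks
    "BLOCK": [["conv3x3", "ACT"], ["conv5x5", "ACT"], ["maxpool"]],
    "ACT":   [["relu"], ["gelu"]],
}

def sample(symbol="ARCH", max_depth=6, depth=0):
    """Recursively expand a nonterminal into a flat list of operations."""
    if symbol not in GRAMMAR:
        return [symbol]                      # terminal: an operation name
    rules = GRAMMAR[symbol]
    if depth >= max_depth:                   # cap recursion so ARCH terminates
        rules = [r for r in rules if "ARCH" not in r] or rules
    out = []
    for sym in random.choice(rules):
        out.extend(sample(sym, max_depth, depth + 1))
    return out

random.seed(1)
for _ in range(3):
    print(sample())   # e.g. ['conv5x5', 'gelu', 'maxpool', ...]
```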
- Searching a High-Performance Feature Extractor for Text Recognition Network [92.12492627169108]
We design a domain-specific search space by exploring principles for having good feature extractors.
As the space is huge and complexly structured, no existing NAS algorithms can be applied.
We propose a two-stage algorithm to effectively search in the space.
arXiv Detail & Related papers (2022-09-27T03:49:04Z)
- One-Shot Neural Ensemble Architecture Search by Diversity-Guided Search Space Shrinking [97.60915598958968]
We propose a one-shot neural ensemble architecture search (NEAS) solution that addresses the two challenges.
For the first challenge, we introduce a novel diversity-based metric to guide search space shrinking.
For the second challenge, we enable a new search dimension to learn layer sharing among different models for efficiency purposes.
arXiv Detail & Related papers (2021-04-01T16:29:49Z)
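A minimal sketch of a diversity signal for search-space shrinking, under assumptions: each candidate operator is scored by how often models containing it disagree with the rest of the population, and the lowest scorers become pruning candidates. This is not NEAS's actual metric, just the general idea.

```python
import numpy as np

# Sketch of a diversity-guided pruning signal (not NEAS's exact metric):
# score each operator by how much the predictions of models that use it
# disagree with the rest of the population, then drop the operators whose
# models are least diverse.
def prediction_diversity(preds_a, preds_b):
    """Fraction of samples on which two models predict different classes."""
    return float(np.mean(preds_a != preds_b))

def rank_operators(population):
    """population: list of (operator_name, predicted_labels) pairs."""
    scores = {}
    for i, (op_i, preds_i) in enumerate(population):
        others = [p for j, (_, p) in enumerate(population) if j != i]
        scores.setdefault(op_i, []).append(
            np.mean([prediction_diversity(preds_i, p) for p in others])
        )
    # average diversity per operator; low scorers are pruning candidates
    return {op: float(np.mean(v)) for op, v in scores.items()}

rng = np.random.default_rng(0)
population = [(op, rng.integers(0, 10, size=500))
              for op in ["conv3x3", "conv5x5", "skip", "skip"]]
print(rank_operators(population))
```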
- Unchain the Search Space with Hierarchical Differentiable Architecture Search [42.32368267716705]
DAS-based methods mainly focus on searching for a repeatable cell structure, which is then stacked sequentially in multiple stages to form the networks.
We propose a Hierarchical Differentiable Architecture Search (H-DAS) that performs architecture search both at the cell level and at the stage level.
For the stage-level search, we systematically study the architectures of stages, including the number of cells in each stage and the connections between the cells.
arXiv Detail & Related papers (2021-01-11T17:01:43Z)
- AutoRC: Improving BERT Based Relation Classification Models via Architecture Search [50.349407334562045]
BERT based relation classification (RC) models have achieved significant improvements over the traditional deep learning models.
However, no consensus has been reached on what the optimal architecture is.
We design a comprehensive search space for BERT based RC models and employ neural architecture search (NAS) method to automatically discover the design choices.
arXiv Detail & Related papers (2020-09-22T16:55:49Z)
- Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search [94.46818035655943]
We propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a large space.
With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves the search efficiency and finds better architectures than existing NAS methods.
arXiv Detail & Related papers (2020-07-07T02:29:06Z)
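As a sketch of the curriculum idea, the code below searches a tiny operator vocabulary first and then reuses the best architecture found as a warm start when the vocabulary grows. The operator sets, the mutation-based local search, and the toy objective are all illustrative assumptions, not CNAS's algorithm.

```python
import random

# Sketch of curriculum search (not CNAS's algorithm): begin with a tiny
# operator vocabulary, search, then enlarge the vocabulary and warm-start
# from the best architecture found in the smaller space.
CURRICULUM = [
    ["conv3x3", "relu"],                                  # stage 1: tiny space
    ["conv3x3", "conv5x5", "relu", "maxpool"],            # stage 2: bigger
    ["conv3x3", "conv5x5", "relu", "maxpool", "skip"],    # stage 3: full
]

def random_arch(ops, length=6):
    return [random.choice(ops) for _ in range(length)]

def local_search(evaluate, ops, start, budget=100):
    best, best_score = start, evaluate(start)
    for _ in range(budget):
        cand = list(best)
        cand[random.randrange(len(cand))] = random.choice(ops)  # one-op mutation
        score = evaluate(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

def curriculum_search(evaluate):
    arch = random_arch(CURRICULUM[0])
    for ops in CURRICULUM:                 # knowledge carries across stages
        arch, score = local_search(evaluate, ops, arch)
    return arch, score

# Toy objective: prefer architectures with many distinct operations.
random.seed(0)
print(curriculum_search(lambda a: len(set(a))))
```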
- AutoSTR: Efficient Backbone Search for Scene Text Recognition [80.7290173000068]
Scene text recognition (STR) is very challenging due to the diversity of text instances and the complexity of scenes.
We propose automated STR (AutoSTR) to search data-dependent backbones to boost text recognition performance.
Experiments demonstrate that, by searching data-dependent backbones, AutoSTR can outperform the state-of-the-art approaches on standard benchmarks.
arXiv Detail & Related papers (2020-03-14T06:51:04Z)