Neural Architecture Search of SPD Manifold Networks
- URL: http://arxiv.org/abs/2010.14535v4
- Date: Sun, 13 Jun 2021 21:24:32 GMT
- Title: Neural Architecture Search of SPD Manifold Networks
- Authors: Rhea Sanjay Sukthanker, Zhiwu Huang, Suryansh Kumar, Erik Goron Endsjo, Yan Wu, Luc Van Gool
- Abstract summary: We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
- Score: 79.45110063435617
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a new neural architecture search (NAS) problem of
Symmetric Positive Definite (SPD) manifold networks, aiming to automate the
design of SPD neural architectures. To address this problem, we first introduce
a geometrically rich and diverse SPD neural architecture search space for an
efficient SPD cell design. Further, we model our new NAS problem with a
one-shot training process of a single supernet. Based on the supernet modeling,
we exploit a differentiable NAS algorithm on our relaxed continuous search
space for SPD neural architecture search. Statistical evaluation of our method
on drone, action, and emotion recognition tasks mostly provides better results
than the state-of-the-art SPD networks and traditional NAS algorithms.
Empirical results show that our algorithm excels in discovering better-performing
SPD network designs and provides models that are more than three times lighter
than those searched by state-of-the-art NAS algorithms.
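To make the relaxed search space concrete, below is a minimal PyTorch sketch of a DARTS-style mixed operation over SPD-preserving candidates, built from the BiMap (bilinear mapping) and ReEig (eigenvalue rectification) layers popularized by SPDNet. All class names are illustrative, and the plain softmax-weighted sum is a simplification: a convex combination of SPD matrices stays SPD, but the paper's actual mixing is manifold-aware rather than this Euclidean shortcut.

```python
import torch
import torch.nn as nn

class BiMap(nn.Module):
    """Bilinear map X -> W X W^T; with (almost surely) full-rank W, SPD stays SPD."""
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Parameter(torch.eye(dim) + 0.01 * torch.randn(dim, dim))

    def forward(self, X):
        return self.W @ X @ self.W.t()

class ReEig(nn.Module):
    """Eigenvalue rectification: clamp eigenvalues from below, an SPD analogue of ReLU."""
    def __init__(self, eps=1e-4):
        super().__init__()
        self.eps = eps

    def forward(self, X):
        evals, evecs = torch.linalg.eigh(X)
        return evecs @ torch.diag(evals.clamp(min=self.eps)) @ evecs.t()

class MixedSPDOp(nn.Module):
    """DARTS-style mixed op: softmax-weighted sum of candidate SPD operations.
    Softmax weights are nonnegative and sum to 1, so the output stays SPD."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Sequential(BiMap(dim), ReEig()),  # transform then rectify
            ReEig(),                             # rectify only
            nn.Identity(),                       # skip connection
        ])
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))  # architecture logits

    def forward(self, X):
        w = torch.softmax(self.alpha, dim=0)
        return sum(wi * op(X) for wi, op in zip(w, self.ops))

# Toy usage on a random 8x8 SPD matrix.
A = torch.randn(8, 8)
X = A @ A.t() + 1e-3 * torch.eye(8)
Y = MixedSPDOp(8)(X)
```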
Related papers
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using heuristic algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
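The block-wise idea can be illustrated with a toy sketch: align teacher and student blocks, feed each student block the teacher's previous feature, and rate a candidate by its summed per-block imitation loss. Every module name and shape below is hypothetical, not the DNA codebase; a real rating would use a trained teacher and a proper supernet.

```python
import itertools
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical two-block setup; real DNA distills from a trained teacher.
teacher_blocks = nn.ModuleList([nn.Conv2d(3, 8, 3, padding=1),
                                nn.Conv2d(8, 8, 3, padding=1)])
candidate_ops = [  # interchangeable student operations per block
    [nn.Conv2d(3, 8, 3, padding=1), nn.Conv2d(3, 8, 5, padding=2)],
    [nn.Conv2d(8, 8, 3, padding=1), nn.Conv2d(8, 8, 5, padding=2)],
]

def rate_architecture(choice, x):
    """Sum of per-block distillation losses for one candidate (one op index per block)."""
    loss, feat = 0.0, x
    for b, op_idx in enumerate(choice):
        target = teacher_blocks[b](feat)        # teacher feature for block b
        pred = candidate_ops[b][op_idx](feat)   # student op fed the teacher's input
        loss = loss + F.mse_loss(pred, target)
        feat = target                           # blocks decouple via teacher features
    return float(loss)

# Because blocks decouple, *all* candidates can be rated cheaply.
x = torch.randn(1, 3, 16, 16)
scores = {c: rate_architecture(c, x)
          for c in itertools.product(range(2), repeat=2)}
best = min(scores, key=scores.get)
```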
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
- DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit CNNs [53.82853297675979]
1-bit convolutional neural networks (CNNs) with binary weights and activations show their potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
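For context, the standard 1-bit building block such methods search over is sign binarization with a straight-through gradient estimator. The sketch below is this generic technique, not DCP-NAS's child-parent discrepancy loss.

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator (STE) backward."""

    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        # Surrogate gradient: pass through only where |w| <= 1 (hard tanh).
        return grad_out * (w.abs() <= 1).to(grad_out.dtype)

w = torch.randn(4, requires_grad=True)
BinarizeSTE.apply(w).sum().backward()  # binarized forward, STE backward
print(w.grad)
```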
arXiv Detail & Related papers (2023-06-27T11:28:29Z)
- GPT-NAS: Evolutionary Neural Architecture Search with the Generative Pre-Trained Model [25.187467297581073]
This work presents a novel architecture search algorithm, called GPT-NAS, that optimizes neural architectures using a Generative Pre-Trained (GPT) model.
In GPT-NAS, we assume that a generative model pre-trained on a large-scale corpus could learn the fundamental law of building neural architectures.
Our GPT-NAS method significantly outperforms seven manually designed neural architectures and thirteen architectures provided by competing NAS methods.
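Stripped of the generative prior, the evolutionary backbone such a method searches with looks roughly like the toy loop below; the op vocabulary, the proxy fitness, and the mutation scheme are all illustrative stand-ins.

```python
import random

OPS = ["conv3x3", "conv5x5", "skip", "maxpool"]  # illustrative op vocabulary

def fitness(arch):
    """Stand-in for validation accuracy; a real run trains and evaluates the net."""
    return sum(op != "skip" for op in arch) + random.random()

def mutate(arch, rate=0.2):
    return [random.choice(OPS) if random.random() < rate else op for op in arch]

population = [[random.choice(OPS) for _ in range(6)] for _ in range(16)]
for _ in range(10):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]
    # GPT-NAS additionally passes offspring through a pretrained generative
    # model to reconstruct them toward plausible designs (omitted here).
    population = parents + [mutate(random.choice(parents)) for _ in range(12)]
best = max(population, key=fitness)
```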
arXiv Detail & Related papers (2023-05-09T11:29:42Z)
- SpiderNet: Hybrid Differentiable-Evolutionary Architecture Search via Train-Free Metrics [0.0]
Neural Architecture Search (NAS) algorithms are intended to remove the burden of manual neural network design.
NAS algorithms require a variety of design parameters in the form of user configuration or hard-coded decisions, which limits the variety of networks that can be discovered.
We present SpiderNet, a hybrid differentiable-evolutionary and hardware-aware algorithm that rapidly and efficiently produces state-of-the-art networks.
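Train-free (zero-cost) metrics score an untrained network from a single forward pass. The sketch below implements one well-known example in the spirit of NASWOT, scoring a net by the log-determinant of its ReLU activation-pattern kernel; SpiderNet's own metrics may differ.

```python
import torch
import torch.nn as nn

def naswot_score(model, x):
    """Record each input's binary ReLU activation pattern and return log|K|,
    where K_ij counts agreeing activations between inputs i and j."""
    codes = []

    def hook(_module, _inp, out):
        codes.append((out > 0).flatten(1).float())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        model(x)
    for h in handles:
        h.remove()
    c = torch.cat(codes, dim=1)              # (batch, total ReLU units)
    k = c @ c.t() + (1 - c) @ (1 - c).t()    # activation agreements
    return torch.slogdet(k)[1].item()

net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 32), nn.ReLU())
print(naswot_score(net, torch.randn(8, 16)))
```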
arXiv Detail & Related papers (2022-04-20T08:55:01Z)
- D-DARTS: Distributed Differentiable Architecture Search [75.12821786565318]
Differentiable ARchiTecture Search (DARTS) is one of the most trending Neural Architecture Search (NAS) methods.
We propose D-DARTS, a novel solution that addresses this problem by nesting several neural networks at cell-level.
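The cell-level nesting can be contrasted with vanilla DARTS in a few lines: instead of one architecture-parameter tensor shared by all cells, each cell keeps its own and can specialize independently. Sizes below are illustrative, not D-DARTS's exact configuration.

```python
import torch
import torch.nn as nn

N_CELLS, N_EDGES, N_OPS = 8, 14, 7  # illustrative search-space sizes

# Vanilla DARTS: one set of architecture logits shared by every cell.
shared_alpha = nn.Parameter(1e-3 * torch.randn(N_EDGES, N_OPS))

# Cell-level nesting: each cell owns its architecture logits.
per_cell_alpha = nn.ParameterList(
    [nn.Parameter(1e-3 * torch.randn(N_EDGES, N_OPS)) for _ in range(N_CELLS)]
)

def edge_weights(alpha):
    """Row-wise softmax turns an edge's logits into op mixing weights."""
    return torch.softmax(alpha, dim=-1)

w_cell3 = edge_weights(per_cell_alpha[3])  # weights for cell 3 only
```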
arXiv Detail & Related papers (2021-08-20T09:07:01Z)
- Poisoning the Search Space in Neural Architecture Search [0.0]
We evaluate the robustness of one such algorithm known as Efficient NAS against data poisoning attacks on the original search space.
Our results provide insights into the challenges to surmount in using NAS for more adversarially robust architecture search.
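A data poisoning attack in its simplest form flips a fraction of training labels, corrupting the reward signal that guides the NAS controller. The sketch below shows this generic attack, not necessarily the paper's exact construction.

```python
import random

def flip_labels(dataset, fraction, num_classes, seed=0):
    """Corrupt a fraction of (input, label) pairs by flipping to a wrong class."""
    rng = random.Random(seed)
    poisoned = list(dataset)
    for i in rng.sample(range(len(poisoned)), int(fraction * len(poisoned))):
        x, y = poisoned[i]
        poisoned[i] = (x, rng.choice([c for c in range(num_classes) if c != y]))
    return poisoned

clean = [(f"img_{i}", i % 10) for i in range(100)]  # toy dataset
dirty = flip_labels(clean, fraction=0.3, num_classes=10)
```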
arXiv Detail & Related papers (2021-06-28T05:45:57Z)
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, Search to Aggregate NEighborhood (SANE), to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement learning based methods.
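The searchable component here is the aggregator itself. Below is a minimal sketch of a softmax-relaxed choice among sum, mean, and max neighborhood aggregation, using a dense adjacency matrix for simplicity; class and variable names are illustrative, not SANE's API.

```python
import torch
import torch.nn as nn

class MixedAggregator(nn.Module):
    """Softmax-relaxed choice among candidate neighborhood aggregators."""
    def __init__(self):
        super().__init__()
        self.alpha = nn.Parameter(1e-3 * torch.randn(3))  # one logit per aggregator

    def forward(self, h, adj):
        # h: (N, D) node features; adj: (N, N) binary adjacency with self-loops.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = torch.where(adj.bool().unsqueeze(-1), h.unsqueeze(0),
                            torch.tensor(float("-inf")))
        candidates = torch.stack([
            adj @ h,                  # sum aggregation
            (adj @ h) / deg,          # mean aggregation
            neigh.max(dim=1).values,  # max aggregation
        ])
        w = torch.softmax(self.alpha, dim=0)
        return (w.view(-1, 1, 1) * candidates).sum(dim=0)

h = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj.fill_diagonal_(1.0)  # self-loops keep every neighborhood non-empty
out = MixedAggregator()(h, adj)
```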
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
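A common way to encode such a trade-off in one-shot or differentiable search is to regularize the task loss with the expected latency of the relaxed architecture. The sketch below shows this standard hardware-aware surrogate, not MS-RANAS's exact formulation; all sizes are illustrative.

```python
import torch

def expected_latency(alpha, op_latency):
    """Expected latency of a softmax-relaxed cell: probability-weighted sum
    of per-op latencies over all edges."""
    probs = torch.softmax(alpha, dim=-1)  # (edges, ops)
    return (probs * op_latency).sum()

alpha = torch.randn(14, 7, requires_grad=True)  # architecture logits
op_latency = torch.rand(7)                      # e.g. measured per-op latencies
task_loss = torch.tensor(1.0)                   # stand-in for the accuracy term
loss = task_loss + 0.1 * expected_latency(alpha, op_latency)
loss.backward()  # latency gradients flow into the architecture logits
```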
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search [9.038625856798227]
We propose a neural-predictor-guided evolutionary algorithm to enhance the exploration ability of evolutionary algorithms (EAs) for neural architecture search (NAS).
NPENAS-BO and NPENAS-NP outperform most existing NAS algorithms.
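The predictor-guided step can be sketched as: mutate parents into many children, rank them with a learned predictor, and send only the top few to real evaluation. The flat vector encoding and MLP below are illustrative simplifications; NPENAS itself works on graph encodings of cells.

```python
import torch
import torch.nn as nn

# A tiny MLP predictor scores architecture encodings (illustrative shapes).
predictor = nn.Sequential(nn.Linear(12, 32), nn.ReLU(), nn.Linear(32, 1))

def evolve_step(population, n_children=20, top_k=5):
    """One predictor-guided generation: mutate, rank with the predictor,
    and keep only the most promising candidates for true evaluation."""
    children = [p + 0.1 * torch.randn_like(p) for p in population
                for _ in range(n_children // len(population))]
    with torch.no_grad():
        scores = torch.cat([predictor(c.unsqueeze(0)) for c in children])
    best = torch.topk(scores.squeeze(1), k=top_k).indices
    return [children[i] for i in best]  # candidates sent to real evaluation

population = [torch.randn(12) for _ in range(5)]
survivors = evolve_step(population)
```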
arXiv Detail & Related papers (2020-03-28T17:56:31Z)