HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel
Neural Architecture Search
- URL: http://arxiv.org/abs/2304.11701v1
- Date: Sun, 23 Apr 2023 17:27:40 GMT
- Title: HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel
Neural Architecture Search
- Authors: Di Wang, Bo Du, Liangpei Zhang, and Dacheng Tao
- Abstract summary: We propose to directly generate structural parameters by utilizing the specifically designed hyper kernels.
We obtain three kinds of networks to separately conduct pixel-level or image-level classifications with 1-D or 3-D convolutions.
A series of experiments on six public datasets demonstrate that the proposed methods achieve state-of-the-art results.
- Score: 104.45426861115972
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent neural architecture search (NAS) based approaches have made great
progress in hyperspectral image (HSI) classification tasks. However, the
architectures are usually optimized independently of the network weights,
which increases search time and limits model performance. To tackle these
issues, and unlike previous methods that define extra
structural parameters, we propose to directly generate the structural parameters
from specifically designed hyper kernels, converting the
original complex two-tier optimization problem into an easily implemented one-tier
optimization and greatly reducing search costs. We then develop a
hierarchical multi-module search space whose candidate operations contain only
convolutions, and these operations can be integrated into unified kernels.
Using this search strategy and search space, we obtain three kinds
of networks that separately conduct pixel-level or image-level classification
with 1-D or 3-D convolutions. In addition, by combining the proposed hyper
kernel search scheme with a 3-D convolution decomposition mechanism, we
obtain diverse architectures that simulate 3-D convolutions, greatly improving
network flexibility. A series of quantitative and qualitative experiments on
six public datasets demonstrates that the proposed methods achieve
state-of-the-art results compared with other advanced NAS-based HSI
classification approaches.
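The core one-tier idea (deriving the architecture weights from the same parameters that also form the candidate kernels, so there is only one set of variables to optimize) can be sketched in a few lines of plain Python. The even split of the hyper kernel, the norm-based scoring rule, and all function names below are our illustrative assumptions, not the paper's actual formulation:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def derive_structural_params(hyper_kernel, num_candidates):
    """Hypothetical hyper-kernel scheme: one shared weight vector per layer
    yields both the candidate kernels and the structural (architecture)
    weights, so no separate architecture variables are needed."""
    # Split the hyper kernel into equal-size sub-kernels, one per candidate op.
    size = len(hyper_kernel) // num_candidates
    sub_kernels = [hyper_kernel[i * size:(i + 1) * size]
                   for i in range(num_candidates)]
    # Illustrative rule: score each candidate by the L2 norm of its sub-kernel,
    # then softmax the scores into architecture weights.
    scores = [math.sqrt(sum(w * w for w in k)) for k in sub_kernels]
    return sub_kernels, softmax(scores)

# The architecture weights (alphas) and the candidate kernels come from the
# same underlying parameters, collapsing the bilevel search into one tier.
kernels, alphas = derive_structural_params([0.2, -0.5, 1.0, 0.1, 0.0, 0.3], 3)
```

Because updating the hyper kernel simultaneously updates both the kernels and the alphas, a single gradient step plays the role that alternating weight/architecture updates play in two-tier NAS.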
Related papers
- Real-Time Image Segmentation via Hybrid Convolutional-Transformer Architecture Search [49.81353382211113]
We address the challenge of integrating multi-head self-attention into high resolution representation CNNs efficiently.
We develop a multi-target multi-branch supernet method, which fully utilizes the advantages of high-resolution features.
We present a series of models obtained via the Hybrid Convolutional-Transformer Architecture Search (HyCTAS) method, which searches for the best hybrid combination of lightweight convolution layers and memory-efficient self-attention layers.
arXiv Detail & Related papers (2024-03-15T15:47:54Z) - HASA: Hybrid Architecture Search with Aggregation Strategy for
Echinococcosis Classification and Ovary Segmentation in Ultrasound Images [0.0]
We propose a hybrid NAS framework for ultrasound (US) image classification and segmentation.
Our method can generate more powerful and lightweight models for the above US image classification and segmentation tasks.
arXiv Detail & Related papers (2022-04-14T01:43:00Z) - Triple-level Model Inferred Collaborative Network Architecture for Video
Deraining [43.06607185181434]
We develop a model-guided triple-level optimization framework that deduces the network architecture through a cooperative optimization and auto-searching mechanism.
Our model shows significant improvements in fidelity and temporal consistency over the state-of-the-art works.
arXiv Detail & Related papers (2021-11-08T13:09:00Z) - iDARTS: Differentiable Architecture Search with Stochastic Implicit
Gradients [75.41173109807735]
Differentiable ARchiTecture Search (DARTS) has recently become the mainstream approach to neural architecture search (NAS).
We tackle the hypergradient computation in DARTS based on the implicit function theorem.
We show that the architecture optimisation with the proposed method, named iDARTS, is expected to converge to a stationary point.
arXiv Detail & Related papers (2021-06-21T00:44:11Z) - One-Shot Neural Ensemble Architecture Search by Diversity-Guided Search
Space Shrinking [97.60915598958968]
We propose a one-shot neural ensemble architecture search (NEAS) solution that addresses the two challenges.
For the first challenge, we introduce a novel diversity-based metric to guide search space shrinking.
For the second challenge, we enable a new search dimension to learn layer sharing among different models for efficiency purposes.
arXiv Detail & Related papers (2021-04-01T16:29:49Z) - Trilevel Neural Architecture Search for Efficient Single Image
Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
For modeling the discrete search space, we apply a new continuous relaxation on the discrete search spaces to build a hierarchical mixture of network-path, cell-operations, and kernel-width.
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
arXiv Detail & Related papers (2021-01-17T12:19:49Z) - 3D-ANAS: 3D Asymmetric Neural Architecture Search for Fast Hyperspectral
Image Classification [5.727964191623458]
Hyperspectral images involve abundant spectral and spatial information, playing an irreplaceable role in land-cover classification.
Recently, based on deep learning technologies, an increasing number of HSI classification approaches have been proposed, which demonstrate promising performance.
Previous studies suffer from two major drawbacks: 1) the architecture of most deep learning models is designed manually, which relies on specialized knowledge and is relatively tedious.
arXiv Detail & Related papers (2021-01-12T04:15:40Z) - AlphaGAN: Fully Differentiable Architecture Search for Generative
Adversarial Networks [15.740179244963116]
Generative Adversarial Networks (GANs) are formulated as minimax game problems, whereby generators attempt to approach real data distributions by virtue of adversarial learning against discriminators.
In this work, we aim to boost model learning from the perspective of network architectures, by incorporating recent progress on automated architecture search into GANs.
We propose a fully differentiable search framework for generative adversarial networks, dubbed alphaGAN.
arXiv Detail & Related papers (2020-06-16T13:27:30Z) - DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.