Neural Architecture Search on Acoustic Scene Classification
- URL: http://arxiv.org/abs/1912.12825v2
- Date: Wed, 5 Aug 2020 04:58:06 GMT
- Title: Neural Architecture Search on Acoustic Scene Classification
- Authors: Jixiang Li, Chuming Liang, Bo Zhang, Zhao Wang, Fei Xiang, Xiangxiang Chu
- Abstract summary: We propose a lightweight yet high-performing baseline network inspired by MobileNetV2.
We explore a dynamic architecture space built on the basis of the proposed baseline.
Experimental results demonstrate that our searched network is competent in ASC tasks.
- Score: 13.529070650030313
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional neural networks are widely adopted in Acoustic Scene
Classification (ASC) tasks, but they generally carry a heavy computational
burden. In this work, we propose a lightweight yet high-performing baseline
network inspired by MobileNetV2, which replaces square convolutional kernels
with unidirectional ones to extract features alternately in temporal and
frequency dimensions. Furthermore, we explore a dynamic architecture space
built on the basis of the proposed baseline with the recent Neural Architecture
Search (NAS) paradigm, which first trains a supernet that incorporates all
candidate networks and then applies a well-known evolutionary algorithm NSGA-II
to discover more efficient networks with higher accuracy and lower
computational cost. Experimental results demonstrate that our searched network
is competent in ASC tasks, which achieves 90.3% F1-score on the DCASE2018 task
5 evaluation set, marking a new state-of-the-art performance while saving 25%
of FLOPs compared to our baseline network.
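The search stage summarized above ranks candidate networks sampled from the supernet by two competing objectives: accuracy (to maximize) and computational cost in FLOPs (to minimize). The core of the NSGA-II step is Pareto-dominance sorting; a minimal stdlib-only sketch of that selection idea follows. The function names and the candidate numbers are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of NSGA-II-style non-dominated selection over
# (accuracy, FLOPs), as used to pick efficient networks from a supernet.
# All names and example values here are hypothetical.

def dominates(a, b):
    """True if candidate a = (accuracy, flops) Pareto-dominates b:
    a is at least as accurate and at least as cheap, and strictly
    better in one of the two objectives."""
    acc_a, flops_a = a
    acc_b, flops_b = b
    return (acc_a >= acc_b and flops_a <= flops_b) and \
           (acc_a > acc_b or flops_a < flops_b)

def pareto_front(candidates):
    """Return the non-dominated subset (the first NSGA-II front)."""
    return [c for c in candidates
            if not any(dominates(other, c)
                       for other in candidates if other != c)]

if __name__ == "__main__":
    # (accuracy %, MFLOPs) for some hypothetical sampled architectures
    cands = [(90.3, 75.0), (89.1, 60.0), (90.1, 90.0), (88.0, 100.0)]
    print(pareto_front(cands))  # keeps only the non-dominated trade-offs
```

In a full NSGA-II loop, repeated non-dominated sorting plus crowding-distance selection drives the evolutionary search toward the accuracy/FLOPs trade-off frontier; the sketch above only shows the dominance test at its core.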
Related papers
- TS-ENAS: Two-Stage Evolution for Cell-based Network Architecture Search [3.267963071384687]
We propose a Two-Stage Evolution for cell-based Network Architecture Search (TS-ENAS).
In our algorithm, a new cell-based search space and an effective two-stage encoding method are designed to represent cells and neural network structures.
The experimental results show that TS-ENAS can more effectively find neural network architectures with comparable performance.
arXiv Detail & Related papers (2023-10-14T08:02:01Z)
- A General-Purpose Transferable Predictor for Neural Architecture Search [22.883809911265445]
We propose a general-purpose neural predictor for Neural Architecture Search (NAS) that can transfer across search spaces.
Experimental results on NAS-Bench-101, 201 and 301 demonstrate the efficacy of our scheme.
arXiv Detail & Related papers (2023-02-21T17:28:05Z)
- Evolutionary Neural Cascade Search across Supernetworks [68.8204255655161]
We introduce ENCAS - Evolutionary Neural Cascade Search.
ENCAS can be used to search over multiple pretrained supernetworks.
We test ENCAS on common computer vision benchmarks.
arXiv Detail & Related papers (2022-03-08T11:06:01Z)
- SAR-NAS: Skeleton-based Action Recognition via Neural Architecture Searching [18.860051578038608]
We encode a skeleton-based action instance into a tensor and define a set of operations to build two types of network cells: normal cells and reduction cells.
Experiments on the challenging NTU RGB+D and Kinetics datasets have verified that most of the networks developed to date for skeleton-based action recognition are likely not compact and efficient.
The proposed method provides an approach to search for such a compact network that is able to achieve comparable or even better performance than the state-of-the-art methods.
arXiv Detail & Related papers (2020-10-29T03:24:15Z)
- Hierarchical Neural Architecture Search for Deep Stereo Matching [131.94481111956853]
We propose the first end-to-end hierarchical NAS framework for deep stereo matching.
Our framework incorporates task-specific human knowledge into the neural architecture search framework.
It ranks first in accuracy on the KITTI stereo 2012, 2015, and Middlebury benchmarks, as well as first on the SceneFlow dataset.
arXiv Detail & Related papers (2020-10-26T11:57:37Z)
- EagerNet: Early Predictions of Neural Networks for Computationally Efficient Intrusion Detection [2.223733768286313]
We propose a new architecture to detect network attacks with minimal resources.
The architecture is able to deal with either binary or multiclass classification problems and trades prediction speed for the accuracy of the network.
arXiv Detail & Related papers (2020-07-27T11:31:37Z)
- FBNetV3: Joint Architecture-Recipe Search using Predictor Pretraining [65.39532971991778]
We present an accuracy predictor that scores architecture and training recipes jointly, guiding both sample selection and ranking.
We run fast evolutionary searches in just CPU minutes to generate architecture-recipe pairs for a variety of resource constraints.
FBNetV3 yields a family of state-of-the-art compact neural networks that outperform both automatically and manually designed competitors.
arXiv Detail & Related papers (2020-06-03T05:20:21Z)
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
- Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose the use of evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we manipulate the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to come up with a tiny yet efficient binary neural architecture by exploring the best candidates of the group convolution.
arXiv Detail & Related papers (2020-05-13T13:25:51Z)
- Network Adjustment: Channel Search Guided by FLOPs Utilization Ratio [101.84651388520584]
This paper presents a new framework named network adjustment, which considers network accuracy as a function of FLOPs.
Experiments on standard image classification datasets and a wide range of base networks demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-06T15:51:00Z)
- Fast Neural Network Adaptation via Parameter Remapping and Architecture Search [35.61441231491448]
Deep neural networks achieve remarkable performance in many computer vision tasks.
Most state-of-the-art (SOTA) semantic segmentation and object detection approaches reuse neural network architectures designed for image classification as the backbone.
One major challenge, though, is that ImageNet pre-training of the search space representation incurs a huge computational cost.
In this paper, we propose a Fast Neural Network Adaptation (FNA) method, which can adapt both the architecture and parameters of a seed network.
arXiv Detail & Related papers (2020-01-08T13:45:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.