NAS-TC: Neural Architecture Search on Temporal Convolutions for Complex
Action Recognition
- URL: http://arxiv.org/abs/2104.01110v1
- Date: Wed, 17 Mar 2021 02:02:11 GMT
- Title: NAS-TC: Neural Architecture Search on Temporal Convolutions for Complex
Action Recognition
- Authors: Pengzhen Ren, Gang Xiao, Xiaojun Chang, Yun Xiao, Zhihui Li, and
Xiaojiang Chen
- Abstract summary: We propose a new processing framework called Neural Architecture Search-Temporal Convolutional (NAS-TC).
In the first phase, a classical CNN is used as the backbone network to complete the computationally intensive feature extraction task.
In the second phase, a simple cell-based stitching search is used to complete the relatively lightweight extraction of long-range temporal-dependency information.
- Score: 45.168746142597946
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the field of complex action recognition in videos, the quality of the
designed model plays a crucial role in the final performance. However,
manually designed network structures often rely heavily on the researchers'
knowledge and experience. Because it automates the design of the network
structure, neural architecture search (NAS) has achieved great success in the
image processing field and has attracted substantial research attention in
recent years. Although some NAS methods have reduced the number of GPU search
days required in the image field to single digits, directly extending NAS to
the video field with 3D convolutions is still likely to cause a surge in
computational cost. To address this challenge, we propose a new processing
framework called Neural Architecture Search-Temporal Convolutional (NAS-TC).
Our proposed framework is divided into two phases. In the first phase, a
classical CNN is used as the backbone network to complete the computationally
intensive feature extraction task. In the second phase, a simple cell-based
stitching search is used to complete the relatively lightweight extraction of
long-range temporal-dependency information. This ensures that our method
assigns parameters more reasonably and can handle minute-level videos.
Finally, we conduct extensive experiments on multiple benchmark datasets and
obtain competitive recognition accuracy.
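To make the two-phase design concrete, below is a minimal PyTorch sketch of a NAS-TC-style pipeline: a frozen 2D CNN backbone extracts per-frame features (phase one), and stacked temporal-convolution cells, whose kernel sizes and dilations stand in for the searched cell choices, aggregate long-range temporal information (phase two). The class names and the fixed cell specification are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of a two-phase NAS-TC-style pipeline (not the authors' code).
# Phase 1: a fixed 2D CNN backbone extracts per-frame features.
# Phase 2: lightweight temporal-convolution cells aggregate long-range
# temporal information; their kernel size/dilation stand in for searched choices.
import torch
import torch.nn as nn
from torchvision.models import resnet50

class TemporalCell(nn.Module):
    """One temporal-convolution cell; kernel_size and dilation are
    stand-ins for the architectural choices a cell search would pick."""
    def __init__(self, channels, kernel_size, dilation):
        super().__init__()
        padding = (kernel_size - 1) * dilation // 2   # keep frame count fixed
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=padding, dilation=dilation)
        self.norm = nn.BatchNorm1d(channels)
        self.act = nn.ReLU()

    def forward(self, x):                 # x: (batch, channels, frames)
        return self.act(self.norm(self.conv(x))) + x  # residual connection

class NASTCSketch(nn.Module):
    def __init__(self, num_classes, cell_specs):
        super().__init__()
        backbone = resnet50(weights=None)
        backbone.fc = nn.Identity()        # keep 2048-d per-frame features
        for p in backbone.parameters():    # phase 1 is not searched or trained
            p.requires_grad = False
        self.backbone = backbone
        self.cells = nn.Sequential(*[TemporalCell(2048, k, d)
                                     for k, d in cell_specs])
        self.head = nn.Linear(2048, num_classes)

    def forward(self, video):              # video: (B, T, 3, H, W)
        b, t = video.shape[:2]
        feats = self.backbone(video.flatten(0, 1))    # (B*T, 2048)
        feats = feats.view(b, t, -1).transpose(1, 2)  # (B, 2048, T)
        feats = self.cells(feats).mean(dim=2)         # temporal pooling
        return self.head(feats)

model = NASTCSketch(num_classes=10, cell_specs=[(3, 1), (3, 2), (3, 4)])
logits = model(torch.randn(2, 16, 3, 224, 224))       # 2 clips, 16 frames each
print(logits.shape)                                   # torch.Size([2, 10])
```

Because the expensive per-frame backbone runs only once per frame, and the searched cells operate on compact 1D feature sequences, the search cost stays far below that of searching directly over 3D convolutions.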
Related papers
- Lightweight Neural Architecture Search for Temporal Convolutional
Networks at the Edge [21.72253397805102]
This work focuses in particular on Temporal Convolutional Networks (TCNs), a class of convolutional models for time-series processing.
We propose the first NAS tool that explicitly targets the optimization of the architectural parameters peculiar to TCNs (see the sketch after this list).
We test the proposed NAS on four real-world, edge-relevant tasks, involving audio and bio-signals.
arXiv Detail & Related papers (2023-01-24T19:47:40Z)
- PV-NAS: Practical Neural Architecture Search for Video Recognition [83.77236063613579]
Deep neural networks for video tasks are highly customized, and designing such networks requires domain experts and costly trial-and-error tests.
Recent advances in network architecture search have boosted image recognition performance by a large margin.
In this study, we propose a practical solution, namely Practical Video Neural Architecture Search (PV-NAS).
arXiv Detail & Related papers (2020-11-02T08:50:23Z)
- Hierarchical Neural Architecture Search for Deep Stereo Matching [131.94481111956853]
We propose the first end-to-end hierarchical NAS framework for deep stereo matching.
Our framework incorporates task-specific human knowledge into the neural architecture search framework.
It ranks first in accuracy on the KITTI Stereo 2012, 2015, and Middlebury benchmarks, as well as first on the SceneFlow dataset.
arXiv Detail & Related papers (2020-10-26T11:57:37Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- FNA++: Fast Network Adaptation via Parameter Remapping and Architecture Search [35.61441231491448]
We propose a Fast Network Adaptation (FNA++) method, which can adapt both the architecture and parameters of a seed network.
In our experiments, we apply FNA++ on MobileNetV2 to obtain new networks for semantic segmentation, object detection, and human pose estimation.
The total computation cost of FNA++ is significantly less than SOTA segmentation and detection NAS approaches.
arXiv Detail & Related papers (2020-06-21T10:03:34Z)
- NAS-Count: Counting-by-Density with Neural Architecture Search [74.92941571724525]
We automate the design of counting models with Neural Architecture Search (NAS).
We introduce an end-to-end searched encoder-decoder architecture, Automatic Multi-Scale Network (AMSNet).
arXiv Detail & Related papers (2020-02-29T09:18:17Z)
- Fast Neural Network Adaptation via Parameter Remapping and Architecture Search [35.61441231491448]
Deep neural networks achieve remarkable performance in many computer vision tasks.
Most state-of-the-art (SOTA) semantic segmentation and object detection approaches reuse neural network architectures designed for image classification as the backbone.
One major challenge, though, is that ImageNet pre-training of the search space representation incurs a huge computational cost.
In this paper, we propose a Fast Neural Network Adaptation (FNA) method, which can adapt both the architecture and parameters of a seed network.
arXiv Detail & Related papers (2020-01-08T13:45:15Z)
- Scalable NAS with Factorizable Architectural Parameters [102.51428615447703]
Neural Architecture Search (NAS) is an emerging topic in machine learning and computer vision.
This paper presents a scalable algorithm by factorizing a large set of candidate operators into smaller subspaces.
With a small increase in search costs and no extra costs in re-training, we find interesting architectures that were not explored before.
arXiv Detail & Related papers (2019-12-31T10:26:56Z)
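As referenced in the edge-TCN entry above, a search over TCN-specific architectural parameters typically means choosing dilations, kernel sizes, and channel widths. Below is a minimal sketch of such a search; the candidate space, the exhaustive enumeration, and the parameter-count scoring proxy are illustrative assumptions, not that paper's actual algorithm (real NAS tools score candidates by validation accuracy and hardware cost).

```python
# Illustrative enumeration of a tiny TCN search space over the
# TCN-specific hyperparameters: channel width, kernel size, dilation schedule.
# The objective here (smallest parameter count) is a stand-in for
# accuracy/latency objectives used by real NAS tools.
import itertools
import torch
import torch.nn as nn

def build_tcn(in_ch, width, kernel_size, dilations, num_classes):
    layers, ch = [], in_ch
    for d in dilations:
        layers += [nn.Conv1d(ch, width, kernel_size,
                             padding=(kernel_size - 1) * d // 2, dilation=d),
                   nn.ReLU()]
        ch = width
    return nn.Sequential(*layers,
                         nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                         nn.Linear(width, num_classes))

# Candidate space: the architectural parameters peculiar to TCNs.
space = list(itertools.product(
    [16, 32, 64],               # channel width
    [3, 5, 7],                  # kernel size
    [(1, 2, 4), (1, 2, 4, 8)])) # dilation schedule

best = None
for width, k, dils in space:
    model = build_tcn(in_ch=8, width=width, kernel_size=k,
                      dilations=dils, num_classes=5)
    n_params = sum(p.numel() for p in model.parameters())
    out = model(torch.randn(1, 8, 128))   # sanity-check the candidate runs
    assert out.shape == (1, 5)
    if best is None or n_params < best[0]:
        best = (n_params, width, k, dils)

print("selected (params, width, kernel, dilations):", best)
```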