M-FasterSeg: An Efficient Semantic Segmentation Network Based on Neural
Architecture Search
- URL: http://arxiv.org/abs/2112.07918v1
- Date: Wed, 15 Dec 2021 06:46:55 GMT
- Authors: Huiyu Kuang
- Abstract summary: This paper proposes an improved semantic segmentation network structure based on deep learning.
First, Neural Architecture Search (NAS) is used to find a semantic segmentation network with multiple resolution branches.
During the search, a self-attention module is combined with the candidate structures to adjust them; the networks found for the different branches are then merged into a fast semantic segmentation network.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image semantic segmentation technology is one of the key technologies for
intelligent systems to understand natural scenes. As one of the important
research directions in the field of visual intelligence, this technology has
broad application scenarios in the fields of mobile robots, drones, smart
driving, and smart security. However, in practical mobile-robot applications,
problems such as inaccurate semantic label prediction and loss of edge
information between segmented objects and the background may occur. This paper
proposes an improved semantic segmentation network structure that combines a
self-attention neural network with neural architecture search methods. First,
Neural Architecture Search (NAS) is used to find a semantic segmentation
network with multiple resolution branches. During the search, a self-attention
module is combined with the candidate structures to adjust them; the networks
found for the different branches are then merged into a fast semantic
segmentation network, and an input image is passed through this network to
obtain the final prediction. Experimental results on the Cityscapes dataset
show that the algorithm reaches 69.8% accuracy at a segmentation speed of 48
frames per second. It achieves a good balance between real-time performance
and accuracy, improves edge segmentation, and performs well in complex scenes.
Its robustness makes it suitable for practical applications.
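The self-attention operation used to adjust the searched branches can be sketched as follows. This is a minimal NumPy illustration of scaled dot-product self-attention over a flattened feature map, not the paper's actual module; all shapes, names, and the random projection matrices are illustrative assumptions (a trained module would learn them).

```python
import numpy as np

def self_attention(x, d_k):
    """Scaled dot-product self-attention over a sequence of feature vectors.
    x has shape (n, d): n spatial positions, d channels. The projection
    matrices here are random placeholders standing in for learned weights."""
    n, d = x.shape
    rng = np.random.default_rng(0)
    w_q = rng.standard_normal((d, d_k)) / np.sqrt(d)
    w_k = rng.standard_normal((d, d_k)) / np.sqrt(d)
    w_v = rng.standard_normal((d, d_k)) / np.sqrt(d)
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(d_k)                 # (n, n) pairwise similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over positions
    return weights @ v                              # (n, d_k) attended features

# Flattened feature map: 16 spatial positions, 8 channels (toy sizes)
features = np.random.default_rng(1).standard_normal((16, 8))
out = self_attention(features, d_k=8)
print(out.shape)  # (16, 8)
```

Because every output position is a weighted sum over all input positions, such a module lets each branch aggregate global context, which is why it is a natural fit for refining multi-resolution segmentation features.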
Related papers
- Image segmentation with traveling waves in an exactly solvable recurrent
neural network [71.74150501418039]
We show that a recurrent neural network can effectively divide an image into groups according to a scene's structural characteristics.
We present a precise description of the mechanism underlying object segmentation in this network.
We then demonstrate a simple algorithm for object segmentation that generalizes across inputs ranging from simple geometric objects in grayscale images to natural images.
arXiv Detail & Related papers (2023-11-28T16:46:44Z)
- Real-Time Semantic Segmentation: A Brief Survey & Comparative Study in
Remote Sensing [13.278362721781978]
This paper begins with a summary of the fundamental compression methods for designing efficient deep neural networks.
We examine several seminal efficient deep learning methods, placing them in a taxonomy based on the network architecture design approach.
We evaluate the quality and efficiency of some existing efficient deep neural networks on a publicly available remote sensing semantic segmentation benchmark dataset.
arXiv Detail & Related papers (2023-09-12T08:30:48Z)
- OFA$^2$: A Multi-Objective Perspective for the Once-for-All Neural
Architecture Search [79.36688444492405]
Once-for-All (OFA) is a Neural Architecture Search (NAS) framework designed to address the problem of searching for efficient architectures for devices with different resource constraints.
We aim to go one step further in the search for efficiency by explicitly framing the search stage as a multi-objective optimization problem.
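A multi-objective search stage ultimately selects the non-dominated (Pareto-optimal) architectures. A minimal sketch of that selection step, with made-up architecture names and accuracy/latency scores purely for illustration:

```python
# Candidate architectures scored on two objectives:
# maximize accuracy, minimize latency (ms). All names and values are
# illustrative, not taken from any paper.
candidates = {
    "arch_a": (0.70, 12.0),
    "arch_b": (0.74, 20.0),
    "arch_c": (0.69, 25.0),   # worse than arch_a on both objectives
    "arch_d": (0.78, 35.0),
}

def dominates(p, q):
    """p dominates q if it is no worse on both objectives and strictly
    better on at least one (higher accuracy, lower latency)."""
    acc_p, lat_p = p
    acc_q, lat_q = q
    return (acc_p >= acc_q and lat_p <= lat_q) and (acc_p > acc_q or lat_p < lat_q)

# Keep only architectures that no other candidate dominates
pareto = {
    name for name, score in candidates.items()
    if not any(dominates(other, score)
               for other_name, other in candidates.items() if other_name != name)
}
print(sorted(pareto))  # ['arch_a', 'arch_b', 'arch_d']
```

Instead of collapsing accuracy and latency into one weighted score, the Pareto set preserves the whole trade-off front, letting a deployment target pick the architecture matching its resource budget.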
arXiv Detail & Related papers (2023-03-23T21:30:29Z)
- Firefly Neural Architecture Descent: a General Approach for Growing
Neural Networks [50.684661759340145]
Firefly neural architecture descent is a general framework for progressively and dynamically growing neural networks.
We show that firefly descent can flexibly grow networks both wider and deeper, and can be applied to learn accurate but resource-efficient neural architectures.
In particular, it learns networks that are smaller in size but have higher average accuracy than those learned by the state-of-the-art methods.
arXiv Detail & Related papers (2021-02-17T04:47:18Z)
- ABCNet: Attentive Bilateral Contextual Network for Efficient Semantic
Segmentation of Fine-Resolution Remote Sensing Images [5.753245638190626]
Semantic segmentation of remotely sensed images plays a crucial role in precision agriculture, environmental protection, and economic assessment.
Due to the complicated information caused by the increased spatial resolution, state-of-the-art deep learning algorithms normally utilize complex network architectures for segmentation.
We propose the Attentive Bilateral Contextual Network (ABCNet), a double-branch convolutional neural network (CNN) with markedly lower computational cost than cutting-edge algorithms.
arXiv Detail & Related papers (2021-02-04T10:43:08Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
- FNA++: Fast Network Adaptation via Parameter Remapping and Architecture
Search [35.61441231491448]
We propose a Fast Network Adaptation (FNA++) method, which can adapt both the architecture and parameters of a seed network.
In our experiments, we apply FNA++ on MobileNetV2 to obtain new networks for semantic segmentation, object detection, and human pose estimation.
The total computation cost of FNA++ is significantly less than SOTA segmentation and detection NAS approaches.
arXiv Detail & Related papers (2020-06-21T10:03:34Z)
- Depthwise Non-local Module for Fast Salient Object Detection Using a
Single Thread [136.2224792151324]
We propose a new deep learning algorithm for fast salient object detection.
The proposed algorithm achieves competitive accuracy and high inference efficiency simultaneously with a single CPU thread.
arXiv Detail & Related papers (2020-01-22T15:23:48Z)
- Fast Neural Network Adaptation via Parameter Remapping and Architecture
Search [35.61441231491448]
Deep neural networks achieve remarkable performance in many computer vision tasks.
Most state-of-the-art (SOTA) semantic segmentation and object detection approaches reuse neural network architectures designed for image classification as the backbone.
One major challenge, though, is that ImageNet pre-training of the search space representation incurs a huge computational cost.
In this paper, we propose a Fast Neural Network Adaptation (FNA) method, which can adapt both the architecture and parameters of a seed network.
arXiv Detail & Related papers (2020-01-08T13:45:15Z)
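The parameter-remapping idea behind FNA/FNA++ can be illustrated with a toy sketch: copy the overlapping region of a seed layer's weights into a target layer of a different shape so the adapted network starts from pre-trained values rather than from scratch. This is a simplified shape-matching illustration under that assumption, not the actual FNA++ remapping scheme.

```python
import numpy as np

def remap(seed_w, target_shape):
    """Copy the overlapping region of a seed layer's weight tensor into a
    target layer of a different shape; entries with no seed counterpart are
    zero-initialized. A simplified sketch, not the FNA++ algorithm."""
    out = np.zeros(target_shape, dtype=seed_w.dtype)
    # Slice each axis to the smaller of the two extents
    slices = tuple(slice(0, min(s, t)) for s, t in zip(seed_w.shape, target_shape))
    out[slices] = seed_w[slices]
    return out

seed = np.arange(12.0).reshape(3, 4)   # seed layer: 3x4 weight matrix
wider = remap(seed, (3, 6))            # widened layer keeps all seed columns
narrow = remap(seed, (2, 3))           # shrunk layer keeps the top-left block
print(wider.shape, narrow.shape)       # (3, 6) (2, 3)
```

The appeal of this style of adaptation is that the expensive pre-training is paid once for the seed network and then reused across target architectures, which is why the total cost can come in well below NAS approaches that pre-train each search space from scratch.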
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.