Dynamic Network selection for the Object Detection task: why it matters and what we (didn't) achieve
- URL: http://arxiv.org/abs/2105.13279v1
- Date: Thu, 27 May 2021 16:25:18 GMT
- Title: Dynamic Network selection for the Object Detection task: why it matters and what we (didn't) achieve
- Authors: Emanuele Vitali and Anton Lokhmotov and Gianluca Palermo
- Abstract summary: We show the potential benefit of a dynamic auto-tuning approach for the inference process in the Deep Neural Network context.
We benchmarked different neural networks to find the optimal detector for the well-known COCO 17 database.
- Score: 1.8467344387137519
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we want to show the potential benefit of a dynamic auto-tuning
approach for the inference process in the Deep Neural Network (DNN) context,
tackling the object detection challenge. We benchmarked different neural
networks to find the optimal detector for the well-known COCO 17 database, and
we demonstrate that even if we only consider the quality of the prediction
there is not a single optimal network. This is even more evident if we also
consider the time to solution as a metric to evaluate, and then select, the
most suitable network. This opens up the possibility of an adaptive
methodology that switches among different object detection networks according
to run-time requirements (e.g. maximum quality subject to a time-to-solution
constraint).
Moreover, by developing an ad hoc oracle, we demonstrated that an additional
proactive methodology could provide even greater benefits, allowing us to
select the best network among the available ones given some characteristics of
the processed image. To exploit this method, we need to identify image
features that can be used to steer the decision towards the most promising
network. Despite the optimization opportunity that has been identified, we
were not able to find a predictor function that validates this attempt, either
by adopting classical image features or by using a DNN classifier.
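The reactive part of the methodology described in the abstract (pick the most accurate detector that still meets a time-to-solution constraint) can be sketched as follows. The detector names, mAP values, and latencies below are illustrative placeholders, not figures from the paper:

```python
# Minimal sketch of dynamic network selection under a time budget:
# choose the highest-quality detector whose profiled latency fits the
# constraint, falling back to the fastest one if none fits.
# All profile numbers are hypothetical, for illustration only.

PROFILES = {
    # name: (mAP on COCO 17, mean latency in ms) -- assumed values
    "ssd_mobilenet": (0.23, 30.0),
    "yolo_v3":       (0.33, 55.0),
    "faster_rcnn":   (0.37, 110.0),
}

def select_detector(time_budget_ms: float) -> str:
    """Return the highest-mAP detector meeting the latency budget;
    if no detector fits, return the fastest one."""
    feasible = [(quality, name)
                for name, (quality, latency) in PROFILES.items()
                if latency <= time_budget_ms]
    if feasible:
        return max(feasible)[1]          # best quality among feasible
    return min(PROFILES, key=lambda n: PROFILES[n][1])  # fastest overall

# With a 60 ms budget this policy skips faster_rcnn and picks yolo_v3;
# with an infeasibly tight budget it degrades to ssd_mobilenet.
```

The proactive (oracle-like) variant would additionally condition the choice on features of the input image, which is precisely the predictor the paper reports being unable to learn.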
Related papers
- Unveiling the Power of Sparse Neural Networks for Feature Selection [60.50319755984697]
Sparse Neural Networks (SNNs) have emerged as powerful tools for efficient feature selection.
We show that feature selection with SNNs trained with dynamic sparse training (DST) algorithms can achieve, on average, more than 50% memory and 55% FLOPs reduction.
arXiv Detail & Related papers (2024-08-08T16:48:33Z)
- Detection-Rate-Emphasized Multi-objective Evolutionary Feature Selection for Network Intrusion Detection [21.104686670216445]
We propose DR-MOFS to model the feature selection problem in network intrusion detection as a three-objective optimization problem.
In most cases, the proposed method can outperform previous methods, i.e., lead to fewer features, higher accuracy and detection rate.
arXiv Detail & Related papers (2024-06-13T14:42:17Z)
- Effective Subset Selection Through The Lens of Neural Network Pruning [31.43307762723943]
It is important to select the data to be annotated wisely, which is known as the subset selection problem.
We investigate the relationship between subset selection and neural network pruning, which is more widely studied.
We propose utilizing the norm criterion of neural network features to improve subset selection methods.
arXiv Detail & Related papers (2024-06-03T08:12:32Z)
- A distributed neural network architecture for dynamic sensor selection with application to bandwidth-constrained body-sensor networks [53.022158485867536]
We propose a dynamic sensor selection approach for deep neural networks (DNNs).
It is able to derive an optimal sensor subset selection for each specific input sample instead of a fixed selection for the entire dataset.
We show how we can use this dynamic selection to increase the lifetime of a wireless sensor network (WSN) by imposing constraints on how often each node is allowed to transmit.
arXiv Detail & Related papers (2023-08-16T14:04:50Z)
- R(Det)^2: Randomized Decision Routing for Object Detection [64.48369663018376]
We propose a novel approach to combine decision trees and deep neural networks in an end-to-end learning manner for object detection.
To facilitate effective learning, we propose randomized decision routing with node selective and associative losses.
We name this approach the randomized decision routing for object detection, abbreviated as R(Det)$^2$.
arXiv Detail & Related papers (2022-04-02T07:54:58Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Joint Learning of Neural Transfer and Architecture Adaptation for Image Recognition [77.95361323613147]
Current state-of-the-art visual recognition systems rely on pretraining a neural network on a large-scale dataset and finetuning the network weights on a smaller dataset.
In this work, we show that dynamically adapting the network architecture to each domain task, along with weight finetuning, improves both efficiency and effectiveness.
Our method can be easily generalized to an unsupervised paradigm by replacing supernet training with self-supervised learning in the source domain tasks and performing linear evaluation in the downstream tasks.
arXiv Detail & Related papers (2021-03-31T08:15:17Z)
- Learning Deep Interleaved Networks with Asymmetric Co-Attention for Image Restoration [65.11022516031463]
We present a deep interleaved network (DIN) that learns how information at different states should be combined for high-quality (HQ) image reconstruction.
In this paper, we propose asymmetric co-attention (AsyCA) which is attached at each interleaved node to model the feature dependencies.
Our presented DIN can be trained end-to-end and applied to various image restoration tasks.
arXiv Detail & Related papers (2020-10-29T15:32:00Z)
- A Progressive Sub-Network Searching Framework for Dynamic Inference [33.93841415140311]
We propose a progressive sub-net searching framework embedded with several effective techniques, including trainable noise ranking, channel grouping, fine-tuning threshold setting, and sub-net re-selection.
Our proposed method achieves much better dynamic inference accuracy than the prior popular Universally-Slimmable-Network, by up to 4.4% (2.3% on average) on the ImageNet dataset with the same model size.
arXiv Detail & Related papers (2020-09-11T22:56:02Z)
- Robust Image Matching By Dynamic Feature Selection [17.3367710589782]
Estimating dense correspondences between images is a long-standing image understanding task.
Recent works introduce convolutional neural networks (CNNs) to extract high-level feature maps and find correspondences through feature matching.
We generate robust features by dynamically selecting features at different scales.
arXiv Detail & Related papers (2020-08-13T06:21:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.