CHASE: Robust Visual Tracking via Cell-Level Differentiable Neural
Architecture Search
- URL: http://arxiv.org/abs/2107.03463v1
- Date: Fri, 2 Jul 2021 15:16:45 GMT
- Title: CHASE: Robust Visual Tracking via Cell-Level Differentiable Neural
Architecture Search
- Authors: Seyed Mojtaba Marvasti-Zadeh, Javad Khaghani, Li Cheng, Hossein
Ghanei-Yakhdan, Shohreh Kasaei
- Abstract summary: We propose a novel cell-level differentiable architecture search mechanism to automate the network design of the tracking module.
The proposed approach is simple, efficient, and requires no stacking of a series of modules to construct a network.
Our approach is easily incorporated into existing trackers, which is empirically validated using different differentiable architecture search-based methods and tracking objectives.
- Score: 14.702573109803307
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A strong visual object tracker nowadays relies on its well-crafted modules,
which typically consist of manually-designed network architectures to deliver
high-quality tracking results. Not surprisingly, the manual design process
becomes a particularly challenging barrier, as it demands sufficient prior
experience, enormous effort, intuition, and perhaps some good luck. Meanwhile,
neural architecture search has been gaining ground in practical applications
such as image segmentation, as a promising method for automating the search
for feasible network structures. In this work, we propose a novel cell-level
differentiable architecture search mechanism to automate the network design of
the tracking module, aiming to adapt backbone features to the objective of a
tracking network during offline training. The proposed approach is simple,
efficient, and requires no stacking of a series of modules to construct a
network. Our approach is easily incorporated into existing trackers, which is
empirically validated using different differentiable architecture search-based
methods and tracking objectives. Extensive experimental evaluations
demonstrate the superior performance of our approach on five commonly-used
benchmarks. Meanwhile, our automated searching process takes 41 (18) hours for
the second- (first-) order DARTS method on the TrackingNet dataset.
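The core idea behind the DARTS-style mechanism the abstract refers to is a continuous relaxation: each edge of a searched cell computes a softmax-weighted mixture of candidate operations, so the architecture choice becomes differentiable and can be optimized jointly with the network during offline training. The following is a minimal illustrative sketch of that relaxation only; the candidate operations, names, and shapes here are hypothetical stand-ins, not the actual CHASE search space.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over architecture parameters."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Toy candidate operations on a cell edge (illustrative stand-ins for the
# real search space, e.g. convolutions, skip connections, and zero ops).
CANDIDATE_OPS = {
    "identity": lambda x: x,
    "zero":     lambda x: np.zeros_like(x),
    "scale2x":  lambda x: 2.0 * x,   # stand-in for a learned operation
}

def mixed_op(x, alphas):
    """Continuous relaxation: softmax-weighted sum of all candidate ops.
    Because the weights are differentiable in `alphas`, the architecture
    parameters can be trained by gradient descent alongside the network."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, CANDIDATE_OPS.values()))

def discretize(alphas):
    """After search, keep only the strongest operation on the edge."""
    names = list(CANDIDATE_OPS)
    return names[int(np.argmax(alphas))]

x = np.array([1.0, -2.0, 3.0])
alphas = np.array([0.1, -1.0, 2.0])  # learned during search; favors "scale2x"
y = mixed_op(x, alphas)
chosen = discretize(alphas)  # "scale2x"
```

In the full method, one such mixed operation sits on every edge of the cell, and the second-order variant of DARTS additionally unrolls a step of the weight update when computing gradients with respect to `alphas`, which explains the longer 41-hour search time quoted above.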
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been successfully applied to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - Exploring Dynamic Transformer for Efficient Object Tracking [58.120191254379854]
We propose DyTrack, a dynamic transformer framework for efficient tracking.
DyTrack automatically learns to configure proper reasoning routes for various inputs, gaining better utilization of the available computational budget.
Experiments on multiple benchmarks demonstrate that DyTrack achieves promising speed-precision trade-offs with only a single model.
arXiv Detail & Related papers (2024-03-26T12:31:58Z) - Multi-conditioned Graph Diffusion for Neural Architecture Search [8.290336491323796]
We present a graph diffusion-based NAS approach that uses discrete conditional graph diffusion processes to generate high-performing neural network architectures.
We show promising results on six standard benchmarks, yielding novel and unique architectures at a fast speed.
arXiv Detail & Related papers (2024-03-09T21:45:31Z) - Masked Autoencoders Are Robust Neural Architecture Search Learners [14.965550562292476]
We propose a novel NAS framework based on Masked Autoencoders (MAE) that eliminates the need for labeled data during the search process.
By replacing the supervised learning objective with an image reconstruction task, our approach enables the robust discovery of network architectures.
arXiv Detail & Related papers (2023-11-20T13:45:21Z) - FocusFormer: Focusing on What We Need via Architecture Sampler [45.150346855368]
Vision Transformers (ViTs) have underpinned the recent breakthroughs in computer vision.
One-shot neural architecture search decouples the supernet training and architecture specialization for diverse deployment scenarios.
We devise a simple yet effective method, called FocusFormer, to bridge such a gap.
arXiv Detail & Related papers (2022-08-23T10:42:56Z) - Surrogate-assisted Multi-objective Neural Architecture Search for
Real-time Semantic Segmentation [11.866947846619064]
Neural architecture search (NAS) has emerged as a promising avenue toward automating the design of architectures.
We propose a surrogate-assisted multi-objective method to address the challenges of applying NAS to semantic segmentation.
Our method can identify architectures significantly outperforming existing state-of-the-art architectures designed both manually by human experts and automatically by other NAS methods.
arXiv Detail & Related papers (2022-08-14T10:18:51Z) - Correlation-Aware Deep Tracking [83.51092789908677]
We propose a novel target-dependent feature network inspired by the self-/cross-attention scheme.
Our network deeply embeds cross-image feature correlation in multiple layers of the feature network.
Our model can be flexibly pre-trained on abundant unpaired images, leading to notably faster convergence than the existing methods.
arXiv Detail & Related papers (2022-03-03T11:53:54Z) - Conceptual Expansion Neural Architecture Search (CENAS) [1.3464152928754485]
We present an approach called Conceptual Expansion Neural Architecture Search (CENAS)
It combines a sample-efficient, computational creativity-inspired transfer learning approach with neural architecture search.
It finds models faster than naive architecture search via transferring existing weights to approximate the parameters of the new model.
arXiv Detail & Related papers (2021-10-07T02:29:26Z) - A Design Space Study for LISTA and Beyond [79.76740811464597]
In recent years, great success has been witnessed in building problem-specific deep networks from unrolling iterative algorithms.
This paper revisits the role of unrolling as a design approach for deep networks, asking to what extent the resulting special architecture is superior and whether better ones can be found.
Using LISTA for sparse recovery as a representative example, we conduct the first thorough design space study for the unrolled models.
arXiv Detail & Related papers (2021-04-08T23:01:52Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z) - AutoOD: Automated Outlier Detection via Curiosity-guided Search and
Self-imitation Learning [72.99415402575886]
Outlier detection is an important data mining task with numerous practical applications.
We propose AutoOD, an automated outlier detection framework, which aims to search for an optimal neural network model.
Experimental results on various real-world benchmark datasets demonstrate that the deep model identified by AutoOD achieves the best performance.
arXiv Detail & Related papers (2020-06-19T18:57:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.