Channel-Aware Distillation Transformer for Depth Estimation on Nano Drones
- URL: http://arxiv.org/abs/2303.10386v1
- Date: Sat, 18 Mar 2023 10:45:34 GMT
- Title: Channel-Aware Distillation Transformer for Depth Estimation on Nano Drones
- Authors: Ning Zhang, Francesco Nex, George Vosselman, Norman Kerle
- Abstract summary: This paper presents a lightweight CNN depth estimation network deployed on nano drones for obstacle avoidance.
Inspired by Knowledge Distillation (KD), a Channel-Aware Distillation Transformer (CADiT) is proposed to help the small network learn from a larger network.
The proposed method is validated on the KITTI dataset and tested on a Crazyflie nano drone with an ultra-low-power GAP8 microprocessor.
- Score: 9.967643080731683
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Autonomous navigation of drones using computer vision has achieved promising
performance. Nano-sized drones based on edge computing platforms are
lightweight, flexible, and cheap, thus suitable for exploring narrow spaces.
However, due to their extremely limited computing power and storage, vision
algorithms designed for high-performance GPU platforms cannot be used for nano
drones. To address this issue, this paper presents a lightweight CNN depth
estimation network deployed on nano drones for obstacle avoidance. Inspired by
Knowledge Distillation (KD), a Channel-Aware Distillation Transformer (CADiT)
is proposed to help the small network learn knowledge from a larger network.
The proposed method is validated on the KITTI dataset and tested on a
Crazyflie nano drone with an ultra-low-power GAP8 microprocessor.
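The core idea summarized above is distillation along the channel dimension: each channel of the teacher's feature map is treated as a distribution over spatial positions that the student should match. Below is a minimal PyTorch sketch of a generic channel-wise KD loss in that spirit; the function name, temperature, feature shapes, and the 1x1 channel projection are illustrative assumptions, not the paper's actual CADiT formulation.

```python
# Minimal sketch of a generic channel-wise knowledge-distillation loss.
# This illustrates the channel-aware KD idea only; it is NOT the paper's
# CADiT module. Shapes, temperature, and the projection are assumptions.
import torch
import torch.nn.functional as F

def channel_kd_loss(student_feat: torch.Tensor,
                    teacher_feat: torch.Tensor,
                    tau: float = 4.0) -> torch.Tensor:
    """KL divergence between per-channel spatial distributions.

    student_feat, teacher_feat: (B, C, H, W) feature maps with matching C.
    """
    b, c, h, w = teacher_feat.shape
    # Turn each channel's activation map into a distribution over pixels.
    t = F.softmax(teacher_feat.reshape(b, c, h * w) / tau, dim=-1)
    s = F.log_softmax(student_feat.reshape(b, c, h * w) / tau, dim=-1)
    # KL(teacher || student), scaled by tau^2 and averaged over channels.
    return F.kl_div(s, t, reduction="batchmean") * (tau ** 2) / c

# Usage: a 1x1 conv can project the student's channels to the teacher's width
# when the two networks disagree on channel count (sizes here are made up).
proj = torch.nn.Conv2d(32, 64, kernel_size=1)
student_feat = torch.randn(2, 32, 24, 80)
teacher_feat = torch.randn(2, 64, 24, 80)
loss = channel_kd_loss(proj(student_feat), teacher_feat)
```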
Related papers
- Training on the Fly: On-device Self-supervised Learning aboard Nano-drones within 20 mW [52.280742520586756]
Miniaturized cyber-physical systems (CPSes) powered by tiny machine learning (TinyML), such as nano-drones, are becoming an increasingly attractive technology.
Simple electronics make these CPSes inexpensive, but strongly limit the computational, memory, and sensing resources available on board.
We present a novel on-device fine-tuning approach that relies only on the limited ultra-low power resources available aboard nano-drones.
arXiv Detail & Related papers (2024-08-06T13:11:36Z)
- Tiny-PULP-Dronets: Squeezing Neural Networks for Faster and Lighter Inference on Multi-Tasking Autonomous Nano-Drones [12.96119439129453]
This work moves from PULP-Dronet, a state-of-the-art convolutional neural network for autonomous navigation on nano-drones, to Tiny-PULP-Dronet, a novel methodology that squeezes model size by more than one order of magnitude.
This massive reduction paves the way towards affordable multi-tasking on nano-drones, a fundamental requirement for achieving high-level intelligence.
arXiv Detail & Related papers (2024-07-02T16:24:57Z)
- Optimized Deployment of Deep Neural Networks for Visual Pose Estimation on Nano-drones [9.806742394395322]
Miniaturized unmanned aerial vehicles (UAVs) are gaining popularity due to their small size, enabling new tasks such as indoor navigation or people monitoring.
This work proposes a new automatic optimization pipeline for visual pose estimation tasks using Deep Neural Networks (DNNs).
Our results improve on the state of the art, reducing inference latency by up to 3.22x at iso-error.
arXiv Detail & Related papers (2024-02-23T11:35:57Z)
- High-throughput Visual Nano-drone to Nano-drone Relative Localization using Onboard Fully Convolutional Networks [51.23613834703353]
Relative drone-to-drone localization is a fundamental building block for any swarm operations.
We present a vertically integrated system based on a novel vision-based fully convolutional neural network (FCNN).
Our model improves R-squared from 32% to 47% on the horizontal image coordinate and from 18% to 55% on the vertical image coordinate, on a real-world dataset of 30k images (see the R-squared sketch after this entry).
arXiv Detail & Related papers (2024-02-21T12:34:31Z)
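For reference, the R-squared values quoted above are coefficients of determination computed independently per image coordinate. The sketch below shows the standard computation; the arrays are synthetic placeholders, not the authors' data or evaluation code.

```python
# R^2 (coefficient of determination) per image coordinate, as reported above.
# Arrays are synthetic placeholders, not the paper's 30k-image dataset.
import numpy as np

def r_squared(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

u_true = np.random.rand(30000)                  # horizontal coordinate, ground truth
u_pred = u_true + 0.1 * np.random.randn(30000)  # hypothetical model predictions
print(f"R^2 (horizontal): {100 * r_squared(u_true, u_pred):.1f}%")
```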
- Ultra-low Power Deep Learning-based Monocular Relative Localization Onboard Nano-quadrotors [64.68349896377629]
This work presents a novel autonomous end-to-end system that addresses monocular relative localization of two peer nano-drones through deep neural networks (DNNs).
To cope with the ultra-constrained nano-drone platform, we propose a vertically-integrated framework, including dataset augmentation, quantization, and system optimizations (a generic quantization sketch follows this entry).
Experimental results show that our DNN can precisely localize a 10cm-sized target nano-drone at distances of up to 2m using only low-resolution monochrome images.
arXiv Detail & Related papers (2023-03-03T14:14:08Z)
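The quantization step mentioned above is, in these papers, handled by GAP8-specific deployment toolchains. As a framework-level stand-in only, the PyTorch sketch below applies dynamic int8 quantization to a toy model to show the kind of compression involved; it is not the authors' pipeline.

```python
# Generic illustration of int8 quantization as a model-compression step.
# This is NOT the GAP8 toolchain used in these papers; it is a stand-in
# showing roughly 4x weight compression on a toy fully connected model.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 4))

# Dynamic quantization: Linear weights are stored as int8, while activations
# stay fp32 and are quantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
out = quantized(torch.randn(1, 256))  # inference works as with the fp32 model
```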
- Deep Neural Network Architecture Search for Accurate Visual Pose Estimation aboard Nano-UAVs [69.19616451596342]
Miniaturized unmanned aerial vehicles (UAVs) are an emerging and trending topic.
We leverage a novel neural architecture search (NAS) technique to automatically identify several convolutional neural networks (CNNs) for a visual pose estimation task.
Our results improve on the state of the art, reducing the in-field control error by 32% while achieving a real-time onboard inference rate of 10Hz@10mW and 50Hz@90mW.
arXiv Detail & Related papers (2023-03-03T14:02:09Z)
- TransVisDrone: Spatio-Temporal Transformer for Vision-based Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z)
- NanoFlowNet: Real-time Dense Optical Flow on a Nano Quadcopter [11.715961583058226]
Nano quadcopters are small, agile, and cheap platforms that are well suited for deployment in narrow, cluttered environments.
Recent machine learning developments promise high-performance perception at low latency.
Dedicated edge computing hardware has the potential to augment the processing capabilities of these limited devices.
We present NanoFlowNet, a lightweight convolutional neural network for real-time dense optical flow estimation on edge computing hardware.
arXiv Detail & Related papers (2022-09-14T20:35:51Z)
- DepthNet Nano: A Highly Compact Self-Normalizing Neural Network for Monocular Depth Estimation [76.90627702089357]
DepthNet Nano is a compact deep neural network for monocular depth estimation designed using a human-machine collaborative design strategy.
The proposed DepthNet Nano possesses a highly efficient network architecture while achieving performance comparable to state-of-the-art networks.
arXiv Detail & Related papers (2020-04-17T00:41:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.