GAP9Shield: A 150GOPS AI-capable Ultra-low Power Module for Vision and Ranging Applications on Nano-drones
- URL: http://arxiv.org/abs/2407.13706v1
- Date: Thu, 27 Jun 2024 11:41:39 GMT
- Authors: Hanna Müller, Victor Kartsch, Luca Benini
- Abstract summary: We present the GAP9Shield, a nano-drone-compatible module powered by the GAP9.
The system includes a 5MP OV5647 camera for high-definition imaging, a WiFi-BLE NINA module, and a 5D VL53L1-based ranging subsystem.
In comparison with similarly targeted state-of-the-art systems, GAP9Shield provides a 20% higher sample rate (RGB images) while offering a 20% weight reduction.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The evolution of AI and digital signal processing technologies, combined with affordable energy-efficient processors, has propelled the development of both hardware and software for drone applications. Nano-drones, which fit into the palm of the hand, are suitable for indoor environments and safe for human interaction; however, they often fail to deliver the required performance for complex tasks due to the lack of hardware providing sufficient sensing and computing performance. Addressing this gap, we present the GAP9Shield, a nano-drone-compatible module powered by the GAP9, a 150GOPS-capable SoC. The system also includes a 5MP OV5647 camera for high-definition imaging, a WiFi-BLE NINA module, and a 5D VL53L1-based ranging subsystem, which enhances obstacle avoidance capabilities. In comparison with similarly targeted state-of-the-art systems, GAP9Shield provides a 20% higher sample rate (RGB images) while offering a 20% weight reduction. In this paper, we also highlight the energy efficiency and processing power capabilities of GAP9 for object detection (YOLO), localization, and mapping, which can run within a power envelope below 100 mW and at low latency (as low as 17 ms for object detection), demonstrating the transformative potential of GAP9 for the new generation of nano-drone applications.
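Taken together, the abstract's figures imply some useful derived metrics. The back-of-the-envelope sketch below uses only the numbers quoted above and assumes the sub-100 mW envelope is an upper bound on power draw rather than a measured figure:
```python
# Derived efficiency metrics from the figures quoted in the abstract.
# Assumption: the "below 100 mW" envelope is an upper bound on power draw.
PEAK_THROUGHPUT_GOPS = 150   # GAP9 peak compute, per the abstract
POWER_BOUND_W = 0.100        # 100 mW taken as the bound
YOLO_LATENCY_S = 0.017       # 17 ms object-detection latency

# Peak compute efficiency: 150e9 op/s / 0.1 W = 1.5 TOPS/W
tops_per_watt = PEAK_THROUGHPUT_GOPS / (POWER_BOUND_W * 1000)

# Worst-case energy per YOLO inference: 0.1 W * 0.017 s = 1.7 mJ
energy_per_frame_mj = POWER_BOUND_W * YOLO_LATENCY_S * 1000

print(f"peak efficiency: {tops_per_watt:.2f} TOPS/W")        # 1.50 TOPS/W
print(f"energy per detection: {energy_per_frame_mj:.2f} mJ")  # 1.70 mJ
```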
Related papers
- Training on the Fly: On-device Self-supervised Learning aboard Nano-drones within 20 mW
Miniaturized cyber-physical systems (CPSes) powered by tiny machine learning (TinyML), such as nano-drones, are becoming an increasingly attractive technology.
Simple electronics make these CPSes inexpensive, but strongly limit the computational, memory, and sensing resources available on board.
We present a novel on-device fine-tuning approach that relies only on the limited ultra-low power resources available aboard nano-drones.
arXiv Detail & Related papers (2024-08-06T13:11:36Z)
- Tiny-PULP-Dronets: Squeezing Neural Networks for Faster and Lighter Inference on Multi-Tasking Autonomous Nano-Drones
This work moves from PULP-Dronet, a state-of-the-art convolutional neural network for autonomous navigation on nano-drones, to Tiny-PULP-Dronet, a novel methodology that squeezes model size by more than one order of magnitude.
This massive reduction paves the way towards affordable multi-tasking on nano-drones, a fundamental requirement for achieving high-level intelligence.
arXiv Detail & Related papers (2024-07-02T16:24:57Z)
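The summary above does not say how the order-of-magnitude squeeze is achieved. Purely as an illustration of one common lever (not necessarily Tiny-PULP-Dronet's actual method), a uniform width multiplier shows how quickly convolutional parameter counts fall:
```python
# Illustration only: a uniform width multiplier alpha shrinks a conv
# layer's weight count roughly as alpha^2. This is a generic squeezing
# lever, not necessarily the one used by Tiny-PULP-Dronet.
def conv_params(c_in: int, c_out: int, k: int = 3) -> int:
    """Weight count of a k x k convolution (biases ignored)."""
    return c_in * c_out * k * k

baseline = conv_params(64, 128)   # full-size layer
squeezed = conv_params(16, 32)    # alpha = 0.25 on both channel counts
print(baseline / squeezed)        # 16.0 -> over one order of magnitude
```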
- A Deep Learning-based Pest Insect Monitoring System for Ultra-low Power Pocket-sized Drones
Smart farming and precision agriculture represent game-changing technologies for efficient and sustainable agribusiness.
Miniaturized palm-sized drones can act as flexible smart sensors inspecting crops, looking for early signs of potential pest outbreaks.
This work presents a novel vertically integrated solution featuring two ultra-low power System-on-Chips.
arXiv Detail & Related papers (2024-04-02T10:39:54Z)
- High-throughput Visual Nano-drone to Nano-drone Relative Localization using Onboard Fully Convolutional Networks
Relative drone-to-drone localization is a fundamental building block for any swarm operations.
We present a vertically integrated system based on a novel vision-based fully convolutional neural network (FCNN).
Our model improves R-squared from 32% to 47% on the horizontal image coordinate and from 18% to 55% on the vertical image coordinate, on a real-world dataset of 30k images.
arXiv Detail & Related papers (2024-02-21T12:34:31Z)
- A Heterogeneous RISC-V based SoC for Secure Nano-UAV Navigation
Nano-UAVs face significant power and payload constraints while requiring advanced computing capabilities.
We present Shaheen, a 9 mm², 200 mW system-on-a-chip (SoC).
It integrates a Linux-capable RV64 core, compliant with the v1.0 ratified Hypervisor extension, along with a low-cost and low-power memory controller.
At the same time, it integrates a fully programmable energy- and area-efficient multi-core cluster of RV32 cores optimized for general-purpose DSP.
arXiv Detail & Related papers (2024-01-07T16:03:47Z)
- Flexible and Fully Quantized Ultra-Lightweight TinyissimoYOLO for Ultra-Low-Power Edge Systems
We deploy variants of TinyissimoYOLO on state-of-the-art ultra-low-power extreme edge platforms.
This paper presents a comparison of latency, energy efficiency, and the platforms' ability to efficiently parallelize the workload.
arXiv Detail & Related papers (2023-07-12T08:26:27Z)
- Ultra-low Power Deep Learning-based Monocular Relative Localization Onboard Nano-quadrotors
This work presents a novel autonomous end-to-end system that addresses the monocular relative localization, through deep neural networks (DNNs), of two peer nano-drones.
To cope with the ultra-constrained nano-drone platform, we propose a vertically-integrated framework, including dataset augmentation, quantization, and system optimizations.
Experimental results show that our DNN can precisely localize a 10 cm target nano-drone using only low-resolution monochrome images, at distances of up to 2 m.
arXiv Detail & Related papers (2023-03-03T14:14:08Z)
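The entry above lists quantization among its system optimizations. The following is a minimal, generic sketch of symmetric int8 quantization, given purely for illustration; the paper's exact scheme is not stated here and may differ:
```python
import numpy as np

# Generic symmetric int8 quantization, shown only to illustrate the
# "quantization" step mentioned above; the paper's scheme may differ.
def quantize_int8(w: np.ndarray) -> tuple[np.ndarray, float]:
    scale = float(np.abs(w).max()) / 127.0              # map max |w| to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
print(float(np.abs(w - dequantize(q, s)).max()))        # small rounding error
```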
- Deep Neural Network Architecture Search for Accurate Visual Pose Estimation aboard Nano-UAVs
Miniaturized unmanned aerial vehicles (UAVs) are an emerging and trending topic.
We leverage a novel neural architecture search (NAS) technique to automatically identify several convolutional neural networks (CNNs) for a visual pose estimation task.
Our results improve the state of the art by reducing the in-field control error by 32% while achieving real-time onboard inference rates of 10 Hz @ 10 mW and 50 Hz @ 90 mW.
arXiv Detail & Related papers (2023-03-03T14:02:09Z)
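The two operating points quoted in the entry above pin down the energy cost per inference; the sketch below assumes the stated power is the total draw at the stated rate:
```python
# Energy per inference implied by the quoted operating points.
# Assumption: the stated power is the total draw at the stated rate.
for rate_hz, power_mw in [(10, 10), (50, 90)]:
    energy_mj = power_mw / rate_hz    # mW / Hz = mJ per inference
    print(f"{rate_hz} Hz @ {power_mw} mW -> {energy_mj:.1f} mJ/inference")
# 10 Hz @ 10 mW -> 1.0 mJ/inference
# 50 Hz @ 90 mW -> 1.8 mJ/inference
```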
- Robustifying the Deployment of tinyML Models for Autonomous mini-vehicles
We propose a closed-loop learning flow for autonomous driving mini-vehicles that includes the target environment in-the-loop.
We leverage a family of tinyCNNs to control the mini-vehicle, which learn in the target environment by imitating a computer vision algorithm, i.e., the expert.
When running the family of CNNs, our solution outperforms any other implementation on the STM32L4 and k64f (Cortex-M4), reducing the latency by over 13x and the energy consumption by 92%.
arXiv Detail & Related papers (2020-07-01T07:54:26Z)
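A quick consistency note on the two figures in the entry above: since energy is roughly power times latency, a 13x latency cut at constant power already accounts for about a 92% energy saving, matching the quoted reduction:
```python
# Sanity check: energy ~ power * latency, so at constant power a 13x
# speedup alone yields 1 - 1/13 ~ 92.3% less energy per run -- consistent
# with the 92% energy reduction quoted above.
speedup = 13
print(f"{1 - 1 / speedup:.1%}")   # 92.3%
```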
- DepthNet Nano: A Highly Compact Self-Normalizing Neural Network for Monocular Depth Estimation
DepthNet Nano is a compact deep neural network for monocular depth estimation designed using a human-machine collaborative design strategy.
The proposed DepthNet Nano possesses a highly efficient network architecture, while still achieving comparable performance with state-of-the-art networks.
arXiv Detail & Related papers (2020-04-17T00:41:35Z)