Millimeter Wave Drones with Cameras: Computer Vision Aided Wireless Beam Prediction
- URL: http://arxiv.org/abs/2211.07569v1
- Date: Mon, 14 Nov 2022 17:42:16 GMT
- Title: Millimeter Wave Drones with Cameras: Computer Vision Aided Wireless Beam Prediction
- Authors: Gouranga Charan, Andrew Hredzak, and Ahmed Alkhateeb
- Abstract summary: Millimeter wave (mmWave) and terahertz (THz) drones have the potential to enable several futuristic applications.
These drones need to deploy large antenna arrays and use narrow directive beams to maintain a sufficient link budget.
This paper proposes a vision-aided machine learning-based approach that leverages visual data collected from cameras installed on the drones.
- Score: 8.919072533905517
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Millimeter wave (mmWave) and terahertz (THz) drones have the potential to
enable several futuristic applications such as coverage extension, enhanced
security monitoring, and disaster management. However, these drones need to
deploy large antenna arrays and use narrow directive beams to maintain a
sufficient link budget. The large beam training overhead associated with these
arrays makes adjusting these narrow beams challenging for highly-mobile drones.
To address these challenges, this paper proposes a vision-aided machine
learning-based approach that leverages visual data collected from cameras
installed on the drones to enable fast and accurate beam prediction. Further,
to facilitate the evaluation of the proposed solution, we build a synthetic
drone communication dataset consisting of co-existing wireless and visual data.
The proposed vision-aided solution achieves a top-1 beam prediction accuracy
of roughly 91% and close to 100% top-3 accuracy. These results
highlight the efficacy of the proposed solution towards enabling highly mobile
mmWave/THz drone communication.
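The abstract frames beam selection as picking from a predefined beam codebook, with performance reported as top-1 and top-3 accuracy. As a minimal, hypothetical sketch (not the authors' code: the codebook size of 64 and the random logits are stand-ins for a real vision model's outputs), the metric can be computed as follows:

```python
# Hypothetical sketch of top-k beam prediction evaluation.
# A vision model (not shown) would score every beam in a fixed codebook
# from a camera image; random logits stand in for those scores here.
import numpy as np

rng = np.random.default_rng(0)
num_samples, num_beams = 1000, 64                    # assumed codebook size
logits = rng.normal(size=(num_samples, num_beams))   # stand-in model outputs
true_beam = rng.integers(0, num_beams, size=num_samples)

def topk_accuracy(scores, labels, k):
    """Fraction of samples whose true beam is among the k highest-scoring beams."""
    topk = np.argsort(-scores, axis=1)[:, :k]
    return float(np.mean([labels[i] in topk[i] for i in range(len(labels))]))

print("top-1:", topk_accuracy(logits, true_beam, 1))
print("top-3:", topk_accuracy(logits, true_beam, 3))
```

In practice the logits would come from a neural network fed with the drone camera frames; only the top-k bookkeeping is shown here.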
Related papers
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z)
- Vehicle Cameras Guide mmWave Beams: Approach and Real-World V2V Demonstration [13.117333069558812]
Accurately aligning millimeter-wave (mmWave) and terahertz (THz) narrow beams is essential to satisfy reliability and high data rates of 5G and beyond wireless communication systems.
We develop a deep learning solution for V2V scenarios to predict future beams using images from a 360 camera attached to the vehicle.
arXiv Detail & Related papers (2023-08-20T20:43:11Z)
- TransVisDrone: Spatio-Temporal Transformer for Vision-based Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z)
- Neural Myerson Auction for Truthful and Energy-Efficient Autonomous Aerial Data Delivery [9.986880167690364]
We introduce a data delivery drone that transfers collected surveillance data under harsh communication conditions.
The paper proposes Myerson auction-based asynchronous data delivery in an aerial distributed data platform; a background sketch of the Myerson mechanism appears after this list.
arXiv Detail & Related papers (2021-12-29T12:14:34Z)
- DronePose: The identification, segmentation, and orientation detection of drones via neural networks [3.161871054978445]
We present a CNN using a decision tree and ensemble structure to fully characterise drones in flight.
Our system determines the drone type, orientation (in terms of pitch, roll, and yaw), and performs segmentation to classify different body parts.
We also provide a computer model for the rapid generation of large quantities of accurately labelled photo-realistic training data.
arXiv Detail & Related papers (2021-12-10T12:34:53Z)
- Dogfight: Detecting Drones from Drones Videos [58.158988162743825]
This paper attempts to address the problem of detecting drones from other flying drones.
The erratic movement of the source and target drones, small size, arbitrary shape, large intensity variations, and occlusion make this problem quite challenging.
To handle this, instead of using region-proposal based methods, we propose to use a two-stage segmentation-based approach.
arXiv Detail & Related papers (2021-03-31T17:43:31Z)
- Vision-based Drone Flocking in Outdoor Environments [9.184987303791292]
This letter proposes a vision-based detection and tracking algorithm for drone swarms.
We employ a convolutional neural network to detect and localize nearby agents onboard the quadcopters in real-time.
We show that the drones can safely navigate in an outdoor environment despite substantial background clutter and difficult lighting conditions.
arXiv Detail & Related papers (2020-12-02T14:44:40Z)
- Real-Time Drone Detection and Tracking With Visible, Thermal and Acoustic Sensors [66.4525391417921]
A thermal infrared camera is shown to be a feasible solution to the drone detection task.
The detector performance as a function of the sensor-to-target distance is also investigated.
A novel video dataset containing 650 annotated infrared and visible videos of drones, birds, airplanes and helicopters is also presented.
arXiv Detail & Related papers (2020-07-14T23:06:42Z)
- University-1652: A Multi-view Multi-source Benchmark for Drone-based Geo-localization [87.74121935246937]
We introduce a new multi-view benchmark for drone-based geo-localization, named University-1652.
University-1652 contains data from three platforms, i.e., synthetic drones, satellites and ground cameras of 1,652 university buildings around the world.
Experiments show that University-1652 helps the model to learn the viewpoint-invariant features and also has good generalization ability in the real-world scenario.
arXiv Detail & Related papers (2020-02-27T15:24:15Z)
- Detection and Tracking Meet Drones Challenge [131.31749447313197]
This paper presents a review of object detection and tracking datasets and benchmarks, and discusses the challenges of collecting large-scale drone-based object detection and tracking datasets with manual annotations.
We describe our VisDrone dataset, which is captured over various urban/suburban areas of 14 different cities across China from North to South.
We provide a detailed analysis of the current state of the field of large-scale object detection and tracking on drones, and conclude the challenge as well as propose future directions.
arXiv Detail & Related papers (2020-01-16T00:11:56Z)
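As background for the Neural Myerson Auction entry above, here is a hedged textbook sketch of a single-item Myerson auction under Uniform[0,1] valuations; it is not the listed paper's neural, asynchronous variant, and all function names and numbers are illustrative.

```python
# Background sketch (hypothetical): the textbook single-item Myerson auction.
# For i.i.d. valuations with CDF F and density f, the revenue-optimal auction
# is a second-price auction with reserve r* solving
#   psi(r*) = r* - (1 - F(r*)) / f(r*) = 0.
import numpy as np

def virtual_value(v):
    # Uniform[0,1] valuations: F(v) = v, f(v) = 1, so psi(v) = 2v - 1.
    return v - (1.0 - v)

def reserve_price(psi, lo=0.0, hi=1.0, tol=1e-9):
    """Bisection for the root of the (increasing) virtual value psi."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if psi(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)

def run_auction(bids, reserve):
    """Second-price auction with reserve: returns (winner index, payment)."""
    bids = np.asarray(bids, dtype=float)
    winner = int(np.argmax(bids))
    if bids[winner] < reserve:
        return None, 0.0                        # item stays unsold
    others = np.delete(bids, winner)
    payment = max(reserve, float(others.max())) if others.size else reserve
    return winner, payment

r = reserve_price(virtual_value)                # -> 0.5 for Uniform[0,1]
print(run_auction([0.3, 0.8, 0.6], r))          # (1, 0.6): runner-up bid sets price
print(run_auction([0.2, 0.6], r))               # (1, ~0.5): reserve binds
```

The listed paper applies this mechanism idea to truthful, energy-efficient drone data delivery; the sketch only shows the classical single-item case.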