Vision-Based Autonomous Navigation for Unmanned Surface Vessel in
Extreme Marine Conditions
- URL: http://arxiv.org/abs/2308.04283v1
- Date: Tue, 8 Aug 2023 14:25:13 GMT
- Title: Vision-Based Autonomous Navigation for Unmanned Surface Vessel in
Extreme Marine Conditions
- Authors: Muhayyuddin Ahmed, Ahsan Baidar Bakht, Taimur Hassan, Waseem Akram,
Ahmed Humais, Lakmal Seneviratne, Shaoming He, Defu Lin, and Irfan Hussain
- Abstract summary: This paper presents an autonomous vision-based navigation framework for tracking target objects in extreme marine conditions.
The proposed framework has been thoroughly tested in simulation under extremely reduced visibility due to sandstorms and fog.
The results are compared with state-of-the-art de-hazing methods across the benchmarked MBZIRC simulation dataset.
- Score: 2.8983738640808645
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Visual perception is an important component for autonomous navigation of
unmanned surface vessels (USV), particularly for the tasks related to
autonomous inspection and tracking. These tasks involve vision-based navigation
techniques to identify the target for navigation. Reduced visibility under
extreme weather conditions in marine environments makes it difficult for
vision-based approaches to work properly. To overcome these issues, this paper
presents an autonomous vision-based navigation framework for tracking target
objects in extreme marine conditions. The proposed framework consists of an
integrated perception pipeline that uses a generative adversarial network (GAN)
to remove noise and highlight the object features before passing them to the
object detector (i.e., YOLOv5). The detected visual features are then used by
the USV to track the target. The proposed framework has been thoroughly tested
in simulation under extremely reduced visibility due to sandstorms and fog. The
results are compared with state-of-the-art de-hazing methods across the
benchmarked MBZIRC simulation dataset, on which the proposed scheme has
outperformed the existing methods across various metrics.
Related papers
- Benchmarking Vision-Based Object Tracking for USVs in Complex Maritime Environments [0.8796261172196743]
Vision-based target tracking is crucial for unmanned surface vehicles.
Real-time tracking in maritime environments is challenging due to dynamic camera movement, low visibility, and scale variation.
This study proposes a vision-guided object-tracking framework for USVs.
arXiv Detail & Related papers (2024-12-10T10:35:17Z)
- A Cross-Scene Benchmark for Open-World Drone Active Tracking [54.235808061746525]
Drone Visual Active Tracking aims to autonomously follow a target object by controlling the motion system based on visual observations.
We propose a unified cross-scene cross-domain benchmark for open-world drone active tracking called DAT.
We also propose a reinforcement learning-based drone tracking method called R-VAT.
arXiv Detail & Related papers (2024-12-01T09:37:46Z)
- OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising [49.86409475232849]
Trajectory prediction is fundamental in computer vision and autonomous driving.
Existing approaches in this field often assume precise and complete observational data.
We present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique.
arXiv Detail & Related papers (2024-04-02T18:30:29Z)
- Angle Robustness Unmanned Aerial Vehicle Navigation in GNSS-Denied Scenarios [66.05091704671503]
We present a novel angle navigation paradigm to deal with flight deviation in point-to-point navigation tasks.
We also propose a model that includes the Adaptive Feature Enhance Module, Cross-knowledge Attention-guided Module and Robust Task-oriented Head Module.
arXiv Detail & Related papers (2024-02-04T08:41:20Z)
- Deep Learning-Based Object Detection in Maritime Unmanned Aerial Vehicle Imagery: Review and Experimental Comparisons [10.75221614844458]
We first briefly summarize four challenges for object detection on maritime UAVs, i.e. object feature diversity, device limitation, maritime environment variability, and dataset scarcity.
Next, we review the UAV aerial image/video datasets and propose a maritime UAV aerial dataset named MS2ship for ship detection.
arXiv Detail & Related papers (2023-11-14T07:20:38Z)
- Efficient Real-time Smoke Filtration with 3D LiDAR for Search and Rescue with Autonomous Heterogeneous Robotic Systems [56.838297900091426]
Smoke and dust affect the performance of any mobile robotic platform due to their reliance on onboard perception systems.
This paper proposes a novel modular computation filtration pipeline based on intensity and spatial information.
arXiv Detail & Related papers (2023-08-14T16:48:57Z)
- Performance Study of YOLOv5 and Faster R-CNN for Autonomous Navigation around Non-Cooperative Targets [0.0]
This paper discusses how the combination of cameras and machine learning algorithms can achieve the relative navigation task.
The performance of two deep learning-based object detection algorithms, Faster Region-based Convolutional Neural Networks (Faster R-CNN) and You Only Look Once (YOLOv5), is tested.
The paper discusses the path to implementing the feature recognition algorithms and towards integrating them into the spacecraft Guidance Navigation and Control system.
arXiv Detail & Related papers (2023-01-22T04:53:38Z)
- Safe Vessel Navigation Visually Aided by Autonomous Unmanned Aerial Vehicles in Congested Harbors and Waterways [9.270928705464193]
This work is the first attempt to detect and estimate distances to unknown objects from long-range visual data captured with conventional RGB cameras and auxiliary absolute positioning systems (e.g., GPS).
The simulation results illustrate the accuracy and efficacy of the proposed method for visually aided navigation of vessels assisted by UAV.
arXiv Detail & Related papers (2021-08-09T08:15:17Z)
- Perceiving Traffic from Aerial Images [86.994032967469]
We propose an object detection method called Butterfly Detector that is tailored to detect objects in aerial images.
We evaluate our Butterfly Detector on two publicly available UAV datasets (UAVDT and VisDrone 2019) and show that it outperforms previous state-of-the-art methods while remaining real-time.
arXiv Detail & Related papers (2020-09-16T11:37:43Z)
- Object Goal Navigation using Goal-Oriented Semantic Exploration [98.14078233526476]
This work studies the problem of object goal navigation which involves navigating to an instance of the given object category in unseen environments.
We propose a modular system called 'Goal-Oriented Semantic Exploration', which builds an episodic semantic map and uses it to explore the environment efficiently.
arXiv Detail & Related papers (2020-07-01T17:52:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.