IBURD: Image Blending for Underwater Robotic Detection
- URL: http://arxiv.org/abs/2502.17706v1
- Date: Mon, 24 Feb 2025 22:56:49 GMT
- Title: IBURD: Image Blending for Underwater Robotic Detection
- Authors: Jungseok Hong, Sakshi Singh, Junaed Sattar
- Abstract summary: IBURD generates both images of underwater debris and their pixel-level annotations. IBURD is able to robustly blend transparent objects into arbitrary backgrounds.
- Score: 17.217395753087157
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present an image blending pipeline, \textit{IBURD}, that creates realistic synthetic images to assist in the training of deep detectors for use on underwater autonomous vehicles (AUVs) for marine debris detection tasks. Specifically, IBURD generates both images of underwater debris and their pixel-level annotations, using source images of debris objects, their annotations, and target background images of marine environments. With Poisson editing and style transfer techniques, IBURD is even able to robustly blend transparent objects into arbitrary backgrounds and automatically adjust the style of blended images using the blurriness metric of target background images. These generated images of marine debris in actual underwater backgrounds address the data scarcity and data variety problems faced by deep-learned vision algorithms in challenging underwater conditions, and can enable the use of AUVs for environmental cleanup missions. Both quantitative and robotic evaluations of IBURD demonstrate the efficacy of the proposed approach for robotic detection of marine debris.
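A minimal sketch of the two mechanisms the abstract names, Poisson editing and blur matching against the target background, is given below. It assumes OpenCV and uses the variance of the Laplacian as the blurriness metric; the function names, kernel-size loop, and thresholds are illustrative stand-ins rather than IBURD's actual implementation, and the style-transfer stage is omitted for brevity.

```python
import cv2
import numpy as np

def blurriness(image: np.ndarray) -> float:
    """Variance of the Laplacian: a standard sharpness proxy (lower = blurrier)."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def blend_debris(obj_bgr: np.ndarray, obj_mask: np.ndarray,
                 background: np.ndarray, center: tuple[int, int]) -> np.ndarray:
    """Poisson-blend a debris crop into an underwater background, after
    roughly matching the crop's blur level to the background's."""
    # Blur the object until it is no sharper than the background, so a crisp
    # source crop does not stand out against a turbid scene.
    obj = obj_bgr.copy()
    target = blurriness(background)
    ksize = 3
    while blurriness(obj) > target and ksize <= 15:
        obj = cv2.GaussianBlur(obj_bgr, (ksize, ksize), 0)
        ksize += 2

    # Poisson (gradient-domain) cloning blends gradients rather than raw pixels,
    # which helps low-texture or transparent objects merge into the scene.
    return cv2.seamlessClone(obj, background, obj_mask, center, cv2.NORMAL_CLONE)
```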
Related papers
- Learning Underwater Active Perception in Simulation [51.205673783866146]
Turbidity can jeopardise the whole mission as it may prevent correct visual documentation of the inspected structures.
Previous works have introduced methods to adapt to turbidity and backscattering.
We propose a simple yet efficient approach to enable high-quality image acquisition of assets in a broad range of water conditions.
arXiv Detail & Related papers (2025-04-23T06:48:38Z)
- Image-Based Relocalization and Alignment for Long-Term Monitoring of Dynamic Underwater Environments [57.59857784298534]
We propose an integrated pipeline that combines Visual Place Recognition (VPR), feature matching, and image segmentation on video-derived images.
This method enables robust identification of revisited areas, estimation of rigid transformations, and downstream analysis of ecosystem changes.
arXiv Detail & Related papers (2025-03-06T05:13:19Z)
- Color Information-Based Automated Mask Generation for Detecting Underwater Atypical Glare Areas [0.0]
This study introduces a breath bubble detection algorithm that utilizes unsupervised K-means clustering.
The proposed method fuses color data and relative spatial coordinates from underwater images, employs CLAHE to mitigate noise, and subsequently performs pixel clustering to isolate reflective regions.
Experimental results demonstrate that the algorithm can effectively detect regions corresponding to breath bubbles in underwater images, and that the combined use of RGB, LAB, and HSV color spaces significantly enhances detection accuracy.
arXiv Detail & Related papers (2025-02-23T11:17:20Z)
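For the color-clustering pipeline summarized in the entry above, a minimal sketch (assuming OpenCV and scikit-learn; the function name, cluster count, and brightest-cluster heuristic are illustrative assumptions, not details from the paper) could apply CLAHE to the luminance channel, fuse RGB, LAB, and HSV values with pixel coordinates, and keep the brightest K-means cluster as the glare/bubble mask:

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

def bubble_mask(image_bgr: np.ndarray, n_clusters: int = 4) -> np.ndarray:
    """Cluster pixels on fused color + position features and return a binary
    mask of the brightest cluster, a rough proxy for glare / breath bubbles."""
    h, w = image_bgr.shape[:2]

    # CLAHE on the L channel to mitigate lighting noise before clustering.
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    eq_bgr = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
    hsv = cv2.cvtColor(eq_bgr, cv2.COLOR_BGR2HSV)

    # Fuse color values from three color spaces with normalized coordinates.
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.concatenate(
        [eq_bgr.reshape(-1, 3), lab.reshape(-1, 3), hsv.reshape(-1, 3),
         (xs / w).reshape(-1, 1) * 255, (ys / h).reshape(-1, 1) * 255],
        axis=1).astype(np.float32)

    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(feats)

    # Keep the cluster with the highest mean luminance as the reflective region.
    luminance = lab[:, :, 0].reshape(-1)
    bright = max(range(n_clusters), key=lambda k: luminance[labels == k].mean())
    return (labels.reshape(h, w) == bright).astype(np.uint8) * 255
```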
- Enhancing Marine Debris Acoustic Monitoring by Optical Flow-Based Motion Vector Analysis [0.0]
The paper proposes an optical flow-based method for marine debris monitoring. The proposed method was validated through experiments conducted in a circulating water tank.
arXiv Detail & Related papers (2024-12-28T08:55:37Z)
- UW-SDF: Exploiting Hybrid Geometric Priors for Neural SDF Reconstruction from Underwater Multi-view Monocular Images [63.32490897641344]
We propose a framework for reconstructing target objects from multi-view underwater images based on neural SDF.
We introduce hybrid geometric priors to optimize the reconstruction process, markedly enhancing the quality and efficiency of neural SDF reconstruction.
arXiv Detail & Related papers (2024-10-10T16:33:56Z)
- A deep learning approach for marine snow synthesis and removal [55.86191108738564]
This paper proposes a novel method to reduce the marine snow interference using deep learning techniques.
We first synthesize realistic marine snow samples by training a Generative Adversarial Network (GAN) model.
We then train a U-Net model to perform marine snow removal as an image-to-image translation task.
arXiv Detail & Related papers (2023-11-27T07:19:41Z)
- An Efficient Detection and Control System for Underwater Docking using Machine Learning and Realistic Simulation: A Comprehensive Approach [5.039813366558306]
This work compares different deep-learning architectures to perform underwater docking detection and classification.
A Generative Adversarial Network (GAN) is used to perform image-to-image translation, converting Gazebo simulation images into underwater-looking images.
Results show a 20% improvement in high-turbidity scenarios, regardless of underwater currents.
arXiv Detail & Related papers (2023-11-02T18:10:20Z)
- Learning Heavily-Degraded Prior for Underwater Object Detection [59.5084433933765]
This paper seeks transferable prior knowledge from detector-friendly images.
It is based on the statistical observation that the heavily degraded regions of detector-friendly (DFUI) images and underwater images have evident feature distribution gaps.
Our method, with higher speed and fewer parameters, still performs better than transformer-based detectors.
arXiv Detail & Related papers (2023-08-24T12:32:46Z)
- Unpaired Overwater Image Defogging Using Prior Map Guided CycleGAN [60.257791714663725]
We propose a Prior map Guided CycleGAN (PG-CycleGAN) for defogging of images with overwater scenes.
The proposed method outperforms the state-of-the-art supervised, semi-supervised, and unsupervised defogging approaches.
arXiv Detail & Related papers (2022-12-23T03:00:28Z)
- GAMMA: Generative Augmentation for Attentive Marine Debris Detection [0.0]
We propose an efficient and generative augmentation approach to address the scarcity of underwater debris data for visual detection.
We use CycleGAN as a data augmentation technique to convert openly available, abundant images of terrestrial plastic into underwater-style images.
We also propose a novel architecture for underwater debris detection using an attention mechanism.
arXiv Detail & Related papers (2022-12-07T16:30:51Z)
- A Multi-purpose Real Haze Benchmark with Quantifiable Haze Levels and Ground Truth [61.90504318229845]
This paper introduces the first paired real image benchmark dataset with hazy and haze-free images, and in-situ haze density measurements.
This dataset was produced in a controlled environment with professional smoke generating machines that covered the entire scene.
A subset of this dataset has been used for the Object Detection in Haze Track of CVPR UG2 2022 challenge.
arXiv Detail & Related papers (2022-06-13T19:14:06Z)
- Underwater Image Restoration via Contrastive Learning and a Real-world Dataset [59.35766392100753]
We present a novel method for underwater image restoration based on an unsupervised image-to-image translation framework.
Our proposed method leverages contrastive learning and generative adversarial networks to maximize the mutual information between raw and restored images.
arXiv Detail & Related papers (2021-06-20T16:06:26Z)
- A Benchmark dataset for both underwater image enhancement and underwater object detection [34.25890702670983]
We provide a large-scale underwater object detection dataset with both bounding box annotations and high-quality reference images.
The OUC dataset provides a platform to comprehensively study the influence of underwater image enhancement algorithms on the underwater object detection task.
arXiv Detail & Related papers (2020-06-29T03:12:50Z)