Leveraging 6DoF Pose Foundation Models For Mapping Marine Sediment Burial
- URL: http://arxiv.org/abs/2506.10386v1
- Date: Thu, 12 Jun 2025 06:21:00 GMT
- Title: Leveraging 6DoF Pose Foundation Models For Mapping Marine Sediment Burial
- Authors: Jerry Yan, Chinmay Talegaonkar, Nicholas Antipa, Eric Terrill, Sophia Merrifield
- Abstract summary: This work introduces a computer vision pipeline, called PoseIDON, to estimate six degrees of freedom object pose and the orientation of the surrounding seafloor from ROV video. The method is validated using footage of 54 objects, including barrels and munitions, recorded at a historic ocean dumpsite in the San Pedro Basin. This approach enables scalable, non-invasive mapping of seafloor burial and supports environmental assessment at contaminated sites.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The burial state of anthropogenic objects on the seafloor provides insight into localized sedimentation dynamics and is also critical for assessing ecological risks, potential pollutant transport, and the viability of recovery or mitigation strategies for hazardous materials such as munitions. Accurate burial depth estimation from remote imagery remains difficult due to partial occlusion, poor visibility, and object degradation. This work introduces a computer vision pipeline, called PoseIDON, which combines deep foundation model features with multiview photogrammetry to estimate six degrees of freedom object pose and the orientation of the surrounding seafloor from ROV video. Burial depth is inferred by aligning CAD models of the objects with observed imagery and fitting a local planar approximation of the seafloor. The method is validated using footage of 54 objects, including barrels and munitions, recorded at a historic ocean dumpsite in the San Pedro Basin. The model achieves a mean burial depth error of approximately 10 centimeters and resolves spatial burial patterns that reflect underlying sediment transport processes. This approach enables scalable, non-invasive mapping of seafloor burial and supports environmental assessment at contaminated sites.
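As a rough illustration of the burial-depth step the abstract describes (fit a local plane to the reconstructed seafloor, then measure how far the posed CAD model sits below it), the following Python sketch combines a least-squares plane fit with an estimated 6DoF pose. Function names, the plane-fit choice, and the synthetic data are assumptions for illustration only, not the authors' released code.

```python
import numpy as np

def fit_seafloor_plane(points):
    """Least-squares plane through Nx3 seafloor points; returns (unit normal, centroid)."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value of the centered
    # points is the direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    if normal[2] < 0:          # orient the normal upward, assuming +z points up
        normal = -normal
    return normal, centroid

def burial_depth(cad_vertices, R, t, normal, centroid):
    """Depth of the deepest posed CAD vertex below the fitted seafloor plane."""
    world_vertices = cad_vertices @ R.T + t        # apply the estimated 6DoF pose
    signed = (world_vertices - centroid) @ normal  # signed distance to the plane
    return max(0.0, -signed.min())                 # 0 if the object is fully exposed

# Synthetic example: a noisy, nearly flat seafloor and a box-shaped stand-in
# object whose pose places part of it below the plane.
rng = np.random.default_rng(0)
seafloor = np.c_[rng.uniform(-1, 1, (200, 2)), rng.normal(0, 0.01, 200)]
normal, centroid = fit_seafloor_plane(seafloor)
object_vertices = rng.uniform(-0.3, 0.3, (500, 3))   # stand-in for CAD mesh vertices
R, t = np.eye(3), np.array([0.0, 0.0, -0.1])          # hypothetical pose estimate
print(f"estimated burial depth: {burial_depth(object_vertices, R, t, normal, centroid):.2f} m")
```

The same pattern would apply per object: reconstruct local seafloor points around it, fit the plane, and report how far the aligned model extends below that plane.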
Related papers
- Tree-Mamba: A Tree-Aware Mamba for Underwater Monocular Depth Estimation [85.17735565146106]
Underwater Monocular Depth Estimation (UMDE) is a critical task that aims to estimate high-precision depth maps from underwater degraded images. We develop a novel tree-aware Mamba method, dubbed Tree-Mamba, for estimating accurate monocular depth maps from underwater degraded images. We construct an underwater depth estimation benchmark (called BlueDepth), which consists of 38,162 underwater image pairs with reliable depth labels.
arXiv Detail & Related papers (2025-07-10T12:10:51Z)
- ReconMOST: Multi-Layer Sea Temperature Reconstruction with Observations-Guided Diffusion
ReconMOST is a data-driven guided diffusion model framework for multi-layer sea temperature reconstruction. Our method extends ML-based SST reconstruction to a global, multi-layer setting, handling over 92.5% missing data.
arXiv Detail & Related papers (2025-06-12T06:27:22Z)
- Image-Based Relocalization and Alignment for Long-Term Monitoring of Dynamic Underwater Environments [57.59857784298534]
We propose an integrated pipeline that combines Visual Place Recognition (VPR), feature matching, and image segmentation on video-derived images. This method enables robust identification of revisited areas, estimation of rigid transformations, and downstream analysis of ecosystem changes.
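The "estimation of rigid transformations" step such a pipeline implies can be illustrated with a generic Kabsch/Procrustes alignment of matched keypoints. This is a minimal sketch on synthetic 2D points, not the authors' implementation.

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t so that R @ src_i + t ~= dst_i."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)            # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    D = np.diag([1.0] * (H.shape[0] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: matched features rotated by 30 degrees and shifted.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, (50, 2))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
obs = pts @ R_true.T + np.array([5.0, -2.0])
R_est, t_est = rigid_align(pts, obs)
print(np.allclose(R_est, R_true), t_est)           # True, approx. [5.0, -2.0]
```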
arXiv Detail & Related papers (2025-03-06T05:13:19Z)
- Enhancing Marine Debris Acoustic Monitoring by Optical Flow-Based Motion Vector Analysis [0.0]
The paper proposes an optical flow-based method for marine debris monitoring. The proposed method was validated through experiments conducted in a circulating water tank.
arXiv Detail & Related papers (2024-12-28T08:55:37Z)
- ScaleDepth: Decomposing Metric Depth Estimation into Scale Prediction and Relative Depth Estimation [62.600382533322325]
We propose a novel monocular depth estimation method called ScaleDepth.
Our method decomposes metric depth into scene scale and relative depth, and predicts them through a semantic-aware scale prediction module.
Our method achieves metric depth estimation for both indoor and outdoor scenes in a unified framework.
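A toy sketch of the decomposition this summary describes: one head predicts a normalized relative depth map, another predicts a single positive scene scale, and metric depth is their product. Module choices, shapes, and names below are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class ScaleAndRelativeDepth(nn.Module):
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.relative_head = nn.Sequential(            # per-pixel relative depth in (0, 1)
            nn.Conv2d(feat_dim, 1, kernel_size=1), nn.Sigmoid()
        )
        self.scale_head = nn.Sequential(               # one positive scalar per image
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(feat_dim, 1), nn.Softplus()
        )

    def forward(self, feats):                          # feats: (B, C, H, W) backbone features
        rel = self.relative_head(feats)                # (B, 1, H, W), scale-free
        scale = self.scale_head(feats).view(-1, 1, 1, 1)
        return scale * rel                             # metric depth map

feats = torch.randn(2, 64, 32, 32)
print(ScaleAndRelativeDepth()(feats).shape)            # torch.Size([2, 1, 32, 32])
```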
arXiv Detail & Related papers (2024-07-11T05:11:56Z)
- Deep Learning Innovations for Underwater Waste Detection: An In-Depth Analysis [0.0]
This paper conducts a comprehensive review of state-of-the-art architectures and existing datasets to establish a baseline for submerged waste and trash detection.
The primary goal is to establish a benchmark for object localization techniques that can be leveraged by advanced underwater sensors and autonomous underwater vehicles.
arXiv Detail & Related papers (2024-05-28T15:51:18Z)
- Large-scale Detection of Marine Debris in Coastal Areas with Sentinel-2 [3.6842260407632903]
Efforts to quantify marine pollution are often conducted with sparse and expensive beach surveys.
Satellite data of coastal areas is readily available and can be leveraged to detect aggregations of marine debris containing plastic litter.
We present a detector for marine debris built on a deep segmentation model that outputs a probability for marine debris at the pixel level.
arXiv Detail & Related papers (2023-07-05T17:38:48Z)
- The Drunkard's Odometry: Estimating Camera Motion in Deforming Scenes [79.00228778543553]
This dataset is the first large set of exploratory camera trajectories with ground truth inside 3D scenes.
Simulations in realistic 3D buildings let us obtain a vast amount of data and ground truth labels.
We present a novel deformable odometry method, dubbed the Drunkard's Odometry, which decomposes optical flow estimates into rigid-body camera motion and non-rigid scene deformations.
arXiv Detail & Related papers (2023-06-29T13:09:31Z)
- Optimized Custom Dataset for Efficient Detection of Underwater Trash [3.2634122554914002]
This paper proposes the development of a custom dataset and an efficient detection approach for submerged marine debris.
The dataset encompasses diverse underwater environments and incorporates annotations for precise labeling of debris instances.
Ultimately, the primary objective of this custom dataset is to enhance the diversity of litter instances and improve their detection accuracy in deep submerged environments by leveraging state-of-the-art deep learning architectures.
arXiv Detail & Related papers (2023-05-25T20:28:04Z)
- 6D Camera Relocalization in Visually Ambiguous Extreme Environments [79.68352435957266]
We propose a novel method to reliably estimate the pose of a camera given a sequence of images acquired in extreme environments such as deep seas or extraterrestrial terrains.
Our method achieves comparable performance with state-of-the-art methods on the indoor benchmark (7-Scenes dataset) using only 20% training data.
arXiv Detail & Related papers (2022-07-13T16:40:02Z)
- DeepPlastic: A Novel Approach to Detecting Epipelagic Bound Plastic Using Deep Visual Models [0.0]
Currently, the most common monitoring method to quantify floating plastic requires the use of a manta trawl.
The need for physical removal before analysis incurs high costs and requires intensive labor, preventing scalable deployment of a real-time marine plastic monitoring service.
This study presents a highly scalable workflow that uses images captured within the epipelagic layer of the ocean as input.
arXiv Detail & Related papers (2021-05-05T06:04:26Z)
- Generating Physically-Consistent Satellite Imagery for Climate Visualizations [53.61991820941501]
We train a generative adversarial network to create synthetic satellite imagery of future flooding and reforestation events.
A pure deep learning-based model can generate flood visualizations but hallucinates floods at locations that were not susceptible to flooding.
We publish our code and dataset for segmentation guided image-to-image translation in Earth observation.
arXiv Detail & Related papers (2021-04-10T15:00:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.