Robot Goes Fishing: Rapid, High-Resolution Biological Hotspot Mapping in
Coral Reefs with Vision-Guided Autonomous Underwater Vehicles
- URL: http://arxiv.org/abs/2305.02330v3
- Date: Thu, 1 Feb 2024 15:34:36 GMT
- Title: Robot Goes Fishing: Rapid, High-Resolution Biological Hotspot Mapping in
Coral Reefs with Vision-Guided Autonomous Underwater Vehicles
- Authors: Daniel Yang, Levi Cai, Stewart Jamieson, Yogesh Girdhar
- Abstract summary: Biological hotspot detection can help coral reef managers prioritize limited resources for monitoring and intervention tasks.
Here, we explore the use of autonomous underwater vehicles (AUVs) with cameras, coupled with visual detectors and photogrammetry, to map and identify these hotspots.
To the best of our knowledge, we present one of the first attempts at using an AUV to gather visually-observed, fine-grain biological hotspot maps.
- Score: 6.658103076536836
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Coral reefs are fast-changing and complex ecosystems that are crucial to
monitor and study. Biological hotspot detection can help coral reef managers
prioritize limited resources for monitoring and intervention tasks. Here, we
explore the use of autonomous underwater vehicles (AUVs) with cameras, coupled
with visual detectors and photogrammetry, to map and identify these hotspots.
This approach can provide high spatial resolution information in fast feedback
cycles. To the best of our knowledge, we present one of the first attempts at
using an AUV to gather visually-observed, fine-grain biological hotspot maps in
concert with the topography of a coral reef. Our hotspot maps correlate with
rugosity, an established proxy metric for coral reef biodiversity and
abundance, as well as with our visual inspections of the 3D reconstruction. We
also investigate issues of scaling this approach when applied to new reefs by
using these visual detectors pre-trained on large public datasets.
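To make the hotspot-mapping idea concrete, the sketch below (illustrative only, not the authors' code) bins georeferenced fish detections into a grid to form a density map and checks its rank correlation against per-cell rugosity values taken from a photogrammetric reconstruction; the grid size, extent, and random inputs are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical inputs: (x, y) positions of fish detections projected onto the
# reef plane (e.g. AUV pose + visual detector output), and a per-cell rugosity
# grid derived from the photogrammetric 3D reconstruction of the same area.
rng = np.random.default_rng(0)
detections_xy = rng.uniform(0, 50, size=(500, 2))   # metres
rugosity = rng.uniform(1.0, 2.5, size=(25, 25))     # surface-to-planar area ratio

# Bin detections into the same 25 x 25 grid to form the hotspot (density) map.
edges = np.linspace(0, 50, 26)
hotspot_map, _, _ = np.histogram2d(detections_xy[:, 0], detections_xy[:, 1],
                                   bins=[edges, edges])

# Rank correlation between detection density and rugosity, cell by cell.
rho, p = spearmanr(hotspot_map.ravel(), rugosity.ravel())
print(f"Spearman rho = {rho:.3f} (p = {p:.3g})")
```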
Related papers
- The Coralscapes Dataset: Semantic Scene Understanding in Coral Reefs [4.096374910845255]
We release the first general-purpose dense semantic segmentation dataset for coral reefs, covering 2075 images, 39 benthic classes, and 174k segmentation masks annotated by experts.
We benchmark a wide range of semantic segmentation models, and find that transfer learning from Coralscapes to existing smaller datasets consistently leads to state-of-the-art performance.
Coralscapes will catalyze research on efficient, scalable, and standardized coral reef surveying methods based on computer vision, and holds the potential to streamline the development of underwater ecological robotics.
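As a hedged illustration of the transfer-learning setup described above (not the Coralscapes benchmark code), one could take an off-the-shelf torchvision segmentation model and swap its heads for the dataset's 39 benthic classes before fine-tuning; the model choice and input sizes below are assumptions.

```python
import torch
from torch import nn
from torchvision.models.segmentation import deeplabv3_resnet50

NUM_CLASSES = 39  # benthic classes reported for Coralscapes

# Start from pretrained weights and swap the classification heads.
model = deeplabv3_resnet50(weights="DEFAULT")
model.classifier[4] = nn.Conv2d(256, NUM_CLASSES, kernel_size=1)
model.aux_classifier[4] = nn.Conv2d(256, NUM_CLASSES, kernel_size=1)

# One illustrative fine-tuning step on a dummy batch (replace with real data).
images = torch.randn(2, 3, 512, 512)
masks = torch.randint(0, NUM_CLASSES, (2, 512, 512))
logits = model(images)["out"]                      # (2, 39, 512, 512)
loss = nn.functional.cross_entropy(logits, masks)
loss.backward()
```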
arXiv Detail & Related papers (2025-03-25T18:33:59Z)
- Image-Based Relocalization and Alignment for Long-Term Monitoring of Dynamic Underwater Environments [57.59857784298534]
We propose an integrated pipeline that combines Visual Place Recognition (VPR), feature matching, and image segmentation on video-derived images.
This method enables robust identification of revisited areas, estimation of rigid transformations, and downstream analysis of ecosystem changes.
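A minimal sketch of the visual place recognition step, assuming retrieval by cosine similarity over global image descriptors (the pipeline's actual descriptor network, matcher, and thresholds are not specified here):

```python
import numpy as np

def cosine_retrieval(query_desc, reference_descs, top_k=5):
    """Return indices and scores of the reference images most similar to the query.

    query_desc: (D,) global image descriptor (e.g. from a pretrained CNN/ViT).
    reference_descs: (N, D) descriptors of previously surveyed images.
    """
    q = query_desc / np.linalg.norm(query_desc)
    r = reference_descs / np.linalg.norm(reference_descs, axis=1, keepdims=True)
    sims = r @ q
    order = np.argsort(-sims)[:top_k]
    return order, sims[order]

# Toy example with random vectors standing in for real embeddings.
rng = np.random.default_rng(1)
refs = rng.normal(size=(1000, 256))
query = refs[42] + 0.05 * rng.normal(size=256)   # a slightly perturbed revisit
idx, scores = cosine_retrieval(query, refs)
print(idx[0])  # expected to be 42 in this toy case
```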
arXiv Detail & Related papers (2025-03-06T05:13:19Z)
- From underwater to aerial: a novel multi-scale knowledge distillation approach for coral reef monitoring [1.0644791181419937]
This study presents a novel multi-scale approach to coral reef monitoring, integrating fine-scale underwater imagery with medium-scale aerial imagery.
A transformer-based deep-learning model is trained on underwater images to detect the presence of 31 classes covering various coral morphotypes, associated fauna, and habitats.
The results show that the multi-scale methodology successfully extends fine-scale classification to larger reef areas, achieving a high degree of accuracy in predicting coral morphotypes and associated habitats.
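The cross-scale training idea can be illustrated with a standard soft-label knowledge distillation loss; this is a generic sketch, not the paper's exact objective, and the temperature, weighting, and batch size are assumptions (31 matches the class count mentioned above).

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with softened teacher/student KL divergence."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage on a 31-class problem (matching the class count mentioned above).
student = torch.randn(8, 31, requires_grad=True)
teacher = torch.randn(8, 31)
labels = torch.randint(0, 31, (8,))
distillation_loss(student, teacher, labels).backward()
```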
arXiv Detail & Related papers (2025-02-25T06:12:33Z)
- Automatic Coral Detection with YOLO: A Deep Learning Approach for Efficient and Accurate Coral Reef Monitoring [0.0]
Coral reefs are vital ecosystems that are under increasing threat due to local human impacts and climate change.
In this paper, we present an automatic coral detection system utilizing the You Only Look Once deep learning model.
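A minimal inference sketch using the Ultralytics YOLO API; the weights file and image path below are placeholders, since in practice the model would be fine-tuned on annotated coral imagery rather than the generic COCO checkpoint.

```python
from ultralytics import YOLO

# Placeholder checkpoint and image path; a real deployment would load weights
# fine-tuned on annotated coral imagery.
model = YOLO("yolov8n.pt")
results = model.predict("reef_transect_frame.jpg", conf=0.25)

for box in results[0].boxes:
    cls_id = int(box.cls.item())
    print(model.names[cls_id], float(box.conf.item()), box.xyxy.tolist())
```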
arXiv Detail & Related papers (2024-04-03T08:00:46Z)
- Scalable Semantic 3D Mapping of Coral Reefs with Deep Learning [4.8902950939676675]
This paper presents a new paradigm for mapping underwater environments from ego-motion video.
We show high-precision 3D semantic mapping at unprecedented scale with significantly reduced labor costs.
Our approach significantly scales up coral reef monitoring by taking a leap towards fully automatic analysis of video transects.
arXiv Detail & Related papers (2023-09-22T11:35:10Z)
- Learning Heavily-Degraded Prior for Underwater Object Detection [59.5084433933765]
This paper seeks transferable prior knowledge from detector-friendly images.
It is based on the statistical observation that the heavily degraded regions of detector-friendly underwater images (DFUI) and ordinary underwater images show evident feature-distribution gaps.
Despite running faster and using fewer parameters, our method still outperforms transformer-based detectors.
arXiv Detail & Related papers (2023-08-24T12:32:46Z)
- CUREE: A Curious Underwater Robot for Ecosystem Exploration [6.486523185975219]
The CUREE platform provides a unique set of capabilities in the form of robot behaviors and perception algorithms.
Examples of these capabilities include low-altitude visual surveys, soundscape surveys, habitat characterization, and animal following.
arXiv Detail & Related papers (2023-03-07T14:52:29Z)
- Reef-insight: A framework for reef habitat mapping with clustering methods via remote sensing [0.3670422696827526]
We present Reef-Insight, an unsupervised machine learning framework that features advanced clustering methods and remote sensing for reef habitat mapping.
Our framework compares different clustering methods for reef habitat mapping using remote sensing data.
Our results indicate that Reef-Insight can generate detailed reef habitat maps outlining distinct reef habitats.
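The clustering step can be sketched with scikit-learn by treating each remote-sensing pixel's band values as a feature vector; the band count and number of clusters below are assumptions, not Reef-Insight's settings.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical multispectral tile: height x width x bands.
rng = np.random.default_rng(0)
tile = rng.random((128, 128, 4))

# Cluster pixels into candidate habitat classes.
pixels = tile.reshape(-1, tile.shape[-1])
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(pixels)
habitat_map = labels.reshape(tile.shape[:2])
print(habitat_map.shape, np.unique(habitat_map))
```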
arXiv Detail & Related papers (2023-01-26T00:03:09Z)
- GAMMA: Generative Augmentation for Attentive Marine Debris Detection [0.0]
We propose an efficient and generative augmentation approach to solve the inadequacy concern of underwater debris data for visual detection.
We use CycleGAN as a data augmentation technique to convert openly available, abundant imagery of terrestrial plastic into underwater-style images.
We also propose a novel architecture for underwater debris detection using an attention mechanism.
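As a generic illustration of adding an attention mechanism to a detection backbone (not GAMMA's specific architecture), a squeeze-and-excitation style channel-attention block can be written as follows; the channel and reduction sizes are arbitrary.

```python
import torch
from torch import nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (generic, illustrative)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        weights = self.fc(x.mean(dim=(2, 3))).view(b, c, 1, 1)
        return x * weights  # reweight feature channels

features = torch.randn(2, 64, 32, 32)        # feature map from a detector backbone
print(ChannelAttention(64)(features).shape)  # torch.Size([2, 64, 32, 32])
```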
arXiv Detail & Related papers (2022-12-07T16:30:51Z)
- Deep object detection for waterbird monitoring using aerial imagery [56.1262568293658]
In this work, we present a deep learning pipeline that can be used to precisely detect, count, and monitor waterbirds using aerial imagery collected by a commercial drone.
By utilizing convolutional neural network-based object detectors, we show that we can detect 16 classes of waterbird species that are commonly found in colonial nesting islands along the Texas coast.
arXiv Detail & Related papers (2022-10-10T17:37:56Z)
- Towards Generating Large Synthetic Phytoplankton Datasets for Efficient Monitoring of Harmful Algal Blooms [77.25251419910205]
Harmful algal blooms (HABs) cause significant fish deaths in aquaculture farms.
Currently, the standard method to enumerate harmful algae and other phytoplankton is to manually observe and count them under a microscope.
We employ Generative Adversarial Networks (GANs) to generate synthetic images.
arXiv Detail & Related papers (2022-08-03T20:15:55Z)
- Machine Learning for Glacier Monitoring in the Hindu Kush Himalaya [54.12023102155757]
Glacier mapping is key to ecological monitoring in the Hindu Kush Himalaya (HKH) region.
Climate change poses a risk to individuals whose livelihoods depend on the health of glacier ecosystems.
We present a machine learning based approach to support ecological monitoring, with a focus on glaciers.
arXiv Detail & Related papers (2020-12-09T12:48:06Z)
- Movement Tracks for the Automatic Detection of Fish Behavior in Videos [63.85815474157357]
We offer a dataset of sablefish (Anoplopoma fimbria) startle behaviors in underwater videos, and investigate the use of deep learning (DL) methods for behavior detection on it.
Our proposed detection system identifies fish instances using DL-based frameworks, determines trajectory tracks, derives novel behavior-specific features, and employs Long Short-Term Memory (LSTM) networks to identify startle behavior in sablefish.
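The trajectory-to-behavior step can be sketched as a per-track sequence classifier: hand-crafted per-frame features feed an LSTM whose final hidden state is mapped to a startle/no-startle score. The feature count, hidden size, and sequence length below are assumptions, not the paper's configuration.

```python
import torch
from torch import nn

class StartleClassifier(nn.Module):
    """Classify a fish trajectory as startle / non-startle from per-frame features."""

    def __init__(self, n_features: int = 6, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, tracks: torch.Tensor) -> torch.Tensor:
        # tracks: (batch, time, n_features), e.g. speed, turning angle, acceleration.
        _, (h_n, _) = self.lstm(tracks)
        return self.head(h_n[-1]).squeeze(-1)   # one logit per track

# Toy batch: 4 tracks, 30 frames each, 6 features per frame.
logits = StartleClassifier()(torch.randn(4, 30, 6))
print(torch.sigmoid(logits))   # startle probabilities, shape (4,)
```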
arXiv Detail & Related papers (2020-11-28T05:51:19Z)
- Occupancy Anticipation for Efficient Exploration and Navigation [97.17517060585875]
We propose occupancy anticipation, where the agent uses its egocentric RGB-D observations to infer the occupancy state beyond the visible regions.
By exploiting context in both the egocentric views and top-down maps our model successfully anticipates a broader map of the environment.
Our approach is the winning entry in the 2020 Habitat PointNav Challenge.
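A toy sketch of the anticipation idea (not the challenge-winning model): a small convolutional network maps an egocentric occupancy estimate of the visible region to logits for a fuller local occupancy map, trained with a per-cell binary loss; the map size and channel layout are assumptions.

```python
import torch
from torch import nn

class OccupancyAnticipator(nn.Module):
    """Predict occupied/explored logits for cells beyond the visible region."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 1),             # channels: occupied, explored
        )

    def forward(self, visible_map: torch.Tensor) -> torch.Tensor:
        return self.net(visible_map)         # logits over the full local map

# Input: 2-channel egocentric map (occupied, explored) projected from RGB-D.
visible = torch.rand(8, 2, 64, 64)
target = (torch.rand(8, 2, 64, 64) > 0.5).float()   # dummy ground-truth map
logits = OccupancyAnticipator()(visible)
loss = nn.functional.binary_cross_entropy_with_logits(logits, target)
loss.backward()
```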
arXiv Detail & Related papers (2020-08-21T03:16:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.