Removing fluid lensing effects from spatial images
- URL: http://arxiv.org/abs/2211.07648v1
- Date: Mon, 14 Nov 2022 08:14:47 GMT
- Title: Removing fluid lensing effects from spatial images
- Authors: Greg Sabella
- Abstract summary: Shallow water and coastal aquatic ecosystems play a critical role in regulating and understanding Earth's changing climate and biodiversity.
Yet technology used for remote sensing (drones, UAVs, satellites) cannot produce detailed images of these ecosystems.
A proof-of-concept model was developed that is able to remove most fluid lensing effects and produce a clearer, more stable image.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Shallow water and coastal aquatic ecosystems such as coral reefs and seagrass
meadows play a critical role in regulating and understanding Earth's changing
climate and biodiversity. They also play an important role in protecting towns
and cities from erosion and storm surges. Yet technology used for remote
sensing (drones, UAVs, satellites) cannot produce detailed images of these
ecosystems. Fluid lensing effects, the distortions caused by surface waves and
light on underwater objects, are what makes the remote sensing of these
ecosystems a very challenging task. Using machine learning, a proof of concept
model was developed that is able to remove most of these effects and produce a
clearer, more stable image.
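The abstract does not specify the machine-learning model, but a classical, non-learning baseline for suppressing surface-wave flicker is a per-pixel temporal median over co-registered video frames. A minimal sketch of that baseline (an assumption for illustration, not the paper's method), assuming a stack of grayscale frames as NumPy arrays:

```python
import numpy as np

def temporal_median_stabilize(frames):
    """Per-pixel temporal median over a stack of co-registered frames.

    A classical, non-learning baseline for suppressing the flicker that
    surface waves (fluid lensing) add to each frame; not the paper's
    machine-learning model.

    frames: sequence of (H, W) arrays sampled over time.
    """
    stack = np.asarray(frames, dtype=np.float64)  # shape (T, H, W)
    return np.median(stack, axis=0)
```

With enough frames, wave-induced brightness fluctuations at each pixel are outvoted by the undistorted majority, yielding a more stable composite.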
Related papers
- Enhancing Marine Debris Acoustic Monitoring by Optical Flow-Based Motion Vector Analysis [0.0]
The paper proposes an optical flow-based method for marine debris monitoring.
The proposed method was validated through experiments conducted in a circulating water tank.
arXiv Detail & Related papers (2024-12-28T08:55:37Z)
- Floor extraction and door detection for visually impaired guidance [78.94595951597344]
Finding obstacle-free paths in unknown environments is a major navigation challenge for visually impaired people and autonomous robots.
New devices based on computer vision systems can help visually impaired people navigate safely in unknown environments.
This work proposes a combination of sensors and algorithms that can serve as the basis of a navigation system for visually impaired people.
arXiv Detail & Related papers (2024-01-30T14:38:43Z)
- A deep learning approach for marine snow synthesis and removal [55.86191108738564]
This paper proposes a novel method to reduce the marine snow interference using deep learning techniques.
We first synthesize realistic marine snow samples by training a Generative Adversarial Network (GAN) model.
We then train a U-Net model to perform marine snow removal as an image-to-image translation task.
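As a rough illustration of this synthesize-then-remove pipeline, the sketch below uses a bright-speckle model standing in for the GAN-based synthesis and a median filter standing in for the learned U-Net; both are hypothetical simplifications, not the paper's models:

```python
import numpy as np

def add_marine_snow(image, n_specks, rng):
    """Synthesize marine-snow-like bright specks (toy stand-in for the
    paper's GAN-based sample synthesis)."""
    noisy = image.copy()
    ys = rng.integers(0, image.shape[0], n_specks)
    xs = rng.integers(0, image.shape[1], n_specks)
    noisy[ys, xs] = 1.0  # saturate a few random pixels
    return noisy

def remove_marine_snow(image):
    """3x3 median filter as a classical stand-in for the learned
    image-to-image (U-Net) removal model."""
    padded = np.pad(image, 1, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (3, 3))
    return np.median(windows, axis=(2, 3))
```

The two functions mirror the paper's structure: one stage produces corrupted training samples, the other maps corrupted images back to clean ones.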
arXiv Detail & Related papers (2023-11-27T07:19:41Z)
- Unpaired Overwater Image Defogging Using Prior Map Guided CycleGAN [60.257791714663725]
We propose a Prior map Guided CycleGAN (PG-CycleGAN) for defogging of images with overwater scenes.
The proposed method outperforms the state-of-the-art supervised, semi-supervised, and unsupervised defogging approaches.
arXiv Detail & Related papers (2022-12-23T03:00:28Z)
- Seafloor-Invariant Caustics Removal from Underwater Imagery [0.0]
Caustics are complex physical phenomena resulting from light rays being refracted by the wavy water surface.
In this work, we propose a novel method for correcting the effects of caustics on shallow underwater imagery.
In particular, the developed method employs deep learning architectures to classify image pixels as "caustics" or "non-caustics".
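A toy stand-in for such a pixel classifier, using a simple brightness threshold rather than the paper's deep architecture (the threshold rule is an assumption for illustration only):

```python
import numpy as np

def label_caustic_pixels(image, z=2.0):
    """Flag pixels as 'caustics' when their brightness exceeds
    mean + z * std.  A toy threshold classifier standing in for the
    paper's deep pixel-wise classifier (hypothetical simplification)."""
    mu, sigma = float(image.mean()), float(image.std())
    return image > mu + z * sigma
```

The deep classifier replaces this hand-set threshold with a learned decision boundary per pixel.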
arXiv Detail & Related papers (2022-12-20T11:11:02Z)
- WaterNeRF: Neural Radiance Fields for Underwater Scenes [6.161668246821327]
We advance state-of-the-art in neural radiance fields (NeRFs) to enable physics-informed dense depth estimation and color correction.
Our proposed method, WaterNeRF, estimates parameters of a physics-based model for underwater image formation.
We can produce novel views of degraded as well as corrected underwater images, along with dense depth of the scene.
arXiv Detail & Related papers (2022-09-27T00:53:26Z)
- Towards Generating Large Synthetic Phytoplankton Datasets for Efficient Monitoring of Harmful Algal Blooms [77.25251419910205]
Harmful algal blooms (HABs) cause significant fish deaths in aquaculture farms.
Currently, the standard method to enumerate harmful algae and other phytoplankton is to manually observe and count them under a microscope.
We employ Generative Adversarial Networks (GANs) to generate synthetic images.
arXiv Detail & Related papers (2022-08-03T20:15:55Z)
- ClimateGAN: Raising Climate Change Awareness by Generating Images of Floods [89.61670857155173]
We present our solution to simulate photo-realistic floods on authentic images.
We propose ClimateGAN, a model that leverages both simulated and real data for unsupervised domain adaptation and conditional image generation.
arXiv Detail & Related papers (2021-10-06T15:54:57Z)
- Robustly Removing Deep Sea Lighting Effects for Visual Mapping of Abyssal Plains [3.566117940176302]
The majority of Earth's surface lies deep in the oceans, where no surface light reaches.
Visual mapping, including image matching and surface albedo estimation, severely suffers from the effects that co-moving light sources produce.
We present a practical approach to estimating and compensating for these lighting effects on predominantly homogeneous, flat seafloor regions.
arXiv Detail & Related papers (2021-10-01T15:28:07Z)
- Physics-informed GANs for Coastal Flood Visualization [65.54626149826066]
We create a deep learning pipeline that generates visual satellite images of current and future coastal flooding.
By evaluating the imagery relative to physics-based flood maps, we find that our proposed framework outperforms baseline models in both physical consistency and photorealism.
While this work focused on the visualization of coastal floods, we envision the creation of a global visualization of how climate change will shape our Earth.
arXiv Detail & Related papers (2020-10-16T02:15:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.