Cloud-Aware SAR Fusion for Enhanced Optical Sensing in Space Missions
- URL: http://arxiv.org/abs/2506.17885v1
- Date: Sun, 22 Jun 2025 03:27:41 GMT
- Title: Cloud-Aware SAR Fusion for Enhanced Optical Sensing in Space Missions
- Authors: Trong-An Bui, Thanh-Thoai Le
- Abstract summary: Cloud contamination significantly impairs the usability of optical satellite imagery. This research presents a Cloud-Attentive Reconstruction Framework that integrates SAR-optical feature fusion with deep learning-based image reconstruction.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cloud contamination significantly impairs the usability of optical satellite imagery, affecting critical applications such as environmental monitoring, disaster response, and land-use analysis. This research presents a Cloud-Attentive Reconstruction Framework that integrates SAR-optical feature fusion with deep learning-based image reconstruction to generate cloud-free optical imagery. The proposed framework employs an attention-driven feature fusion mechanism to align complementary structural information from Synthetic Aperture Radar (SAR) with spectral characteristics from optical data. Furthermore, a cloud-aware model update strategy introduces adaptive loss weighting to prioritize cloud-occluded regions, enhancing reconstruction accuracy. Experimental results demonstrate that the proposed method outperforms existing approaches, achieving a PSNR of 31.01 dB, SSIM of 0.918, and MAE of 0.017. These outcomes highlight the framework's effectiveness in producing high-fidelity, spatially and spectrally consistent cloud-free optical images.
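The abstract does not include implementation details, so the following PyTorch sketch is only one plausible reading of the two mechanisms it names: an attention-driven fusion that injects SAR structural information into optical features, and a cloud-aware loss that up-weights cloud-occluded pixels. All class, function, and parameter names (`CrossModalFusion`, `cloud_aware_l1`, `cloud_weight`) are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only -- the paper does not publish this code.
# Assumed shapes: optical and SAR feature maps (B, C, H, W),
# cloud mask (B, 1, H, W) with 1.0 marking cloud-occluded pixels.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossModalFusion(nn.Module):
    """Hypothetical attention-driven fusion of SAR and optical feature maps."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels, kernel_size=1)  # queries from optical features
        self.key = nn.Conv2d(channels, channels, kernel_size=1)    # keys from SAR features
        self.value = nn.Conv2d(channels, channels, kernel_size=1)  # values from SAR features
        self.out = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, opt_feat, sar_feat):
        b, c, h, w = opt_feat.shape
        q = self.query(opt_feat).flatten(2).transpose(1, 2)   # (B, HW, C)
        k = self.key(sar_feat).flatten(2)                      # (B, C, HW)
        v = self.value(sar_feat).flatten(2).transpose(1, 2)    # (B, HW, C)
        attn = torch.softmax(q @ k / c ** 0.5, dim=-1)         # (B, HW, HW) attention map
        fused = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return opt_feat + self.out(fused)                      # residual: optical spectra + SAR structure


def cloud_aware_l1(pred, target, cloud_mask, cloud_weight=5.0):
    """Adaptive loss weighting: cloud-occluded pixels contribute more than clear ones.

    cloud_weight is an assumed hyperparameter; the abstract only states that
    occluded regions are prioritised, not the exact weighting scheme.
    """
    per_pixel = F.l1_loss(pred, target, reduction="none")                    # (B, C, H, W)
    weights = (1.0 + (cloud_weight - 1.0) * cloud_mask).expand_as(per_pixel)
    return (weights * per_pixel).sum() / weights.sum()
```

Note that the full HW x HW attention map is only affordable on small feature maps or patches; the paper's actual fusion mechanism may well differ.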
Related papers
- STAR: A Benchmark for Astronomical Star Fields Super-Resolution [51.79340280382437]
We propose STAR, a large-scale astronomical SR dataset containing 54,738 flux-consistent star field image pairs. We also propose a Flux-Invariant Super Resolution (FISR) model that can accurately infer flux-consistent high-resolution images from input photometry.
arXiv Detail & Related papers (2025-07-22T09:28:28Z) - High-Quality Cloud-Free Optical Image Synthesis Using Multi-Temporal SAR and Contaminated Optical Data [1.5410557873153836]
This paper tackles the challenge of synthesizing missing optical data, particularly in complex scenarios with cloud cover. We propose CR SynthNet, a novel image synthesis network that incorporates innovatively designed modules such as the DownUp Block and Fusion Attention to enhance accuracy. Experimental results validate the effectiveness of CR SynthNet, demonstrating substantial improvements in restoring structural details, preserving spectral consistency, and achieving visual quality that far exceeds that of comparison methods.
arXiv Detail & Related papers (2025-04-23T16:44:53Z) - Enhanced Confocal Laser Scanning Microscopy with Adaptive Physics Informed Deep Autoencoders [0.0]
We present a physics-informed deep learning framework to address limitations in Confocal Laser Scanning Microscopy. The model reconstructs high-fidelity images from heavily noisy inputs by using convolutional and transposed convolutional layers.
arXiv Detail & Related papers (2025-01-24T18:32:34Z) - Cloud Removal With PolSAR-Optical Data Fusion Using A Two-Flow Residual Network [9.529237717137121]
Reconstructing cloud-free optical images has become a major task in recent years. This paper presents a two-flow Polarimetric Synthetic Aperture Radar (PolSAR)-Optical data fusion cloud removal algorithm; a generic two-branch fusion sketch is given after this list.
arXiv Detail & Related papers (2025-01-14T07:35:14Z) - Rethinking High-speed Image Reconstruction Framework with Spike Camera [48.627095354244204]
Spike cameras generate continuous spike streams to capture high-speed scenes with lower bandwidth and higher dynamic range than traditional RGB cameras. We introduce SpikeCLIP, a novel spike-to-image reconstruction framework that goes beyond traditional training paradigms. Our experiments on real-world low-light datasets demonstrate that SpikeCLIP significantly enhances texture details and the luminance balance of recovered images.
arXiv Detail & Related papers (2025-01-08T13:00:17Z) - Physics-Inspired Degradation Models for Hyperspectral Image Fusion [61.743696362028246]
Most fusion methods solely focus on the fusion algorithm itself and overlook the degradation models.
We propose physics-inspired degradation models (PIDM) to model the degradation of LR-HSI and HR-MSI.
Our proposed PIDM can boost the fusion performance of existing fusion methods in practical scenarios.
arXiv Detail & Related papers (2024-02-04T09:07:28Z) - Diffusion Enhancement for Cloud Removal in Ultra-Resolution Remote Sensing Imagery [48.14610248492785]
Cloud layers severely compromise the quality and effectiveness of optical remote sensing (RS) images.
Existing deep-learning (DL)-based Cloud Removal (CR) techniques encounter difficulties in accurately reconstructing the original visual authenticity and detailed semantic content of the images.
This work proposes enhancements at the data and methodology fronts to tackle this challenge.
arXiv Detail & Related papers (2024-01-25T13:14:17Z) - Physics-Driven Turbulence Image Restoration with Stochastic Refinement [80.79900297089176]
Image distortion by atmospheric turbulence is a critical problem in long-range optical imaging systems.
Fast and physics-grounded simulation tools have been introduced to help the deep-learning models adapt to real-world turbulence conditions.
This paper proposes the Physics-integrated Restoration Network (PiRN) to help the network disentangle the stochasticity from the degradation and the underlying image.
arXiv Detail & Related papers (2023-07-20T05:49:21Z) - SiNeRF: Sinusoidal Neural Radiance Fields for Joint Pose Estimation and Scene Reconstruction [147.9379707578091]
NeRFmm is a Neural Radiance Fields (NeRF) variant that deals with joint optimization tasks.
Although NeRFmm produces precise scene synthesis and pose estimations, it still struggles to outperform the fully annotated baseline on challenging scenes.
We propose Sinusoidal Neural Radiance Fields (SiNeRF) that leverage sinusoidal activations for radiance mapping and a novel Mixed Region Sampling (MRS) for selecting ray batches efficiently.
arXiv Detail & Related papers (2022-10-10T10:47:51Z) - SAR Despeckling using a Denoising Diffusion Probabilistic Model [52.25981472415249]
The presence of speckle degrades the image quality and adversely affects the performance of SAR image understanding applications.
We introduce SAR-DDPM, a denoising diffusion probabilistic model for SAR despeckling.
The proposed method achieves significant improvements in both quantitative and qualitative results over the state-of-the-art despeckling methods.
arXiv Detail & Related papers (2022-06-09T14:00:26Z) - Exploring the Potential of SAR Data for Cloud Removal in Optical Satellite Imagery [41.40522618945897]
We propose a novel global-local fusion-based cloud removal (GLF-CR) algorithm to leverage the complementary information embedded in SAR images.
The proposed algorithm can yield high-quality cloud-free images and performs favorably against state-of-the-art cloud removal algorithms.
arXiv Detail & Related papers (2022-06-06T18:53:19Z)
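The two-flow PolSAR-Optical entry above describes a two-branch SAR-optical fusion network but not its architecture. The sketch below is a minimal, generic two-branch residual fusion model under assumed band counts and layer widths (`TwoFlowFusion` and `ResidualBlock` are hypothetical names); it illustrates the general pattern shared by several of the SAR-guided cloud removal papers listed here, not any specific published implementation.

```python
# Illustrative two-branch (two-flow) SAR-optical fusion sketch; layer counts,
# widths, band counts, and class names are assumptions for illustration.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)


class TwoFlowFusion(nn.Module):
    """One SAR flow and one (cloudy) optical flow, fused to predict a cloud-free image."""

    def __init__(self, sar_bands=2, opt_bands=4, width=64, n_blocks=4):
        super().__init__()
        self.sar_head = nn.Conv2d(sar_bands, width, 3, padding=1)
        self.opt_head = nn.Conv2d(opt_bands, width, 3, padding=1)
        self.sar_flow = nn.Sequential(*[ResidualBlock(width) for _ in range(n_blocks)])
        self.opt_flow = nn.Sequential(*[ResidualBlock(width) for _ in range(n_blocks)])
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * width, width, 1),   # merge the two flows
            ResidualBlock(width),
            nn.Conv2d(width, opt_bands, 3, padding=1),  # reconstruct optical bands
        )

    def forward(self, sar, optical):
        s = self.sar_flow(self.sar_head(sar))
        o = self.opt_flow(self.opt_head(optical))
        return self.fuse(torch.cat([s, o], dim=1))


# Usage with dummy tensors (batch of 1, 128x128 tiles):
model = TwoFlowFusion()
sar = torch.randn(1, 2, 128, 128)       # e.g. two SAR backscatter channels
optical = torch.randn(1, 4, 128, 128)   # e.g. cloudy RGB-NIR bands
cloud_free = model(sar, optical)        # (1, 4, 128, 128)
```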