Self-supervised Multisensor Change Detection
- URL: http://arxiv.org/abs/2103.05102v1
- Date: Fri, 12 Feb 2021 12:31:10 GMT
- Title: Self-supervised Multisensor Change Detection
- Authors: Sudipan Saha, Patrick Ebel, Xiao Xiang Zhu
- Abstract summary: We revisit multisensor analysis in context of self-supervised change detection in bi-temporal satellite images.
Recent development of self-supervised learning methods has shown that some of them can even work with only few images.
Motivated by this, in this work we propose a method for multi-sensor change detection using only the unlabeled target bi-temporal images.
- Score: 14.191073951237772
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multimodal and multisensor data analysis is a long-standing goal in machine
learning research. In this paper we revisit multisensor analysis in the context
of self-supervised change detection in bi-temporal satellite images. Most change
detection methods assume that the pre-change and post-change images are acquired
by the same sensor. However, in many real-life scenarios, e.g., natural
disasters, it is more practical to use the latest available images before and
after the incident, and these may have been acquired by different sensors. In
particular, we are interested in combining images acquired by optical and
Synthetic Aperture Radar (SAR) sensors. While optical images resemble the
natural images dealt with in computer vision, SAR images appear vastly different
even when they capture the same scene. Adding to this, change detection methods
are often constrained to use only the target image pair, with no labeled data
and no additional unlabeled data. Such constraints limit the scope of
traditional supervised machine learning and unsupervised generative approaches
for multi-sensor change detection. The recent rapid development of
self-supervised learning has shown that some methods can work with only a few
images. Motivated by this, we propose a method for multi-sensor change detection
that uses only the unlabeled target bi-temporal images, training a network in a
self-supervised fashion via deep clustering and contrastive learning. The
trained network is evaluated on multi-modal satellite data exhibiting change,
and the benefits of our self-supervised approach are demonstrated.
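As a rough illustration of the training setup described in the abstract, the sketch below contrastively trains two sensor-specific encoders on a single unlabeled optical/SAR pair. It is not the authors' released implementation: the encoder architecture, the pixel-level InfoNCE sampling, the assumption that most pixels are unchanged, and the omission of the deep clustering step are all simplifying assumptions.

```python
# Minimal sketch (not the authors' code): contrastive training of two
# sensor-specific encoders on one unlabeled bi-temporal optical/SAR pair.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    """Tiny fully convolutional encoder mapping an image to unit-norm per-pixel features."""
    def __init__(self, in_ch, feat_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 1),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=1)

def pixel_info_nce(f1, f2, tau=0.1, n_samples=512):
    """Contrastive loss over pixel features: the same location in the two images
    is a positive pair (assumes most pixels are unchanged), other sampled
    locations act as negatives."""
    b, c, h, w = f1.shape
    f1 = f1.permute(0, 2, 3, 1).reshape(-1, c)
    f2 = f2.permute(0, 2, 3, 1).reshape(-1, c)
    idx = torch.randperm(f1.shape[0])[:n_samples]      # subsample pixel locations
    z1, z2 = f1[idx], f2[idx]
    logits = z1 @ z2.t() / tau                         # pairwise similarities
    target = torch.arange(z1.shape[0])                 # diagonal entries are positives
    return F.cross_entropy(logits, target)

# Assumed inputs: one co-registered pre-change optical image (3 bands) and one
# post-change SAR image (2 bands, e.g. VV/VH), already normalized; random data here.
optical = torch.rand(1, 3, 128, 128)
sar = torch.rand(1, 2, 128, 128)

enc_optical, enc_sar = SmallEncoder(in_ch=3), SmallEncoder(in_ch=2)
optim = torch.optim.Adam(list(enc_optical.parameters()) + list(enc_sar.parameters()), lr=1e-3)

for step in range(200):                                # self-supervised training on the target pair only
    loss = pixel_info_nce(enc_optical(optical), enc_sar(sar))
    optim.zero_grad()
    loss.backward()
    optim.step()

# A simple change score: per-pixel dissimilarity between the two sensors' features.
with torch.no_grad():
    change_score = 1.0 - (enc_optical(optical) * enc_sar(sar)).sum(dim=1)  # 1 - cosine similarity
```

After training, the per-pixel dissimilarity between the two feature maps can be thresholded or clustered into a binary change map.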
Related papers
- Zero-Shot Detection of AI-Generated Images [54.01282123570917]
We propose a zero-shot entropy-based detector (ZED) to detect AI-generated images.
Inspired by recent works on machine-generated text detection, our idea is to measure how surprising the image under analysis is compared to a model of real images.
ZED achieves an average improvement of more than 3% over the SoTA in terms of accuracy.
arXiv Detail & Related papers (2024-09-24T08:46:13Z)
- Single-Temporal Supervised Learning for Universal Remote Sensing Change Detection [21.622442722863028]
We propose single-temporal supervised learning (STAR) for universal remote sensing change detection.
STAR enables us to train a high-accuracy change detector using only unpaired labeled images.
ChangeStar2 achieves state-of-the-art performance on eight public remote sensing change detection datasets.
arXiv Detail & Related papers (2024-06-22T00:03:21Z)
- Exchange means change: an unsupervised single-temporal change detection framework based on intra- and inter-image patch exchange [44.845959222180866]
We propose an unsupervised single-temporal CD framework based on intra- and inter-image patch exchange (I3PE).
The I3PE framework allows for training deep change detectors on unpaired and unlabeled single-temporal remote sensing images.
I3PE outperforms representative unsupervised approaches and achieves F1 improvements of 10.65% and 6.99% over the SOTA method (a toy sketch of the patch-exchange idea follows this entry).
arXiv Detail & Related papers (2023-10-01T14:50:54Z)
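A toy illustration of the patch-exchange idea summarized above. This is a simplified stand-in rather than the authors' I3PE implementation: it shows only the intra-image variant, and the patch size and sampling scheme are illustrative assumptions.

```python
# Toy intra-image patch exchange: synthesize a pseudo "post-change" image and
# a change mask from a single unlabeled image.
import numpy as np

def intra_image_exchange(img, patch=32, rng=None):
    """Swap two non-overlapping random patches inside one image.
    Returns the pseudo post-change image and the binary change mask."""
    if rng is None:
        rng = np.random.default_rng()
    h, w = img.shape[:2]
    out, mask = img.copy(), np.zeros((h, w), dtype=np.uint8)
    while True:                                    # resample until the two patches do not overlap
        (y1, x1), (y2, x2) = rng.integers(0, [h - patch, w - patch], size=(2, 2))
        if abs(y1 - y2) >= patch or abs(x1 - x2) >= patch:
            break
    a = img[y1:y1 + patch, x1:x1 + patch].copy()
    b = img[y2:y2 + patch, x2:x2 + patch].copy()
    out[y1:y1 + patch, x1:x1 + patch] = b          # exchanged regions become "changed" pixels
    out[y2:y2 + patch, x2:x2 + patch] = a
    mask[y1:y1 + patch, x1:x1 + patch] = 1
    mask[y2:y2 + patch, x2:x2 + patch] = 1
    return out, mask

# Each single-temporal image yields a (pre, post, mask) triplet that can train
# an ordinary supervised change detector without any real change labels.
pre = np.random.rand(128, 128, 3).astype(np.float32)   # stand-in for one satellite tile
post, change_mask = intra_image_exchange(pre)
```

The inter-image variant exchanges patches between two unrelated single-temporal images in the same way; in both cases the exchanged regions provide free change labels for supervised training.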
- Self-Pair: Synthesizing Changes from Single Source for Object Change Detection in Remote Sensing Imagery [6.586756080460231]
We train a change detector using two spatially unrelated images with corresponding semantic labels such as buildings.
We show that manipulating the source image as an after-image is crucial to the performance of change detection.
Our method outperforms existing methods based on single-temporal supervision.
arXiv Detail & Related papers (2022-12-20T13:26:42Z)
- Dual-UNet: A Novel Siamese Network for Change Detection with Cascade Differential Fusion [4.651756476458979]
We propose a novel Siamese neural network for the change detection task, namely Dual-UNet.
In contrast to previous approaches that encode the bi-temporal images individually, we design an encoder differential-attention module to focus on the spatial difference relationships of pixels.
Experiments demonstrate that the proposed approach consistently outperforms the most advanced methods on popular seasonal change detection datasets.
arXiv Detail & Related papers (2022-08-12T14:24:09Z)
- Revisiting Consistency Regularization for Semi-supervised Change Detection in Remote Sensing Images [60.89777029184023]
We propose a semi-supervised CD model in which we formulate an unsupervised CD loss in addition to the supervised Cross-Entropy (CE) loss.
Experiments conducted on two publicly available CD datasets show that the proposed semi-supervised CD method can approach the performance of supervised CD (a minimal sketch of the loss combination follows this entry).
arXiv Detail & Related papers (2022-04-18T17:59:01Z)
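A minimal sketch of the loss combination referenced in the entry above, assuming a generic consistency-style unsupervised term; the paper's actual unsupervised CD loss is formulated differently, so the function name, variable names, and weighting below are illustrative only.

```python
# Generic semi-supervised loss sketch: supervised CE on labeled pairs plus an
# unsupervised consistency term on unlabeled pairs (illustrative, not the paper's loss).
import torch
import torch.nn.functional as F

def semi_supervised_cd_loss(logits_labeled, labels, logits_unlab, logits_unlab_aug, lam=0.5):
    """Supervised cross-entropy plus a consistency term that keeps predictions for
    an unlabeled pair stable under perturbation/augmentation."""
    ce = F.cross_entropy(logits_labeled, labels)                    # supervised CE term
    target = F.softmax(logits_unlab, dim=1).detach()                # pseudo target with stop-gradient
    cons = F.mse_loss(F.softmax(logits_unlab_aug, dim=1), target)   # unsupervised consistency term
    return ce + lam * cons

# Assumed shapes: (batch, 2, H, W) change/no-change logits and (batch, H, W) integer labels.
logits_l = torch.randn(2, 2, 64, 64, requires_grad=True)
labels = torch.randint(0, 2, (2, 64, 64))
logits_u = torch.randn(2, 2, 64, 64, requires_grad=True)
logits_u_aug = logits_u + 0.1 * torch.randn_like(logits_u)          # stand-in for a perturbed forward pass
loss = semi_supervised_cd_loss(logits_l, labels, logits_u, logits_u_aug)
loss.backward()
```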
- Learning Enriched Illuminants for Cross and Single Sensor Color Constancy [182.4997117953705]
We propose a cross-sensor self-supervised training scheme.
We train the network by randomly sampling the artificial illuminants in a sensor-independent manner.
Experiments show that our cross-sensor model and single-sensor model outperform other state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2022-03-21T15:45:35Z)
- CD-GAN: a robust fusion-based generative adversarial network for unsupervised remote sensing change detection with heterogeneous sensors [15.284275261487114]
This paper proposes a novel unsupervised change detection method dedicated to images acquired by heterogeneous optical sensors.
It capitalizes on recent advances that formulate the change detection task within a robust fusion framework.
A comparison with state-of-the-art change detection methods demonstrates the versatility and the effectiveness of the proposed approach.
arXiv Detail & Related papers (2022-03-02T08:58:06Z)
- Seasonal Contrast: Unsupervised Pre-Training from Uncurated Remote Sensing Data [64.40187171234838]
Seasonal Contrast (SeCo) is an effective pipeline to leverage unlabeled data for in-domain pre-training of remote sensing representations.
SeCo will be made public to facilitate transfer learning and enable rapid progress in remote sensing applications.
arXiv Detail & Related papers (2021-03-30T18:26:39Z)
- D-Unet: A Dual-encoder U-Net for Image Splicing Forgery Detection and Localization [108.8592577019391]
Image splicing forgery detection is a global binary classification task that distinguishes the tampered and non-tampered regions by image fingerprints.
We propose a novel network called dual-encoder U-Net (D-Unet) for image splicing forgery detection, which employs an unfixed encoder and a fixed encoder.
In an experimental comparison study of D-Unet and state-of-the-art methods, D-Unet outperformed the other methods in image-level and pixel-level detection.
arXiv Detail & Related papers (2020-12-03T10:54:02Z)
- Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition [131.6328804788164]
We propose a framework, named Semantics-aware Adaptive Knowledge Distillation Networks (SAKDN), to enhance action recognition in the vision-sensor modality (videos).
The SAKDN uses multiple wearable sensors as teacher modalities and RGB videos as the student modality.
arXiv Detail & Related papers (2020-09-01T03:38:31Z)