Single-Temporal Supervised Learning for Universal Remote Sensing Change Detection
- URL: http://arxiv.org/abs/2406.15694v1
- Date: Sat, 22 Jun 2024 00:03:21 GMT
- Title: Single-Temporal Supervised Learning for Universal Remote Sensing Change Detection
- Authors: Zhuo Zheng, Yanfei Zhong, Ailong Ma, Liangpei Zhang
- Abstract summary: We propose single-temporal supervised learning (STAR) for universal remote sensing change detection.
STAR enables us to train a high-accuracy change detector using only unpaired labeled images.
ChangeStar2 achieves state-of-the-art performance on eight public remote sensing change detection datasets.
- Score: 21.622442722863028
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The bitemporal supervised learning paradigm dominates remote sensing change detection, relying on numerous labeled bitemporal image pairs, especially for high spatial resolution (HSR) remote sensing imagery. However, labeling change regions in large-scale bitemporal HSR remote sensing image pairs is very expensive and labor-intensive. In this paper, we propose single-temporal supervised learning (STAR) for universal remote sensing change detection from a new perspective: exploiting changes between unpaired images as supervisory signals. STAR enables us to train a high-accuracy change detector using only unpaired labeled images, and the resulting detector generalizes to real-world bitemporal image pairs. To demonstrate the flexibility and scalability of STAR, we design a simple yet unified change detector, termed ChangeStar2, capable of addressing binary change detection, object change detection, and semantic change detection in one architecture. ChangeStar2 achieves state-of-the-art performance on eight public remote sensing change detection datasets, covering the above two supervised settings, multiple change types, and multiple scenarios. The code is available at https://github.com/Z-Zheng/pytorch-change-models.
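The core STAR idea of mining supervision from unpaired images can be sketched with a toy pseudo-label routine. This is a minimal illustration, not the paper's implementation; the `pseudo_change_label` helper name and the exact XOR rule are assumptions for exposition. Pairing two unrelated single-temporal images and combining their semantic masks yields a pixel-wise "change" label without any bitemporal annotation:

```python
import numpy as np

def pseudo_change_label(mask_a, mask_b):
    """Hypothetical STAR-style supervision sketch: given semantic masks from
    two *unpaired* single-temporal images, treat a pixel as "changed" iff
    exactly one of the two masks marks it (logical XOR)."""
    return np.logical_xor(mask_a.astype(bool), mask_b.astype(bool))

# Two unrelated building masks (1 = building) stand in for t1/t2 imagery.
m1 = np.array([[1, 1, 0],
               [0, 0, 0]])
m2 = np.array([[1, 0, 0],
               [0, 1, 0]])
label = pseudo_change_label(m1, m2)
# label marks pixels covered by exactly one of the two masks.
```

Under this sketch, any pair of labeled single-temporal images becomes a pseudo-bitemporal training sample, which is why unpaired labels suffice.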
Related papers
- Show Me What and Where has Changed? Question Answering and Grounding for Remote Sensing Change Detection [82.65760006883248]
We introduce a new task named Change Detection Question Answering and Grounding (CDQAG).
CDQAG extends the traditional change detection task by providing interpretable textual answers and intuitive visual evidence.
We construct the first CDQAG benchmark dataset, termed QAG-360K, comprising over 360K triplets of questions, textual answers, and corresponding high-quality visual masks.
arXiv Detail & Related papers (2024-10-31T11:20:13Z)
- Enhancing Perception of Key Changes in Remote Sensing Image Change Captioning
We propose KCFI, a novel framework for remote sensing image change captioning guided by key change features and instruction tuning.
KCFI includes a ViTs encoder for extracting bi-temporal remote sensing image features, a key feature perceiver for identifying critical change areas, and a pixel-level change detection decoder.
To validate the effectiveness of our approach, we compare it against several state-of-the-art change captioning methods on the LEVIR-CC dataset.
arXiv Detail & Related papers (2024-09-19T09:33:33Z)
- Changen2: Multi-Temporal Remote Sensing Generative Change Foundation Model
We present change data generators based on generative models which are cheap and automatic.
Changen2 is a generative change foundation model that can be trained at scale via self-supervision.
The resulting model possesses inherent zero-shot change detection capabilities and excellent transferability.
arXiv Detail & Related papers (2024-06-26T01:03:39Z)
- ChangeBind: A Hybrid Change Encoder for Remote Sensing Change Detection [16.62779899494721]
Change detection (CD) is a fundamental task in remote sensing (RS) which aims to detect the semantic changes between the same geographical regions at different time stamps.
We propose an effective Siamese-based framework to encode the semantic changes occurring in the bi-temporal RS images.
arXiv Detail & Related papers (2024-04-26T17:47:14Z)
- Segment Any Change [64.23961453159454]
We propose a new type of change detection model that supports zero-shot prediction and generalization on unseen change types and data distributions.
AnyChange is built on the segment anything model (SAM) via our training-free adaptation method, bitemporal latent matching.
We also propose a point query mechanism to enable AnyChange's zero-shot object-centric change detection capability.
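The entry above describes a training-free matching rule. A toy version is sketched below; all names, the 2-D vectors, and the threshold are illustrative assumptions (the actual AnyChange operates on SAM mask embeddings). A region is flagged as changed when its best cross-temporal embedding similarity is low:

```python
import numpy as np

def cosine_sim(u, v):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def match_changes(emb_t1, emb_t2, threshold=0.5):
    """Illustrative sketch (not the paper's code): per-region embeddings from
    each timestamp are compared; a t1 region whose best cross-temporal cosine
    similarity falls below `threshold` is flagged as changed."""
    changed = []
    for i, e1 in enumerate(emb_t1):
        best = max(cosine_sim(e1, e2) for e2 in emb_t2)
        if best < threshold:
            changed.append(i)
    return changed

# Toy 2-D embeddings: region 0 persists across time, region 1 does not.
t1 = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
t2 = [np.array([0.9, 0.1])]
# match_changes(t1, t2) -> [1]
```

Because the rule needs no gradient updates, it can be bolted onto a frozen segmentation backbone, which is what makes the zero-shot setting possible.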
arXiv Detail & Related papers (2024-02-02T07:17:39Z)
- TransY-Net: Learning Fully Transformer Networks for Change Detection of Remote Sensing Images [64.63004710817239]
We propose a novel Transformer-based learning framework named TransY-Net for remote sensing image CD.
It improves the feature extraction from a global view and combines multi-level visual features in a pyramid manner.
Our proposed method achieves a new state-of-the-art performance on four optical and two SAR image CD benchmarks.
arXiv Detail & Related papers (2023-10-22T07:42:19Z)
- Self-Pair: Synthesizing Changes from Single Source for Object Change Detection in Remote Sensing Imagery [6.586756080460231]
We train a change detector using two spatially unrelated images with corresponding semantic labels such as building.
We show that manipulating the source image as an after-image is crucial to the performance of change detection.
Our method outperforms existing methods based on single-temporal supervision.
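A hedged sketch of this single-source pairing idea follows; the erase-to-synthesize rule and the `self_pair` helper are assumptions for illustration, and the paper's actual after-image manipulation is more elaborate:

```python
import numpy as np

def self_pair(image, mask):
    """Toy Self-Pair-style sample: the source image is the 'before' view,
    an 'after' view is synthesized by erasing the masked objects, and the
    erased pixels become the change label. (Illustrative assumption only.)"""
    after = image.copy()
    after[mask.astype(bool)] = 0   # fake a demolished building
    change = mask.astype(bool)     # supervision: where before != after
    return image, after, change

# A flat 2x2 "image" with one labeled building pixel.
img = np.full((2, 2), 7)
msk = np.array([[1, 0],
                [0, 0]])
before, after, change = self_pair(img, msk)
```

The point of the sketch is that both views derive from one image, so the change label is exact by construction rather than hand-annotated.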
arXiv Detail & Related papers (2022-12-20T13:26:42Z)
- Dual-UNet: A Novel Siamese Network for Change Detection with Cascade Differential Fusion [4.651756476458979]
We propose a novel Siamese neural network for change detection task, namely Dual-UNet.
In contrast to previous methods that encode the bitemporal images individually, we design an encoder differential-attention module to focus on the spatial difference relationships of pixels.
Experiments demonstrate that the proposed approach consistently outperforms the most advanced methods on popular seasonal change detection datasets.
arXiv Detail & Related papers (2022-08-12T14:24:09Z)
- Change is Everywhere: Single-Temporal Supervised Object Change Detection in Remote Sensing Imagery [23.620965329033716]
We propose single-temporal supervised learning (STAR) for change detection from a new perspective.
We train a high-accuracy change detector using only unpaired labeled images and generalize to real-world bitemporal images.
ChangeStar outperforms the baseline with a large margin under single-temporal supervision and achieves superior performance under bitemporal supervision.
arXiv Detail & Related papers (2021-08-16T10:25:15Z)
- Self-supervised Multisensor Change Detection [14.191073951237772]
We revisit multisensor analysis in the context of self-supervised change detection in bi-temporal satellite images.
Recent developments in self-supervised learning have shown that some methods can work with only a few images.
Motivated by this, in this work we propose a method for multi-sensor change detection using only the unlabeled target bi-temporal images.
arXiv Detail & Related papers (2021-02-12T12:31:10Z)
- D-Unet: A Dual-encoder U-Net for Image Splicing Forgery Detection and Localization [108.8592577019391]
Image splicing forgery detection is a global binary classification task that distinguishes the tampered and non-tampered regions by image fingerprints.
We propose a novel network called dual-encoder U-Net (D-Unet) for image splicing forgery detection, which employs an unfixed encoder and a fixed encoder.
In an experimental comparison study of D-Unet and state-of-the-art methods, D-Unet outperformed the other methods in image-level and pixel-level detection.
arXiv Detail & Related papers (2020-12-03T10:54:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.