STNet: Spatial and Temporal feature fusion network for change detection
in remote sensing images
- URL: http://arxiv.org/abs/2304.11422v1
- Date: Sat, 22 Apr 2023 14:40:41 GMT
- Title: STNet: Spatial and Temporal feature fusion network for change detection
in remote sensing images
- Authors: Xiaowen Ma, Jiawei Yang, Tingfeng Hong, Mengting Ma, Ziyan Zhao, Tian
Feng and Wei Zhang
- Abstract summary: We propose STNet, a remote sensing change detection network based on spatial and temporal feature fusion.
Experimental results on three benchmark datasets for RSCD demonstrate that the proposed method achieves state-of-the-art performance.
- Score: 5.258365841490956
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As an important task in remote sensing image analysis, remote sensing
change detection (RSCD) aims to identify changes of interest in a region from
spatially co-registered multi-temporal remote sensing images, so as to monitor
local development. Existing RSCD methods usually formulate RSCD as a binary
classification task, representing changes of interest by mere feature
concatenation or feature subtraction and recovering spatial details via densely
connected change representations, which leaves room for improvement. In this
paper, we propose STNet, an RSCD network based on spatial and temporal feature
fusion. Specifically, we design a temporal feature fusion (TFF) module that
combines bi-temporal features using a cross-temporal gating mechanism to
emphasize changes of interest, and a spatial feature fusion module that
captures fine-grained information using a cross-scale attention mechanism to
recover the spatial details of change representations. Experimental results on
three benchmark datasets for RSCD demonstrate that the proposed method achieves
state-of-the-art performance. Code is available at
https://github.com/xwmaxwma/rschange.
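
For illustration, the sketch below shows one way a cross-temporal gating fusion block in the spirit of the TFF module could be written in PyTorch. It is an assumption-based sketch, not the authors' implementation (which is available at the repository above); the class name, layer sizes, and exact gating formulation are hypothetical.

```python
# Illustrative sketch only: a minimal cross-temporal gating fusion block,
# NOT the authors' TFF implementation. Layer sizes and the gating
# formulation are assumptions made for clarity.
import torch
import torch.nn as nn


class CrossTemporalGatingFusion(nn.Module):
    """Fuses bi-temporal feature maps with gates computed from the other epoch."""

    def __init__(self, channels: int):
        super().__init__()
        # Gates are 1x1 convolutions followed by a sigmoid, giving per-pixel weights.
        self.gate_from_t1 = nn.Sequential(nn.Conv2d(channels, channels, 1), nn.Sigmoid())
        self.gate_from_t2 = nn.Sequential(nn.Conv2d(channels, channels, 1), nn.Sigmoid())
        # Project the concatenated, gated features back down to `channels`.
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, f_t1: torch.Tensor, f_t2: torch.Tensor) -> torch.Tensor:
        # Each epoch's features are re-weighted by a gate derived from the other
        # epoch, so regions that differ between the two acquisitions are
        # emphasized before fusion.
        g_for_t1 = self.gate_from_t2(f_t2)  # gate applied to t1, computed from t2
        g_for_t2 = self.gate_from_t1(f_t1)  # gate applied to t2, computed from t1
        fused = torch.cat([f_t1 * g_for_t1, f_t2 * g_for_t2], dim=1)
        return self.fuse(fused)


if __name__ == "__main__":
    block = CrossTemporalGatingFusion(channels=64)
    x1 = torch.randn(2, 64, 32, 32)  # backbone features, first acquisition
    x2 = torch.randn(2, 64, 32, 32)  # backbone features, second acquisition
    print(block(x1, x2).shape)       # expected: torch.Size([2, 64, 32, 32])
```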
Related papers
- Wavelet-based Bi-dimensional Aggregation Network for SAR Image Change Detection [53.842568573251214]
Experimental results on three SAR datasets demonstrate that our WBANet significantly outperforms contemporary state-of-the-art methods.
Our WBANet achieves percentage of correct classification (PCC) scores of 98.33%, 96.65%, and 96.62% on the respective datasets.
arXiv Detail & Related papers (2024-07-18T04:36:10Z) - Relating CNN-Transformer Fusion Network for Change Detection [23.025190360146635]
RCTNet introduces an early fusion backbone to exploit both spatial and temporal features.
Experiments demonstrate RCTNet's clear superiority over traditional RS image CD methods.
arXiv Detail & Related papers (2024-07-03T14:58:40Z) - DDLNet: Boosting Remote Sensing Change Detection with Dual-Domain Learning [5.932234366793244]
Remote sensing change detection (RSCD) aims to identify changes of interest in a region by analyzing multi-temporal remote sensing images.
Existing RSCD methods are devoted to contextual modeling in the spatial domain to enhance the changes of interest.
We propose DDLNet, an RSCD network based on dual-domain learning (i.e., frequency and spatial domains).
arXiv Detail & Related papers (2024-06-19T14:54:09Z) - SRC-Net: Bi-Temporal Spatial Relationship Concerned Network for Change Detection [9.682463974799893]
Change detection (CD) in remote sensing imagery is a crucial task with applications in environmental monitoring, urban development, and disaster management.
We propose SRC-Net: a bi-temporal spatial relationship concerned network for CD.
arXiv Detail & Related papers (2024-06-09T06:53:39Z) - ChangeBind: A Hybrid Change Encoder for Remote Sensing Change Detection [16.62779899494721]
Change detection (CD) is a fundamental task in remote sensing (RS) that aims to detect semantic changes in the same geographical region across different time stamps.
We propose an effective Siamese-based framework to encode the semantic changes occurring in the bi-temporal RS images.
arXiv Detail & Related papers (2024-04-26T17:47:14Z) - ELGC-Net: Efficient Local-Global Context Aggregation for Remote Sensing Change Detection [65.59969454655996]
We propose an efficient change detection framework, ELGC-Net, which leverages rich contextual information to precisely estimate change regions.
Our proposed ELGC-Net sets a new state-of-the-art performance in remote sensing change detection benchmarks.
We also introduce ELGC-Net-LW, a lighter variant with significantly reduced computational complexity, suitable for resource-constrained settings.
arXiv Detail & Related papers (2024-03-26T17:46:25Z) - TransY-Net: Learning Fully Transformer Networks for Change Detection of
Remote Sensing Images [64.63004710817239]
We propose a novel Transformer-based learning framework named TransY-Net for remote sensing image CD.
It improves the feature extraction from a global view and combines multi-level visual features in a pyramid manner.
Our proposed method achieves a new state-of-the-art performance on four optical and two SAR image CD benchmarks.
arXiv Detail & Related papers (2023-10-22T07:42:19Z) - Remote Sensing Image Change Detection with Graph Interaction [1.8579693774597708]
We propose a bi-temporal image graph interaction network for remote sensing change detection, namely BGINet-CD.
Our model demonstrates superior performance compared to other state-of-the-art (SOTA) methods on the GZ CD dataset.
arXiv Detail & Related papers (2023-07-05T03:32:49Z) - MD-CSDNetwork: Multi-Domain Cross Stitched Network for Deepfake
Detection [80.83725644958633]
Current deepfake generation methods leave discriminative artifacts in the frequency spectrum of fake images and videos.
We present a novel approach, termed as MD-CSDNetwork, for combining the features in the spatial and frequency domains to mine a shared discriminative representation.
arXiv Detail & Related papers (2021-09-15T14:11:53Z) - DS-Net: Dynamic Spatiotemporal Network for Video Salient Object
Detection [78.04869214450963]
We propose a novel dynamic spatiotemporal network (DS-Net) for more effective fusion of temporal and spatial information.
We show that the proposed method outperforms state-of-the-art algorithms.
arXiv Detail & Related papers (2020-12-09T06:42:30Z) - Co-Saliency Spatio-Temporal Interaction Network for Person
Re-Identification in Videos [85.6430597108455]
We propose a novel Co-Saliency Spatio-Temporal Interaction Network (CSTNet) for person re-identification in videos.
It captures the common salient foreground regions among video frames and explores the spatial-temporal long-range context interdependency from such regions.
Multiple spatial-temporal interaction modules within CSTNet are proposed to exploit the spatial and temporal long-range context interdependencies of such features, as well as their spatial-temporal information correlation.
arXiv Detail & Related papers (2020-04-10T10:23:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.