Resampling Augmentation for Time Series Contrastive Learning: Application to Remote Sensing
- URL: http://arxiv.org/abs/2506.18587v1
- Date: Mon, 23 Jun 2025 12:48:19 GMT
- Title: Resampling Augmentation for Time Series Contrastive Learning: Application to Remote Sensing
- Authors: Antoine Saget, Baptiste Lafabregue, Antoine Cornuéjols, Pierre Gançarski
- Abstract summary: We introduce a novel resampling-based augmentation strategy that generates positive pairs by upsampling time series. We validate our approach on multiple agricultural classification benchmarks using Sentinel-2 imagery. Our method offers a simple, yet effective, contrastive learning augmentation for remote sensing time series.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Given the abundance of unlabeled Satellite Image Time Series (SITS) and the scarcity of labeled data, contrastive self-supervised pretraining emerges as a natural tool to leverage this vast quantity of unlabeled data. However, designing effective data augmentations for contrastive learning remains challenging for time series. We introduce a novel resampling-based augmentation strategy that generates positive pairs by upsampling time series and extracting disjoint subsequences while preserving temporal coverage. We validate our approach on multiple agricultural classification benchmarks using Sentinel-2 imagery, showing that it outperforms common alternatives such as jittering, resizing, and masking. Further, we achieve state-of-the-art performance on the S2-Agri100 dataset without employing spatial information or temporal encodings, surpassing more complex masking-based SSL frameworks. Our method offers a simple, yet effective, contrastive learning augmentation for remote sensing time series.
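The augmentation described in the abstract can be sketched in a few lines. The interpolation method and the random split of the upsampled grid into two disjoint views are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def resample_positive_pair(x, factor=2, rng=None):
    """Generate two positive views of a 1-D time series.

    Sketch of the resampling idea from the abstract: upsample the
    series onto a denser time grid, then partition the upsampled
    steps into two disjoint index sets, each of which still spans
    the full time range (temporal coverage is preserved in
    expectation). Linear interpolation and the random 50/50 split
    are assumptions, not the paper's exact scheme.
    """
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(len(x))
    # Upsample by linear interpolation onto a denser time grid.
    t_up = np.linspace(0, len(x) - 1, factor * len(x))
    x_up = np.interp(t_up, t, x)
    # Randomly assign each upsampled step to one of two disjoint views.
    mask = rng.random(len(x_up)) < 0.5
    view_a, view_b = x_up[mask], x_up[~mask]
    return view_a, view_b
```

In a contrastive setup (e.g., SimCLR-style training), `view_a` and `view_b` would be encoded separately and pulled together by the contrastive loss as a positive pair.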
Related papers
- FreRA: A Frequency-Refined Augmentation for Contrastive Learning on Time Series Classification [56.925103708982164]
We present a novel perspective from the frequency domain and identify three advantages for downstream classification: global, independent, and compact. We propose the lightweight yet effective Frequency Refined Augmentation (FreRA) tailored for time series contrastive learning on classification tasks. FreRA consistently outperforms ten leading baselines on time series classification, anomaly detection, and transfer learning tasks.
arXiv Detail & Related papers (2025-05-29T07:18:28Z) - Augmented Contrastive Clustering with Uncertainty-Aware Prototyping for Time Series Test Time Adaptation [28.793983148042134]
Test-time adaptation aims to adapt pre-trained deep neural networks using solely online unlabelled test data during inference. Existing TTA methods, originally designed for visual tasks, may not effectively handle the complex temporal dynamics of real-world time series data. We propose Augmented Contrastive Clustering with Uncertainty-aware Prototyping (ACCUP), a straightforward yet effective TTA method for time series data.
arXiv Detail & Related papers (2025-01-01T11:45:17Z) - Distillation Enhanced Time Series Forecasting Network with Momentum Contrastive Learning [7.4106801792345705]
We propose DE-TSMCL, an innovative distillation-enhanced framework for long-sequence time series forecasting.
Specifically, we design a learnable data augmentation mechanism which adaptively learns whether to mask a timestamp.
Then, we propose a contrastive learning task with momentum update to explore inter-sample and intra-temporal correlations of time series.
By combining losses from these multiple tasks, we learn effective representations for the downstream forecasting task.
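The summary above says the masking of individual timestamps is learned adaptively. DE-TSMCL's exact mechanism is not specified here; one standard way to make a per-timestamp keep/drop decision learnable is a Gumbel-sigmoid relaxation, sketched below as an assumption (a forward pass in numpy; in practice this would live in an autodiff framework with `logits` as trainable parameters):

```python
import numpy as np

def learnable_mask_forward(x, logits, tau=1.0, rng=None):
    """Soft per-timestamp masking via a Gumbel-sigmoid relaxation.

    Assumption: this is one common way to 'adaptively learn whether
    to mask a timestamp', not necessarily DE-TSMCL's exact mechanism.
    logits[t] is a learnable score for keeping timestamp t; tau
    controls how close the soft gate is to a hard 0/1 decision.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(1e-9, 1.0, size=logits.shape)
    # Gumbel noise makes the keep/drop decision stochastic yet
    # reparameterizable when implemented with autodiff.
    g = -np.log(-np.log(u))
    keep_prob = 1.0 / (1.0 + np.exp(-(logits + g) / tau))  # soft keep gate
    return x * keep_prob  # attenuated (softly masked) series
```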
arXiv Detail & Related papers (2024-01-31T12:52:10Z) - Graph-Aware Contrasting for Multivariate Time-Series Classification [50.84488941336865]
Existing contrastive learning methods mainly focus on achieving temporal consistency with temporal augmentation and contrasting techniques.
We propose Graph-Aware Contrasting for spatial consistency across MTS data.
Our proposed method achieves state-of-the-art performance on various MTS classification tasks.
arXiv Detail & Related papers (2023-09-11T02:35:22Z) - Large-scale Fully-Unsupervised Re-Identification [78.47108158030213]
We propose two strategies to learn from large-scale unlabeled data.
The first strategy performs local neighborhood sampling to reduce the dataset size without violating neighborhood relationships.
A second strategy leverages a novel re-ranking technique, which has a lower worst-case time complexity and reduces the memory complexity from O(n²) to O(kn) with k ≪ n.
arXiv Detail & Related papers (2023-07-26T16:19:19Z) - Unsupervised CD in satellite image time series by contrastive learning and feature tracking [15.148034487267635]
We propose a two-stage approach to unsupervised change detection in satellite image time-series using contrastive learning with feature tracking.
By deriving pseudo labels from pre-trained models and using feature tracking to propagate them among the image time-series, we improve the consistency of our pseudo labels and address the challenges of seasonal changes in long-term remote sensing image time-series.
arXiv Detail & Related papers (2023-04-22T11:19:19Z) - Time Series Contrastive Learning with Information-Aware Augmentations [57.45139904366001]
A key component of contrastive learning is to select appropriate augmentations imposing some priors to construct feasible positive samples.
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
We propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning.
arXiv Detail & Related papers (2023-03-21T15:02:50Z) - SatMAE: Pre-training Transformers for Temporal and Multi-Spectral Satellite Imagery [74.82821342249039]
We present SatMAE, a pre-training framework for temporal or multi-spectral satellite imagery based on the Masked Autoencoder (MAE).
To leverage temporal information, we include a temporal embedding along with independently masking image patches across time.
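Masking patches independently across time, as the SatMAE summary describes, can be sketched as follows; the mask ratio and shapes are illustrative assumptions:

```python
import numpy as np

def mask_patches_per_timestep(num_timesteps, num_patches, mask_ratio=0.75, rng=None):
    """MAE-style masking drawn independently at each timestep.

    Sketch of the independent masking described for SatMAE: the set
    of masked patches is sampled separately per timestep, so the
    model cannot rely on the same patch being visible across time.
    The 0.75 mask ratio is an illustrative assumption.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_keep = int(num_patches * (1 - mask_ratio))
    keep = np.zeros((num_timesteps, num_patches), dtype=bool)
    for t in range(num_timesteps):
        # A fresh random subset of visible patches for this timestep.
        idx = rng.choice(num_patches, size=n_keep, replace=False)
        keep[t, idx] = True
    return keep
```

A temporal embedding (e.g., sinusoidal features of the acquisition date) would then be added to each visible token before encoding.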
arXiv Detail & Related papers (2022-07-17T01:35:29Z) - Mixing Up Contrastive Learning: Self-Supervised Representation Learning for Time Series [22.376529167056376]
We propose an unsupervised contrastive learning framework motivated by label smoothing.
The proposed approach uses a novel contrastive loss that naturally exploits a data augmentation scheme.
Experiments demonstrate the framework's superior performance compared to other representation learning approaches.
arXiv Detail & Related papers (2022-03-17T11:49:21Z) - Semi-supervised Facial Action Unit Intensity Estimation with Contrastive Learning [54.90704746573636]
Our method does not require manually selecting key frames, and produces state-of-the-art results with as little as 2% of annotated frames.
We experimentally validate that our method outperforms existing methods when working with as little as 2% of randomly chosen data.
arXiv Detail & Related papers (2020-11-03T17:35:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.