Training general representations for remote sensing using in-domain knowledge
- URL: http://arxiv.org/abs/2010.00332v1
- Date: Wed, 30 Sep 2020 15:00:07 GMT
- Title: Training general representations for remote sensing using in-domain knowledge
- Authors: Maxim Neumann, André Susano Pinto, Xiaohua Zhai, and Neil Houlsby
- Abstract summary: This paper investigates the development of generic remote sensing representations.
It explores which characteristics are important for a dataset to be a good source for representation learning.
- Score: 23.741188128379893
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automatically finding good and general remote sensing representations enables transfer learning on a wide range of applications, improving accuracy and reducing the required number of training samples. This paper investigates the development of generic remote sensing representations and explores which characteristics make a dataset a good source for representation learning. For this analysis, five diverse remote sensing datasets are selected and used for both disjoint upstream representation learning and downstream model training and evaluation. A common evaluation protocol is used to establish baselines for these datasets that achieve state-of-the-art performance. The results indicate that, especially when few training samples are available, including additional in-domain data yields significant performance gains over training models from scratch or fine-tuning only on ImageNet (up to 11% and 40%, respectively, at 100 training samples). All datasets and pretrained representation models are published online.
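As a hedged illustration of the two-stage upstream/downstream protocol the abstract describes, the sketch below assumes a PyTorch/torchvision workflow; the loaders and class counts are synthetic placeholders, not the authors' published datasets or models.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision import models

def finetune(model, loader, num_classes, epochs=5, lr=1e-3):
    """Replace the classification head and fine-tune all weights."""
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            nn.functional.cross_entropy(model(images), labels).backward()
            opt.step()
    return model

def synthetic_loader(n, num_classes):
    # Stand-in for a real remote sensing dataset (hypothetical placeholder).
    x = torch.randn(n, 3, 224, 224)
    y = torch.randint(0, num_classes, (n,))
    return DataLoader(TensorDataset(x, y), batch_size=8, shuffle=True)

# Stage 1: start from ImageNet weights and continue pre-training on an
# in-domain remote sensing dataset (upstream representation learning).
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model = finetune(model, synthetic_loader(256, 10), num_classes=10)

# Stage 2: fine-tune the in-domain representation on a small downstream task,
# e.g. only 100 labelled samples, where the paper reports the largest gains.
model = finetune(model, synthetic_loader(100, 5), num_classes=5)
```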
Related papers
- No "Zero-Shot" Without Exponential Data: Pretraining Concept Frequency Determines Multimodal Model Performance [68.18779562801762]
Multimodal models require exponentially more data to achieve linear improvements in downstream "zero-shot" performance.
Our study reveals an exponential need for training data which implies that the key to "zero-shot" generalization capabilities under large-scale training paradigms remains to be found.
arXiv Detail & Related papers (2024-04-04T17:58:02Z)
- Rethinking Transformers Pre-training for Multi-Spectral Satellite Imagery [78.43828998065071]
Recent advances in unsupervised learning have demonstrated the ability of large vision models to achieve promising results on downstream tasks.
Such pre-training techniques have also been explored recently in the remote sensing domain due to the availability of large amounts of unlabelled data.
In this paper, we revisit transformer pre-training and leverage multi-scale information, which is utilized effectively across multiple modalities.
arXiv Detail & Related papers (2024-03-08T16:18:04Z)
- Self-Supervised In-Domain Representation Learning for Remote Sensing Image Scene Classification [1.0152838128195465]
Transferring ImageNet pre-trained weights to various remote sensing tasks has produced acceptable results.
Recent research has demonstrated that self-supervised learning methods capture visual features that are more discriminative and transferable.
We are motivated by these facts to pre-train the in-domain representations of remote sensing imagery using contrastive self-supervised learning.
arXiv Detail & Related papers (2023-02-03T15:03:07Z)
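For readers unfamiliar with the contrastive objective mentioned in the entry above, here is a minimal SimCLR-style NT-Xent loss sketch; it illustrates the general technique, not necessarily that paper's exact formulation. `z1` and `z2` would be projection-head embeddings of two augmentations of the same images.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss over two augmented views (N pairs)."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # [2N, D]
    sim = z @ z.t() / temperature                       # cosine similarities
    # Exclude each embedding's similarity with itself.
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool, device=z.device),
                     float("-inf"))
    # The positive for view i is the other augmented view of the same image.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```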
Out-of-Distribution Data [13.800680101300756]
In practice, fully trained neural networks interact randomly with out-of-distribution (OOD) inputs.
We introduce forgetful active learning with switch events (FALSE) - a novel active learning protocol for out-of-distribution active learning.
We report up to 4.5% accuracy improvements in over 270 experiments.
arXiv Detail & Related papers (2023-01-12T16:03:14Z)
- CHALLENGER: Training with Attribution Maps [63.736435657236505]
We show that utilizing attribution maps for training neural networks can improve regularization of models and thus increase performance.
In particular, we show that our generic domain-independent approach yields state-of-the-art results in vision, natural language processing and on time series tasks.
arXiv Detail & Related papers (2022-05-30T13:34:46Z)
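The entry above does not spell out its training scheme; as one generic, hedged example of using attribution maps during training, the sketch below penalizes the input-gradient attribution map (double backpropagation), a standard regularizer rather than that paper's specific method.

```python
import torch
import torch.nn.functional as F

def attribution_regularized_loss(model, x, y, lam=0.1):
    """Task loss plus a penalty on the input-gradient attribution map."""
    x = x.clone().requires_grad_(True)
    task_loss = F.cross_entropy(model(x), y)
    # Attribution map: gradient of the loss w.r.t. the input pixels, kept in
    # the graph so the penalty itself can be backpropagated (double backprop).
    attribution, = torch.autograd.grad(task_loss, x, create_graph=True)
    penalty = attribution.pow(2).flatten(1).sum(dim=1).mean()
    return task_loss + lam * penalty
```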
- Multi-Domain Joint Training for Person Re-Identification [51.73921349603597]
Deep learning-based person Re-IDentification (ReID) often requires a large amount of training data to achieve good performance.
Collecting more training data from diverse environments tends to improve ReID performance.
We propose an approach called Domain-Camera-Sample Dynamic network (DCSD) whose parameters can be adaptive to various factors.
arXiv Detail & Related papers (2022-01-06T09:20:59Z)
- Self-supervised Audiovisual Representation Learning for Remote Sensing Data [96.23611272637943]
We propose a self-supervised approach for pre-training deep neural networks in remote sensing.
This is done in a completely label-free manner by exploiting the correspondence between geo-tagged audio recordings and remote sensing imagery.
We show that our approach outperforms existing pre-training strategies for remote sensing imagery.
arXiv Detail & Related papers (2021-08-02T07:50:50Z)
- Omni-supervised Facial Expression Recognition via Distilled Data [120.11782405714234]
We propose omni-supervised learning to exploit reliable samples in a large amount of unlabeled data for network training.
To keep training on the enlarged dataset tractable, we propose to apply a dataset distillation strategy that compresses the created dataset into several informative class-wise images.
We experimentally verify that the new dataset can significantly improve the ability of the learned FER model.
arXiv Detail & Related papers (2020-05-18T09:36:51Z)
- The Utility of Feature Reuse: Transfer Learning in Data-Starved Regimes [6.419457653976053]
We describe a transfer learning use case for a domain with a data-starved regime.
We evaluate the effectiveness of convolutional feature extraction and fine-tuning.
We conclude that transfer learning enhances the performance of CNN architectures in data-starved regimes.
arXiv Detail & Related papers (2020-02-29T18:48:58Z)
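To make the distinction between the two strategies evaluated in the entry above concrete, here is a minimal sketch, assuming torchvision and an ImageNet-pretrained ResNet-50 as the backbone (the backbone choice is an assumption for illustration).

```python
import torch.nn as nn
from torchvision import models

def build_transfer_model(num_classes, freeze_backbone=True):
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
    if freeze_backbone:
        # Convolutional feature extraction: only the new head is trained.
        for p in model.parameters():
            p.requires_grad = False
    # Fresh classification head for the data-starved target task.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

frozen = build_transfer_model(num_classes=5, freeze_backbone=True)
finetuned = build_transfer_model(num_classes=5, freeze_backbone=False)
```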
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.