Bridging Remote Sensors with Multisensor Geospatial Foundation Models
- URL: http://arxiv.org/abs/2404.01260v1
- Date: Mon, 1 Apr 2024 17:30:56 GMT
- Title: Bridging Remote Sensors with Multisensor Geospatial Foundation Models
- Authors: Boran Han, Shuai Zhang, Xingjian Shi, Markus Reichstein
- Abstract summary: msGFM is a multisensor geospatial foundation model that unifies data from four key sensor modalities.
For data originating from identical geolocations, our model employs an innovative cross-sensor pretraining approach.
msGFM has demonstrated enhanced proficiency in a range of both single-sensor and multisensor downstream tasks.
- Score: 15.289711240431107
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the realm of geospatial analysis, the diversity of remote sensors, encompassing both optical and microwave technologies, offers a wealth of distinct observational capabilities. Recognizing this, we present msGFM, a multisensor geospatial foundation model that effectively unifies data from four key sensor modalities. This integration spans an expansive dataset of two million multisensor images. msGFM is uniquely adept at handling both paired and unpaired sensor data. For data originating from identical geolocations, our model employs an innovative cross-sensor pretraining approach in masked image modeling, enabling the synthesis of joint representations from diverse sensors. msGFM, incorporating four remote sensors, upholds strong performance, forming a comprehensive model adaptable to various sensor types. msGFM has demonstrated enhanced proficiency in a range of both single-sensor and multisensor downstream tasks. These include scene classification, segmentation, cloud removal, and pan-sharpening. A key discovery of our research is that representations derived from natural images are not always compatible with the distinct characteristics of geospatial remote sensors, underscoring the limitations of existing representations in this field. Our work can serve as a guide for developing multisensor geospatial pretraining models, paving the way for more advanced geospatial capabilities.
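To make the cross-sensor pretraining idea concrete, the sketch below shows a minimal masked-image-modeling setup in which visible patches from two co-located sensors are embedded with sensor-specific stems, encoded jointly, and the masked patches of one sensor are reconstructed from the shared latent. All module names, dimensions, and the masking scheme are illustrative assumptions, not the released msGFM implementation.
```python
# A minimal sketch (not the msGFM code) of cross-sensor masked image modeling:
# patches from two co-located sensors are embedded with sensor-specific stems,
# visible patches are encoded jointly, and masked patches of sensor B are
# reconstructed from the shared latent representation.
import torch
import torch.nn as nn
import torch.nn.functional as F


def _gather(t, idx):
    # Select rows of a (B, N, C) tensor along dim 1 with per-sample indices.
    return t.gather(1, idx.unsqueeze(-1).expand(-1, -1, t.shape[-1]))


class CrossSensorMIM(nn.Module):
    def __init__(self, img_size=128, patch=16, chans_a=3, chans_b=2,
                 dim=256, depth=4, heads=8):
        super().__init__()
        self.patch = patch
        num_patches = (img_size // patch) ** 2
        # Sensor-specific patch embeddings (e.g., optical vs. SAR channels).
        self.embed_a = nn.Conv2d(chans_a, dim, kernel_size=patch, stride=patch)
        self.embed_b = nn.Conv2d(chans_b, dim, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(torch.zeros(1, num_patches, dim))
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
        enc = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, depth)
        dec = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.decoder = nn.TransformerEncoder(dec, 1)
        # Predict raw pixel values of masked sensor-B patches.
        self.head_b = nn.Linear(dim, patch * patch * chans_b)

    def forward(self, img_a, img_b, mask_ratio=0.75):
        tok_a = self.embed_a(img_a).flatten(2).transpose(1, 2) + self.pos
        tok_b = self.embed_b(img_b).flatten(2).transpose(1, 2) + self.pos
        B, N, D = tok_a.shape
        n_keep = int(N * (1 - mask_ratio))
        order = torch.rand(B, N, device=tok_a.device).argsort(dim=1)
        keep, drop = order[:, :n_keep], order[:, n_keep:]
        # Jointly encode the visible patches of both sensors.
        latent = self.encoder(torch.cat([_gather(tok_a, keep),
                                         _gather(tok_b, keep)], dim=1))
        # Append position-aware mask tokens and decode.
        masked = self.mask_token + _gather(self.pos.expand(B, -1, -1), drop)
        dec_out = self.decoder(torch.cat([latent, masked], dim=1))
        pred = self.head_b(dec_out[:, -(N - n_keep):])
        # Reconstruction target: the masked patches of sensor B, flattened.
        p = self.patch
        target = img_b.unfold(2, p, p).unfold(3, p, p)          # B,C,h,w,p,p
        target = target.permute(0, 2, 3, 1, 4, 5).reshape(B, N, -1)
        return F.mse_loss(pred, _gather(target, drop))


if __name__ == "__main__":
    model = CrossSensorMIM()
    optical = torch.randn(2, 3, 128, 128)   # e.g. RGB reflectance
    sar = torch.randn(2, 2, 128, 128)       # e.g. VV/VH backscatter
    loss = model(optical, sar)
    loss.backward()
```
In a paired setting the same idea extends to reconstructing either sensor (or both) from a mixed set of visible patches; only sensor B is reconstructed here to keep the sketch short.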
Related papers
- MSSIDD: A Benchmark for Multi-Sensor Denoising [55.41612200877861]
We introduce a new benchmark, the Multi-Sensor SIDD dataset, which is the first raw-domain dataset designed to evaluate the sensor transferability of denoising models.
We propose a sensor consistency training framework that enables denoising models to learn the sensor-invariant features.
arXiv Detail & Related papers (2024-11-18T13:32:59Z)
- SenPa-MAE: Sensor Parameter Aware Masked Autoencoder for Multi-Satellite Self-Supervised Pretraining [1.4528189330418977]
SenPa-MAE encodes the sensor parameters of an observed multispectral signal into the image embeddings.
SenPa-MAE can be pre-trained on imagery of different satellites with non-matching spectral or geometrical sensor characteristics.
arXiv Detail & Related papers (2024-08-20T16:53:30Z)
- Increasing the Robustness of Model Predictions to Missing Sensors in Earth Observation [5.143097874851516]
We study two novel methods tailored for multi-sensor scenarios, namely Input Sensor Dropout (ISensD) and Ensemble Sensor Invariant (ESensI).
We demonstrate that these methods effectively increase the robustness of model predictions to missing sensors.
We observe that ensemble multi-sensor models are the most robust to the lack of sensors; a minimal sketch of the sensor-dropout idea appears after this list.
arXiv Detail & Related papers (2024-07-22T09:58:29Z)
- Neural Plasticity-Inspired Multimodal Foundation Model for Earth Observation [48.66623377464203]
Our novel approach introduces the Dynamic One-For-All (DOFA) model, leveraging the concept of neural plasticity in brain science.
This dynamic hypernetwork, adjusting to different wavelengths, enables a single versatile Transformer jointly trained on data from five sensors to excel across 12 distinct Earth observation tasks.
arXiv Detail & Related papers (2024-03-22T17:11:47Z)
- Data-Induced Interactions of Sparse Sensors [3.050919759387984]
We take a thermodynamic view to compute the full landscape of sensor interactions induced by the training data.
Mapping out these data-induced sensor interactions allows combining them with external selection criteria and anticipating sensor replacement impacts.
arXiv Detail & Related papers (2023-07-21T18:13:37Z)
- Learning Online Multi-Sensor Depth Fusion [100.84519175539378]
SenFuNet is a depth fusion approach that learns sensor-specific noise and outlier statistics.
We conduct experiments with various sensor combinations on the real-world CoRBS and Scene3D datasets.
arXiv Detail & Related papers (2022-04-07T10:45:32Z)
- Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition [131.6328804788164]
We propose a framework, named Semantics-aware Adaptive Knowledge Distillation Networks (SAKDN), to enhance action recognition in the vision-sensor modality (videos).
The SAKDN uses multiple wearable-sensors as teacher modalities and uses RGB videos as student modality.
arXiv Detail & Related papers (2020-09-01T03:38:31Z)
- Deep Soft Procrustes for Markerless Volumetric Sensor Alignment [81.13055566952221]
In this work, we improve markerless data-driven correspondence estimation to achieve more robust multi-sensor spatial alignment.
We incorporate geometric constraints in an end-to-end manner into a typical segmentation-based model and bridge the intermediate dense classification task with the targeted pose estimation one.
Our model is experimentally shown to achieve results comparable to marker-based methods and to outperform markerless ones, while also being robust to pose variations of the calibration structure.
arXiv Detail & Related papers (2020-03-23T10:51:32Z)
- Learning Selective Sensor Fusion for States Estimation [47.76590539558037]
We propose SelectFusion, an end-to-end selective sensor fusion module.
During prediction, the network is able to assess the reliability of the latent features from different sensor modalities.
We extensively evaluate all fusion strategies on both public datasets and progressively degraded datasets.
arXiv Detail & Related papers (2019-12-30T20:25:16Z)
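As referenced in the "Increasing the Robustness of Model Predictions to Missing Sensors" entry above, the sketch below illustrates the general input-sensor-dropout idea: entire sensor streams are randomly zeroed during training so that the fused model tolerates missing sensors at inference. The encoder design, names, and parameters are assumptions for illustration, not the ISensD/ESensI implementation from that paper.
```python
# Minimal sketch (not the paper's code) of input sensor dropout for fusion:
# whole sensor streams are randomly dropped during training so the fused
# model learns predictions that degrade gracefully when sensors are missing.
import torch
import torch.nn as nn


class SensorDropoutFusion(nn.Module):
    def __init__(self, sensor_channels=(3, 2), hidden=64, num_classes=10, p_drop=0.3):
        super().__init__()
        self.p_drop = p_drop
        # One small convolutional encoder per sensor (channel counts differ).
        self.encoders = nn.ModuleList([
            nn.Sequential(nn.Conv2d(c, hidden, 3, padding=1), nn.ReLU(),
                          nn.AdaptiveAvgPool2d(1), nn.Flatten())
            for c in sensor_channels
        ])
        self.classifier = nn.Linear(hidden * len(sensor_channels), num_classes)

    def forward(self, inputs):
        feats = []
        for enc, x in zip(self.encoders, inputs):
            f = enc(x)
            # During training, zero this sensor's features with probability
            # p_drop, simulating a sensor that is unavailable at test time.
            if self.training and torch.rand(1).item() < self.p_drop:
                f = torch.zeros_like(f)
            feats.append(f)
        return self.classifier(torch.cat(feats, dim=1))


if __name__ == "__main__":
    model = SensorDropoutFusion(sensor_channels=(3, 2))  # e.g. optical + SAR
    logits = model([torch.randn(4, 3, 64, 64), torch.randn(4, 2, 64, 64)])
    print(logits.shape)  # torch.Size([4, 10]); a missing sensor can be fed as zeros
```
A simple step toward the ensemble variant is to train several such models with different drop patterns and average their predictions.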