Identifying every building's function in large-scale urban areas with multi-modality remote-sensing data
- URL: http://arxiv.org/abs/2405.05133v1
- Date: Wed, 8 May 2024 15:32:20 GMT
- Title: Identifying every building's function in large-scale urban areas with multi-modality remote-sensing data
- Authors: Zhuohong Li, Wei He, Jiepan Li, Hongyan Zhang,
- Abstract summary: This study proposes a semi-supervised framework to identify every building's function in large-scale urban areas.
Optical images, building height, and nighttime-light data are collected to describe the morphological attributes of buildings.
Results are evaluated by 20,000 validation points and statistical survey reports from the government.
- Score: 5.18540804614798
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Buildings, as fundamental man-made structures in urban environments, serve as crucial indicators for understanding various city function zones. Rapid urbanization has raised an urgent need for efficiently surveying building footprints and functions. In this study, we propose a semi-supervised framework to identify every building's function in large-scale urban areas with multi-modality remote-sensing data. Specifically, optical images, building height, and nighttime-light data are collected to describe the morphological attributes of buildings. Then, the area of interest (AOI) and building masks from volunteered geographic information (VGI) data are collected to form sparsely labeled samples. Furthermore, the multi-modality data and weak labels are utilized to train a segmentation model with a semi-supervised strategy. Finally, results are evaluated against 20,000 validation points and statistical survey reports from the government. The evaluations reveal that the produced function maps achieve an OA of 82% and a Kappa of 71% across 1,616,796 buildings in Shanghai, China. This study has the potential to support large-scale urban management and sustainable urban development. All collected data and produced maps are open access at https://github.com/LiZhuoHong/BuildingMap.
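The evaluation above reports overall accuracy (OA) and Cohen's Kappa over the 20,000 validation points. A minimal sketch of how these two metrics are typically computed from point-wise true and predicted class labels (the function name and inputs are illustrative, not taken from the paper's code):

```python
import numpy as np

def overall_accuracy_and_kappa(y_true, y_pred, n_classes):
    """Compute overall accuracy (OA) and Cohen's Kappa from paired labels."""
    # Build the confusion matrix: rows = true class, columns = predicted class.
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    total = cm.sum()
    # Observed agreement (this is the overall accuracy).
    po = np.trace(cm) / total
    # Chance agreement expected from the row/column marginals.
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2
    kappa = (po - pe) / (1 - pe)
    return po, kappa
```

Kappa corrects OA for agreement expected by chance, which is why it is reported alongside OA when class frequencies are imbalanced, as building-function categories typically are.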
Related papers
- GBSS: a global building semantic segmentation dataset for large-scale
remote sensing building extraction [10.39943244036649]
We construct a Global Building Semantic Segmentation (GBSS) dataset (to be released), which comprises 116.9k pairs of samples (about 742k buildings) from six continents.
The building samples vary significantly in size and style, so the dataset can serve as a more challenging benchmark for evaluating the generalization and robustness of building semantic segmentation models.
arXiv Detail & Related papers (2024-01-02T12:13:35Z) - City Foundation Models for Learning General Purpose Representations from
OpenStreetMap [17.577683270277173]
We present CityFM, a framework to train a foundation model within a selected geographical area of interest, such as a city.
CityFM relies solely on open data from OpenStreetMap, and produces multimodal representations of entities of different types, spatial, visual, and textual information.
In all the experiments, CityFM achieves performance superior to, or on par with, the baselines.
arXiv Detail & Related papers (2023-10-01T05:55:30Z) - Cross-City Matters: A Multimodal Remote Sensing Benchmark Dataset for
Cross-City Semantic Segmentation using High-Resolution Domain Adaptation
Networks [82.82866901799565]
We build a new set of multimodal remote sensing benchmark datasets (including hyperspectral, multispectral, SAR) for the study purpose of the cross-city semantic segmentation task.
Beyond the single city, we propose a high-resolution domain adaptation network, HighDAN, to improve the AI model's generalization ability across multi-city environments.
HighDAN retains the spatial topological structure of the studied urban scene through a parallel high-to-low resolution fusion scheme.
arXiv Detail & Related papers (2023-09-26T23:55:39Z) - Unified Data Management and Comprehensive Performance Evaluation for
Urban Spatial-Temporal Prediction [Experiment, Analysis & Benchmark] [78.05103666987655]
This work addresses challenges in accessing and utilizing diverse urban spatial-temporal datasets.
We introduce atomic files, a unified storage format designed for urban spatial-temporal big data, and validate its effectiveness on 40 diverse datasets.
We conduct extensive experiments using diverse models and datasets, establishing a performance leaderboard and identifying promising research directions.
arXiv Detail & Related papers (2023-08-24T16:20:00Z) - Semi-supervised Learning from Street-View Images and OpenStreetMap for
Automatic Building Height Estimation [59.6553058160943]
We propose a semi-supervised learning (SSL) method of automatically estimating building height from Mapillary SVI and OpenStreetMap data.
The proposed method yields a clear performance boost in estimating building heights, with a Mean Absolute Error (MAE) of around 2.1 meters.
The preliminary result is promising and motivates our future work in scaling up the proposed method based on low-cost VGI data.
arXiv Detail & Related papers (2023-07-05T18:16:30Z) - Building Floorspace in China: A Dataset and Learning Pipeline [0.32228025627337864]
This paper provides a first milestone in measuring the floorspace of buildings in 40 major Chinese cities.
We use Sentinel-1 and -2 satellite images as our main data source.
We provide a detailed description of our data, algorithms, and evaluations.
arXiv Detail & Related papers (2023-03-03T21:45:36Z) - Building Coverage Estimation with Low-resolution Remote Sensing Imagery [65.95520230761544]
We propose a method for estimating building coverage using only publicly available low-resolution satellite imagery.
Our model achieves a coefficient of determination as high as 0.968 on predicting building coverage in regions of different levels of development around the world.
arXiv Detail & Related papers (2023-01-04T05:19:33Z) - Mapping Vulnerable Populations with AI [23.732584273099054]
Building functions are retrieved by parsing social media data, such as tweets, as well as ground-based imagery.
Building maps augmented with those additional attributes make it possible to derive more accurate population density maps.
arXiv Detail & Related papers (2021-07-29T15:52:11Z) - Open government geospatial data on buildings for planning sustainable
and resilient cities [0.0]
We conduct a global study of 2D geospatial data on buildings that are released by governments for free access.
We benchmark more than 140 releases from 28 countries, containing more than 100 million buildings, along five dimensions: accessibility, richness, data quality, harmonisation, and relationships with other actors.
We find that much of the building data released by governments is valuable for spatial analyses, but there are large disparities among releases, and not all are of high quality, harmonised, or rich in descriptive information.
arXiv Detail & Related papers (2021-06-28T17:13:04Z) - Methodological Foundation of a Numerical Taxonomy of Urban Form [62.997667081978825]
We present a method for numerical taxonomy of urban form derived from biological systematics.
We derive homogeneous urban tissue types and, by determining overall morphological similarity between them, generate a hierarchical classification of urban form.
After framing and presenting the method, we test it on two cities - Prague and Amsterdam.
arXiv Detail & Related papers (2021-04-30T12:47:52Z) - CNN-based Density Estimation and Crowd Counting: A Survey [65.06491415951193]
This paper comprehensively reviews crowd counting models, mainly CNN-based density map estimation methods.
According to the evaluation metrics, we select the top three performers on their crowd counting datasets.
We expect to make reasonable inference and prediction for the future development of crowd counting.
arXiv Detail & Related papers (2020-03-28T13:17:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.