Geometry-Guided Street-View Panorama Synthesis from Satellite Imagery
- URL: http://arxiv.org/abs/2103.01623v1
- Date: Tue, 2 Mar 2021 10:27:05 GMT
- Title: Geometry-Guided Street-View Panorama Synthesis from Satellite Imagery
- Authors: Yujiao Shi, Dylan Campbell, Xin Yu, Hongdong Li
- Abstract summary: We present a new approach for synthesizing a novel street-view panorama given an overhead satellite image.
Our method generates a Google-style omnidirectional street-view panorama, as if it were captured from the geographical location at the center of the satellite patch.
- Score: 80.6282101835164
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: This paper presents a new approach for synthesizing a novel street-view
panorama given an overhead satellite image. Taking a small satellite image
patch as input, our method generates a Google-style omnidirectional street-view
panorama, as if it were captured from the geographical location at the center of
the satellite patch. Existing works tackle this task as an image generation
problem, adopting generative adversarial networks to implicitly learn the
cross-view transformations while ignoring the domain relevance. In
this paper, we propose to explicitly establish the geometric correspondences
between the two-view images so as to facilitate the cross-view transformation
learning. Specifically, we observe that when a 3D point in the real world is
visible in both views, there is a deterministic mapping between the projected
points in the two-view images given the height information of this 3D point.
Motivated by this, we develop a novel Satellite to Street-view image Projection
(S2SP) module which explicitly establishes such geometric correspondences and
projects the satellite images to the street viewpoint. With these projected
satellite images as network input, we next employ a generator to synthesize
realistic street-view panoramas that are geometrically consistent with the
satellite images. Our S2SP module is differentiable and the whole framework is
trained in an end-to-end manner. Extensive experimental results on two
cross-view benchmark datasets demonstrate that our method generates images that
better respect the scene geometry than existing approaches.
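To make the geometric correspondence described above concrete, here is a minimal illustrative sketch (not the authors' released S2SP implementation) of how a satellite-to-street projection grid can be built: each street-view panorama pixel defines a viewing ray, and, given an assumed height for the 3D point along that ray, the ray intersects a horizontal surface at a unique ground position, which maps to a satellite pixel. The function name, the camera height, the ground sampling distance, and the flat-ground default below are illustrative assumptions; in the paper the height information is handled by the learned, differentiable S2SP module.

```python
import numpy as np

def s2sp_sample_grid(pano_h, pano_w, sat_size, gsd, height_map=None, cam_height=1.6):
    """Illustrative satellite-to-street projection grid (hypothetical helper,
    not the paper's code). For every panorama pixel, return the satellite-image
    pixel that observes the same 3D point under an assumed per-pixel height.

    pano_h, pano_w : street-view panorama resolution
    sat_size       : side length of the square satellite patch, in pixels
    gsd            : ground sampling distance of the satellite image, metres/pixel
    height_map     : assumed scene height (metres) per panorama pixel; flat ground if None
    cam_height     : assumed street-view camera height above the ground, metres
    """
    if height_map is None:
        height_map = np.zeros((pano_h, pano_w))  # flat-ground assumption

    # Equirectangular panorama: column -> azimuth, row -> elevation.
    v, u = np.meshgrid(np.arange(pano_h), np.arange(pano_w), indexing="ij")
    azimuth = 2.0 * np.pi * (u + 0.5) / pano_w              # 0 .. 2*pi around the camera
    elevation = np.pi / 2.0 - np.pi * (v + 0.5) / pano_h    # +pi/2 (zenith) .. -pi/2 (nadir)

    # Vertical offset from the camera to the assumed 3D point, and the horizontal
    # range at which a ray with this elevation reaches that offset.
    dz = height_map - cam_height
    with np.errstate(divide="ignore", invalid="ignore"):
        r = dz / np.tan(elevation)
    r = np.where(np.isfinite(r) & (r > 0), r, np.nan)       # NaN where the ray misses the surface

    # Horizontal displacement in metres (x east, y north), then satellite pixel
    # coordinates, with the panorama location at the centre of the satellite patch.
    x = r * np.sin(azimuth)
    y = r * np.cos(azimuth)
    cx = cy = (sat_size - 1) / 2.0
    cols = cx + x / gsd
    rows = cy - y / gsd                                      # image rows increase southwards
    return rows, cols

# Example: a 256x512 panorama from a 256x256 satellite patch at 0.2 m/pixel, flat ground.
rows, cols = s2sp_sample_grid(pano_h=256, pano_w=512, sat_size=256, gsd=0.2)
```

Resampling the satellite patch at the returned (rows, cols) coordinates, e.g. with bilinear interpolation, would yield a projected satellite image of the kind the abstract describes as the generator's input.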
Related papers
- CrossViewDiff: A Cross-View Diffusion Model for Satellite-to-Street View Synthesis [54.852701978617056]
CrossViewDiff is a cross-view diffusion model for satellite-to-street view synthesis.
To address the challenges posed by the large discrepancy across views, we design the satellite scene structure estimation and cross-view texture mapping modules.
To achieve a more comprehensive evaluation of the synthesis results, we additionally design a GPT-based scoring method.
arXiv Detail & Related papers (2024-08-27T03:41:44Z) - Sat2Scene: 3D Urban Scene Generation from Satellite Images with Diffusion [77.34078223594686]
We propose a novel architecture for direct 3D scene generation by introducing diffusion models into 3D sparse representations and combining them with neural rendering techniques.
Specifically, our approach generates texture colors at the point level for a given geometry using a 3D diffusion model first, which is then transformed into a scene representation in a feed-forward manner.
Experiments on two city-scale datasets show that our model generates photo-realistic street-view image sequences and cross-view urban scenes from satellite imagery.
arXiv Detail & Related papers (2024-01-19T16:15:37Z) - Sat2Density: Faithful Density Learning from Satellite-Ground Image Pairs [32.4349978810128]
This paper aims to develop an accurate 3D geometry representation of satellite images using satellite-ground image pairs.
We draw inspiration from the density field representation used in volumetric neural rendering and propose a new approach, called Sat2Density.
Our method utilizes the properties of ground-view panoramas for the sky and non-sky regions to learn faithful density fields of 3D scenes from a geometric perspective.
arXiv Detail & Related papers (2023-03-26T10:15:33Z) - Satellite Image Based Cross-view Localization for Autonomous Vehicle [59.72040418584396]
This paper shows that by using an off-the-shelf high-definition satellite image as a ready-to-use map, we are able to achieve cross-view vehicle localization up to a satisfactory accuracy.
Our method is validated on KITTI and Ford Multi-AV Seasonal datasets as ground view and Google Maps as the satellite view.
arXiv Detail & Related papers (2022-07-27T13:16:39Z) - Geo-Localization via Ground-to-Satellite Cross-View Image Retrieval [25.93015219830576]
Given a ground-view image of a landmark, we aim to achieve cross-view geo-localization by searching out its corresponding satellite-view images.
We take advantage of drone-view information as a bridge between ground-view and satellite-view domains.
arXiv Detail & Related papers (2022-05-22T17:35:13Z) - Cross-View Panorama Image Synthesis [68.35351563852335]
PanoGAN is a novel adversarial feedback GAN framework for cross-view panorama image synthesis.
PanoGAN enables high-quality panorama image generation with more convincing details than state-of-the-art approaches.
arXiv Detail & Related papers (2022-03-22T15:59:44Z) - Coming Down to Earth: Satellite-to-Street View Synthesis for
Geo-Localization [9.333087475006003]
Cross-view image based geo-localization is notoriously challenging due to drastic viewpoint and appearance differences between the two domains.
We show that we can address this discrepancy explicitly by learning to synthesize realistic street views from satellite inputs.
We propose a novel multi-task architecture in which image synthesis and retrieval are considered jointly.
arXiv Detail & Related papers (2021-03-11T17:40:59Z) - Street-view Panoramic Video Synthesis from a Single Satellite Image [92.26826861266784]
We present a novel method for synthesizing both temporally and geometrically consistent street-view panoramic video.
Existing cross-view synthesis approaches focus on images, while video synthesis in this setting has not yet received enough attention.
arXiv Detail & Related papers (2020-12-11T20:22:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.