CityGen: Infinite and Controllable 3D City Layout Generation
- URL: http://arxiv.org/abs/2312.01508v1
- Date: Sun, 3 Dec 2023 21:16:37 GMT
- Title: CityGen: Infinite and Controllable 3D City Layout Generation
- Authors: Jie Deng, Wenhao Chai, Jianshu Guo, Qixuan Huang, Wenhao Hu, Jenq-Neng Hwang, Gaoang Wang
- Abstract summary: CityGen is a novel end-to-end framework for infinite, diverse and controllable 3D city layout generation.
CityGen achieves state-of-the-art (SOTA) performance under FID and KID in generating an infinite and controllable 3D city layout.
- Score: 26.1563802843242
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: City layout generation has recently gained significant attention. The goal of
this task is to automatically generate the layout of a city scene, including
elements such as roads, buildings, vegetation, and other urban
infrastructure. Previous methods using VAEs or GANs for 3D city layout
generation offer limited diversity and constrained interactivity, only allowing
users to selectively regenerate parts of the layout, which greatly limits
customization. In this paper, we propose CityGen, a novel end-to-end framework
for infinite, diverse and controllable 3D city layout generation. First, we
propose an outpainting pipeline to extend the local layout to an infinite city
layout. Then, we utilize a multi-scale diffusion model to generate diverse and
controllable local semantic layout patches. Extensive experiments show that
CityGen achieves state-of-the-art (SOTA) performance under FID and KID in
generating an infinite and controllable 3D city layout. CityGen demonstrates
promising applicability in fields like smart cities, urban planning, and
digital simulation.
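To make the two-stage pipeline in the abstract concrete, here is a minimal sketch of how a sliding-window outpainting loop over semantic layout patches could be wired up. Everything in it is an illustrative assumption: the `denoise_step` placeholder, the patch and stride sizes, and the RePaint-style known-region masking stand in for CityGen's actual multi-scale diffusion model.

```python
# Sketch: extend a local semantic layout rightward, patch by patch.
# A real system would call a trained diffusion model; denoise_step
# below is a trivial placeholder so the loop runs end to end.
import torch

PATCH, STRIDE, STEPS = 64, 32, 50  # hypothetical sizes

def denoise_step(x_t, t, known, mask):
    """One placeholder reverse-diffusion step that re-imposes the
    already-generated overlap (RePaint-style inpainting conditioning)."""
    x_pred = x_t - x_t / (t + 1)               # toy "denoising" update
    return mask * known + (1 - mask) * x_pred  # keep known pixels fixed

def outpaint_column(canvas, col):
    """Generate the window starting at `col`, conditioned on the
    STRIDE-wide overlap with content already on the canvas."""
    window = canvas[..., col:col + PATCH].clone()
    mask = torch.zeros_like(window)
    mask[..., :STRIDE] = 1.0                   # left overlap is known
    x = torch.randn_like(window)
    for t in reversed(range(STEPS)):
        x = denoise_step(x, t, window, mask)
    canvas[..., col:col + PATCH] = x
    return canvas

# Logit canvas over 8 semantic classes; only the first patch is "real".
canvas = torch.zeros(1, 8, PATCH, PATCH * 4)
canvas[..., :PATCH] = torch.randn(1, 8, PATCH, PATCH)
for col in range(STRIDE, canvas.shape[-1] - PATCH + 1, STRIDE):
    canvas = outpaint_column(canvas, col)
```

Because each step conditions only on the overlap with previously generated content, the same loop can keep growing the canvas indefinitely, which is what turns a local patch generator into an infinite layout generator.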
Related papers
- Proc-GS: Procedural Building Generation for City Assembly with 3D Gaussians [65.09942210464747]
Building asset creation is labor-intensive and requires specialized skills to develop design rules.
Recent generative models for building creation often overlook these patterns, leading to low visual fidelity and limited scalability.
By manipulating procedural code, we can streamline this process and generate an infinite variety of buildings.
arXiv Detail & Related papers (2024-12-10T16:45:32Z)
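Since the Proc-GS summary above hinges on manipulating procedural code, the toy rule set below shows the general flavor of procedural building massing: stack box tiers and apply random setbacks. It illustrates only the generic PCG idea; the function names, the 3 m floor height, and the setback range are invented here, and Proc-GS itself pairs such rules with 3D Gaussian assets.

```python
# Toy procedural "grammar": stack box masses with random setbacks.
import random

def generate_building(footprint=(20.0, 12.0), floors=8, seed=0):
    rng = random.Random(seed)
    w, d = footprint
    z, masses = 0.0, []
    while floors > 0:
        tier = rng.randint(1, floors)      # floors consumed by this tier
        h = tier * 3.0                     # assumed 3 m per floor
        masses.append({"w": w, "d": d, "z0": z, "h": h})
        w *= rng.uniform(0.6, 0.9)         # setback shrinks the next tier
        d *= rng.uniform(0.6, 0.9)
        z += h
        floors -= tier
    return masses

for mass in generate_building(seed=42):
    print(mass)
```

Varying the seed or the rules yields an unbounded family of buildings, which is the scalability argument such procedural approaches make.
- CityX: Controllable Procedural Content Generation for Unbounded 3D Cities [50.10101235281943]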
Current generative methods fall short in either diversity, controllability, or fidelity.
In this work, we resort to the procedural content generation (PCG) technique for high-fidelity generation.
We develop a multi-agent framework to transform multi-modal instructions, including OSM, semantic maps, and satellite images, into executable programs.
Our method, named CityX, demonstrates its superiority in creating diverse, controllable, and realistic 3D urban scenes.
arXiv Detail & Related papers (2024-07-24T18:05:13Z)
- Streetscapes: Large-scale Consistent Street View Generation Using Autoregressive Video Diffusion [61.929653153389964]
We present a method for generating Streetscapes, long sequences of views through an on-the-fly synthesized city-scale scene.
Our method can scale to much longer-range camera trajectories, spanning several city blocks, while maintaining visual quality and consistency.
arXiv Detail & Related papers (2024-07-18T17:56:30Z)
- COHO: Context-Sensitive City-Scale Hierarchical Urban Layout Generation [1.5745692520785073]
We introduce a novel graph-based masked autoencoder (GMAE) for city-scale urban layout generation.
The method encodes attributed buildings, city blocks, communities and cities into a unified graph structure.
Our approach achieves good realism, semantic consistency, and correctness across the heterogeneous urban styles in 330 US cities.
arXiv Detail & Related papers (2024-07-16T00:49:53Z)
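As a hedged sketch of the graph-masked-autoencoder idea behind COHO, the miniature below masks the attribute vectors of some building nodes and asks a tiny message-passing network to reconstruct them. The layer sizes, the single round of mean aggregation, and the 40% mask ratio are assumptions for illustration, not the paper's architecture.

```python
# Miniature graph masked autoencoder over a building-attribute graph.
import torch
import torch.nn as nn

class TinyGMAE(nn.Module):
    def __init__(self, dim=16, hidden=64):
        super().__init__()
        self.mask_token = nn.Parameter(torch.zeros(dim))
        self.enc = nn.Linear(dim, hidden)
        self.msg = nn.Linear(hidden, hidden)
        self.dec = nn.Linear(hidden, dim)

    def forward(self, x, adj, mask):
        # Masked buildings are replaced by a learned placeholder token.
        x = torch.where(mask[:, None], self.mask_token.expand_as(x), x)
        h = torch.relu(self.enc(x))
        # One round of mean-aggregated message passing over block edges.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = torch.relu(self.msg(adj @ h / deg)) + h
        return self.dec(h)

n, dim = 12, 16
x = torch.randn(n, dim)                    # per-building attribute vectors
adj = (torch.rand(n, n) < 0.3).float()     # toy city-block adjacency
mask = torch.rand(n) < 0.4                 # hide ~40% of the buildings
recon = TinyGMAE(dim)(x, adj, mask)
loss = ((recon - x)[mask] ** 2).mean()     # reconstruct only masked nodes
loss.backward()
```

- CityCraft: A Real Crafter for 3D City Generation [25.7885801163556]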
CityCraft is an innovative framework designed to enhance both the diversity and quality of urban scene generation.
Our approach integrates three key stages: initially, a diffusion transformer (DiT) model is deployed to generate diverse and controllable 2D city layouts.
Based on the generated layout and city plan, we utilize an asset retrieval module and Blender for precise asset placement and scene construction.
arXiv Detail & Related papers (2024-06-07T14:49:00Z)
- Urban Architect: Steerable 3D Urban Scene Generation with Layout Prior [43.14168074750301]
We introduce a compositional 3D layout representation into text-to-3D paradigm, serving as an additional prior.
It comprises a set of semantic primitives with simple geometric structures and explicit arrangement relationships.
We also present various scene editing demonstrations, showing the power of steerable urban scene generation.
arXiv Detail & Related papers (2024-04-10T06:41:30Z)
- GALA3D: Towards Text-to-3D Complex Scene Generation via Layout-guided Generative Gaussian Splatting [52.150502668874495]
We present GALA3D, generative 3D GAussians with LAyout-guided control, for effective compositional text-to-3D generation.
GALA3D is a user-friendly, end-to-end framework for state-of-the-art scene-level 3D content generation and controllable editing.
arXiv Detail & Related papers (2024-02-11T13:40:08Z)
- CityDreamer: Compositional Generative Model of Unbounded 3D Cities [44.203932215464214]
CityDreamer is a compositional generative model designed specifically for unbounded 3D cities.
We adopt a bird's-eye-view scene representation and employ a volumetric renderer for both instance-oriented and stuff-oriented neural fields.
CityDreamer achieves state-of-the-art performance not only in generating realistic 3D cities but also in localized editing within the generated cities.
arXiv Detail & Related papers (2023-09-01T17:57:02Z)
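The volumetric renderer mentioned above composites densities and colors sampled along each camera ray into pixels; the snippet below is the generic NeRF-style quadrature, shown only to make that step concrete rather than to reproduce CityDreamer's exact renderer.

```python
# Generic volume rendering: alpha-composite samples along one ray.
import torch

def composite(sigma, rgb, delta):
    """sigma: (S,) densities, rgb: (S, 3) colors, delta: (S,) step sizes."""
    alpha = 1.0 - torch.exp(-sigma * delta)                # per-sample opacity
    trans = torch.cumprod(
        torch.cat([torch.ones(1), 1.0 - alpha + 1e-10]), dim=0)[:-1]
    weights = alpha * trans                                # visibility weights
    return (weights[:, None] * rgb).sum(dim=0)             # final pixel color

S = 64  # samples per ray (illustrative)
pixel = composite(torch.rand(S), torch.rand(S, 3), torch.full((S,), 0.05))
```

- GlobalMapper: Arbitrary-Shaped Urban Layout Generation [1.5076964620370268]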
A building layout consists of a set of buildings in city blocks defined by a network of roads.
We propose a fully automatic approach to building layout generation using graph attention networks.
Our results, including user study, demonstrate superior performance as compared to prior layout generation networks.
arXiv Detail & Related papers (2023-07-19T00:36:05Z)
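To make the graph-attention building block in the GlobalMapper summary concrete, here is a single attention layer over a block's building graph. The feature width, the toy adjacency, and the scaled dot-product form are illustrative assumptions; the paper's full generator is substantially richer.

```python
# One graph-attention layer: buildings attend to block neighbors only.
import torch
import torch.nn as nn

class GraphAttention(nn.Module):
    def __init__(self, dim=32):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # Scaled dot-product scores, restricted to edges of the graph.
        scores = self.q(h) @ self.k(h).T / h.shape[-1] ** 0.5
        scores = scores.masked_fill(adj == 0, float("-inf"))
        return torch.softmax(scores, dim=-1) @ self.v(h)

n, dim = 10, 32
h = torch.randn(n, dim)                  # per-building features
adj = torch.eye(n)                       # self-loops keep every row finite
adj[torch.rand(n, n) < 0.3] = 1.0        # toy within-block neighbor edges
out = GraphAttention(dim)(h, adj)        # (n, dim) attended features
```

- InfiniCity: Infinite-Scale City Synthesis [101.87428043837242]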
We propose a novel framework, InfiniCity, which constructs and renders an arbitrarily large, 3D-grounded environment from random noise.
An infinite-pixel image synthesis module generates arbitrary-scale 2D maps from the bird's-eye view.
An octree-based voxel completion module lifts the generated 2D map to 3D octrees.
A voxel-based neural rendering module texturizes the voxels and renders 2D images.
arXiv Detail & Related papers (2023-01-23T18:59:59Z)
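To make InfiniCity's 2D-to-3D lifting step concrete, the sketch below extrudes a bird's-eye semantic map into a dense occupancy grid using a per-cell height channel. The dense extrusion and the explicit height map are simplifying assumptions; the actual system performs learned, octree-based voxel completion.

```python
# Naive lift of a 2D semantic map to a labeled 3D voxel grid.
import numpy as np

def lift_to_voxels(semantic_map, height_map, max_h=32):
    """semantic_map: (H, W) integer labels; height_map: (H, W) heights in voxels."""
    H, W = semantic_map.shape
    vox = np.zeros((H, W, max_h), dtype=np.int32)
    for z in range(max_h):
        filled = height_map > z                           # columns tall enough at z
        vox[:, :, z] = np.where(filled, semantic_map, 0)  # 0 = empty space
    return vox

sem = np.random.randint(1, 5, size=(64, 64))       # toy semantic labels
hts = np.random.randint(1, 20, size=(64, 64))      # toy per-cell heights
voxels = lift_to_voxels(sem, hts)
print(voxels.shape, voxels.max())                  # (64, 64, 32) ...
```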
This list is automatically generated from the titles and abstracts of the papers on this site.