Deep Human-guided Conditional Variational Generative Modeling for
Automated Urban Planning
- URL: http://arxiv.org/abs/2110.07717v1
- Date: Tue, 12 Oct 2021 15:45:38 GMT
- Title: Deep Human-guided Conditional Variational Generative Modeling for
Automated Urban Planning
- Authors: Dongjie Wang, Kunpeng Liu, Pauline Johnson, Leilei Sun, Bowen Du,
Yanjie Fu
- Abstract summary: Urban planning designs land-use configurations and benefits the building of livable, sustainable, and safe communities.
Inspired by image generation, deep urban planning aims to leverage deep learning to generate land-use configurations.
This paper studies a novel deep human-guided urban planning method to jointly solve the above challenges.
- Score: 30.614010268762115
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Urban planning designs land-use configurations and benefits the building
of livable, sustainable, and safe communities. Inspired by image generation, deep
urban planning aims to leverage deep learning to generate land-use
configurations. However, urban planning is a complex process. Existing studies
usually ignore the need for personalized human guidance in planning and the spatial
hierarchical structure of planning generation. Moreover, the lack of
large-scale land-use configuration samples poses a data sparsity challenge.
This paper studies a novel deep human-guided urban planning method to jointly
solve the above challenges. Specifically, we formulate the problem into a deep
conditional variational autoencoder based framework. In this framework, we
exploit the deep encoder-decoder design to generate land-use configurations. To
capture the spatial hierarchy of land uses, we enforce the decoder to
generate both the coarse-grained layer of functional zones and the
fine-grained layer of POI distributions. To integrate human guidance, we allow
humans to describe what they need as texts and use these texts as a model
condition input. To mitigate training data sparsity and improve model
robustness, we introduce a variational Gaussian embedding mechanism. It not
only allows us to better approximate the embedding-space distribution of the
training data and to sample a larger population to overcome sparsity, but also
injects more probabilistic randomness into urban planning generation, improving
embedding diversity and thus robustness. Finally, we present extensive
experiments to validate the enhanced performance of our method.
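
As a rough illustration of the framework described in the abstract, the sketch below wires a conditional variational autoencoder with a two-level decoder (a coarse functional-zone layer, then a fine-grained POI layer) and a text condition in PyTorch. Module names, tensor shapes, the flat MLP layers, and the assumption that the human guidance is already embedded as a fixed-size vector are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of a conditional VAE with a coarse-zone / fine-POI decoder and a
# text condition, loosely following the abstract above. All shapes, layer sizes,
# and names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HumanGuidedCVAE(nn.Module):
    def __init__(self, poi_channels=20, zone_channels=5, grid=32,
                 text_dim=128, latent_dim=64):
        super().__init__()
        flat = poi_channels * grid * grid
        self.grid, self.poi_channels, self.zone_channels = grid, poi_channels, zone_channels
        # Encoder: land-use configuration + text condition -> Gaussian posterior
        self.enc = nn.Sequential(nn.Linear(flat + text_dim, 512), nn.ReLU())
        self.mu = nn.Linear(512, latent_dim)
        self.logvar = nn.Linear(512, latent_dim)
        # Decoder: latent + text condition -> coarse zones, then fine POIs
        self.dec_zone = nn.Sequential(
            nn.Linear(latent_dim + text_dim, 512), nn.ReLU(),
            nn.Linear(512, zone_channels * grid * grid))
        self.dec_poi = nn.Sequential(
            nn.Linear(latent_dim + text_dim + zone_channels * grid * grid, 512), nn.ReLU(),
            nn.Linear(512, flat))

    def forward(self, config, text_emb):
        h = self.enc(torch.cat([config.flatten(1), text_emb], dim=1))
        mu, logvar = self.mu(h), self.logvar(h)
        # Variational Gaussian embedding: sample with the reparameterization trick
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        zc = torch.cat([z, text_emb], dim=1)
        zones = self.dec_zone(zc)                       # coarse functional-zone layer
        pois = self.dec_poi(torch.cat([zc, zones], 1))  # fine-grained POI layer
        B = config.size(0)
        return (zones.view(B, self.zone_channels, self.grid, self.grid),
                pois.view(B, self.poi_channels, self.grid, self.grid), mu, logvar)

def cvae_loss(zones_hat, pois_hat, zones, pois, mu, logvar, beta=1.0):
    # Reconstruction on both hierarchy levels plus the usual KL regularizer.
    rec = F.mse_loss(pois_hat, pois) + F.mse_loss(zones_hat, zones)
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + beta * kld
```

At inference time one would sample z from the prior and decode it together with the text embedding to generate a new land-use configuration.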
Related papers
- StreetSurfGS: Scalable Urban Street Surface Reconstruction with Planar-based Gaussian Splatting [85.67616000086232]
StreetSurfGS is the first method to employ Gaussian Splatting specifically tailored for scalable urban street scene surface reconstruction.
StreetSurfGS utilizes a planar-based octree representation and segmented training to reduce memory costs, accommodate unique camera characteristics, and ensure scalability.
To address sparse views and multi-scale challenges, we use a dual-step matching strategy that leverages adjacent and long-term information.
arXiv Detail & Related papers (2024-10-06T04:21:59Z)
- Simple Hierarchical Planning with Diffusion [54.48129192534653]
Diffusion-based generative methods have proven effective in modeling trajectories with offline datasets.
We introduce the Hierarchical Diffuser, a fast yet surprisingly effective planning method that combines the advantages of hierarchical and diffusion-based planning.
Our model adopts a "jumpy" planning strategy at the higher level, which allows it to have a larger receptive field but at a lower computational cost.
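
A minimal sketch of the "jumpy" high-level / dense low-level split described above; the two samplers stand in for the diffusion models and are passed in as callables, so everything here is an assumption about the general pattern rather than the Hierarchical Diffuser itself.

```python
# Sketch of jumpy hierarchical planning: a high-level planner proposes subgoals
# every `jump` steps, and a low-level planner fills in the dense segments between
# consecutive subgoals. The `sample_*` callables abstract away the actual samplers.
from typing import Callable, List
import numpy as np

def hierarchical_plan(start: np.ndarray,
                      goal: np.ndarray,
                      sample_sparse: Callable[[np.ndarray, np.ndarray, int], List[np.ndarray]],
                      sample_dense: Callable[[np.ndarray, np.ndarray, int], List[np.ndarray]],
                      horizon: int = 64,
                      jump: int = 8) -> List[np.ndarray]:
    # High level: plan over a coarse horizon of horizon // jump subgoals,
    # which gives a wide receptive field at low computational cost.
    subgoals = sample_sparse(start, goal, horizon // jump)
    # Low level: stitch dense segments of length `jump` between consecutive subgoals.
    plan, prev = [], start
    for subgoal in subgoals:
        plan.extend(sample_dense(prev, subgoal, jump))
        prev = subgoal
    return plan
```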
arXiv Detail & Related papers (2024-01-05T05:28:40Z)
- Dual-stage Flows-based Generative Modeling for Traceable Urban Planning [33.03616838528995]
We propose a novel generative framework based on normalizing flows, namely the Dual-stage Urban Flows framework.
We employ an Information Fusion Module to capture the relationship among functional zones and fuse the information of different aspects.
Our framework outperforms other generative models on the urban planning task.
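
For context, a single affine-coupling block of the kind normalizing-flow generators are typically built from is sketched below; the dual-stage design and the Information Fusion Module of the paper are not reproduced, and all dimensions are assumptions.

```python
# A minimal affine-coupling flow block (RealNVP-style): an invertible transform
# with a cheap log-determinant, the basic building block of flow-based generators.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)))

    def forward(self, x):
        # Split, then transform one half conditioned on the other;
        # log|det J| is simply the sum of the (bounded) log-scales.
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                 # keep scales bounded for stability
        y2 = x2 * torch.exp(s) + t
        return torch.cat([x1, y2], dim=1), s.sum(dim=1)

    def inverse(self, y):
        y1, y2 = y[:, :self.half], y[:, self.half:]
        s, t = self.net(y1).chunk(2, dim=1)
        s = torch.tanh(s)
        x2 = (y2 - t) * torch.exp(-s)
        return torch.cat([y1, x2], dim=1)
```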
arXiv Detail & Related papers (2023-10-03T21:49:49Z)
- Unified Data Management and Comprehensive Performance Evaluation for Urban Spatial-Temporal Prediction [Experiment, Analysis & Benchmark] [78.05103666987655]
This work addresses challenges in accessing and utilizing diverse urban spatial-temporal datasets.
We introduce atomic files, a unified storage format designed for urban spatial-temporal big data, and validate its effectiveness on 40 diverse datasets.
We conduct extensive experiments using diverse models and datasets, establishing a performance leaderboard and identifying promising research directions.
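
Purely as a hypothetical illustration of what a unified, file-based storage format enables, the snippet below joins a static geometry table with a dynamic observation table using pandas; the file names and column names are invented for the example and are not the paper's actual atomic-file schema.

```python
# Hypothetical loader for a unified spatial-temporal storage format: static
# geometry in one CSV-like table, dynamic observations in another, joined on an
# entity id. Columns and paths are placeholders, not the real schema.
import pandas as pd

def load_dataset(geo_path="example.geo", dyna_path="example.dyna"):
    geo = pd.read_csv(geo_path)     # one row per spatial entity (id, coordinates, ...)
    dyna = pd.read_csv(dyna_path)   # one row per (time, entity_id, observed value)
    # A single join yields a tidy frame that any benchmarked model could consume.
    return dyna.merge(geo, left_on="entity_id", right_on="geo_id", how="left")
```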
arXiv Detail & Related papers (2023-08-24T16:20:00Z)
- Human-instructed Deep Hierarchical Generative Learning for Automated Urban Planning [57.91323079939641]
We develop a novel human-instructed deep hierarchical generative model to generate optimal urban plans.
The first stage is to label the grids of a target area with latent functionalities to discover functional zones.
The second stage is to perceive the planning requirements to form urban functionality projections.
The third stage is to leverage multi-attentions to model the zone-zone peer dependencies of the functionality projections to generate grid-level land-use configurations.
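
The three stages can be read as a simple pipeline; the skeleton below makes that flow explicit, with each stage abstracted as a callable whose name and signature are assumptions rather than the authors' interface.

```python
# Schematic of the three-stage pipeline summarized above, with each stage
# abstracted as a callable. Names and signatures are illustrative only.
from typing import Callable
import numpy as np

def plan_area(grid_features: np.ndarray,            # (H, W, F) features of the target area
              human_instruction: str,
              label_zones: Callable[[np.ndarray], np.ndarray],                 # stage 1
              project_requirements: Callable[[str, np.ndarray], np.ndarray],   # stage 2
              generate_configuration: Callable[[np.ndarray], np.ndarray]       # stage 3
              ) -> np.ndarray:
    zones = label_zones(grid_features)                            # 1) latent functional zones per grid cell
    projections = project_requirements(human_instruction, zones)  # 2) urban functionality projections
    return generate_configuration(projections)                    # 3) grid-level land-use configuration
```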
arXiv Detail & Related papers (2022-12-01T23:06:41Z)
- Automated Urban Planning aware Spatial Hierarchies and Human Instructions [33.06221365923015]
We propose a novel, deep, human-instructed urban planner based on generative adversarial networks (GANs).
GANs build urban functional zones based on information from human instructions and surrounding contexts.
We conduct extensive experiments to validate the efficacy of our work.
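
A minimal conditional-GAN pair illustrating how a generator and a discriminator can both be conditioned on an instruction/context embedding, as the summary describes; layer sizes, the concatenation-based conditioning, and the flattened output are assumptions.

```python
# Minimal conditional GAN: both networks receive a condition embedding (e.g.,
# encoded human instructions plus surrounding context) alongside their input.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, noise_dim=64, cond_dim=128, out_dim=20 * 32 * 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + cond_dim, 512), nn.ReLU(),
            nn.Linear(512, out_dim))

    def forward(self, z, cond):
        return self.net(torch.cat([z, cond], dim=1))   # flattened configuration

class Discriminator(nn.Module):
    def __init__(self, in_dim=20 * 32 * 32, cond_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + cond_dim, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 1))

    def forward(self, x, cond):
        return self.net(torch.cat([x, cond], dim=1))   # real/fake score
```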
arXiv Detail & Related papers (2022-09-26T20:37:02Z)
- Temporal Predictive Coding For Model-Based Planning In Latent Space [80.99554006174093]
We present an information-theoretic approach that employs temporal predictive coding to encode elements in the environment that can be predicted across time.
We evaluate our model on a challenging modification of standard DMControl tasks where the background is replaced with natural videos that contain complex information irrelevant to the planning task.
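
In the spirit of that idea, the sketch below trains an encoder and a latent transition model with a next-latent prediction loss, so that only temporally predictable structure is retained in the representation; the architectures and the exact loss form are assumptions, not the paper's objective.

```python
# Minimal latent prediction objective in the spirit of temporal predictive coding:
# encode observations, predict the next latent from the current latent and action,
# and train on the prediction error.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TPCSketch(nn.Module):
    def __init__(self, obs_dim, action_dim, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(obs_dim, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.transition = nn.Sequential(nn.Linear(latent_dim + action_dim, 256), nn.ReLU(),
                                        nn.Linear(256, latent_dim))

    def loss(self, obs_t, action_t, obs_next):
        z_t = self.encoder(obs_t)
        z_next = self.encoder(obs_next)
        z_pred = self.transition(torch.cat([z_t, action_t], dim=1))
        # Detaching the target is a common stabilization trick in latent
        # prediction objectives.
        return F.mse_loss(z_pred, z_next.detach())
```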
arXiv Detail & Related papers (2021-06-14T04:31:15Z)
- Methodological Foundation of a Numerical Taxonomy of Urban Form [62.997667081978825]
We present a method for numerical taxonomy of urban form derived from biological systematics.
We derive homogeneous urban tissue types and, by determining overall morphological similarity between them, generate a hierarchical classification of urban form.
After framing and presenting the method, we test it on two cities - Prague and Amsterdam.
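
A small, generic example of turning per-tissue morphometric descriptors into a hierarchical classification with Ward linkage is shown below; the descriptors are random placeholders and the eight-taxa cut is arbitrary, so this only illustrates the clustering step, not the paper's full protocol.

```python
# Derive a hierarchical classification from pairwise morphological similarity
# using agglomerative (Ward) clustering over per-tissue descriptor vectors.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(200, 12))       # 200 urban tissues x 12 morphometric characters

Z = linkage(descriptors, method="ward")        # agglomerative hierarchy (dendrogram)
taxa = fcluster(Z, t=8, criterion="maxclust")  # cut the tree into 8 taxa
print(np.bincount(taxa)[1:])                   # number of tissues per taxon
```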
arXiv Detail & Related papers (2021-04-30T12:47:52Z)
- Reimagining City Configuration: Automated Urban Planning via Adversarial Learning [28.930624100994514]
Urban planning refers to the effort of designing land-use configurations.
Recent advances in deep learning motivate us to ask: can machines learn, at a human level of capability, to automatically and quickly calculate land-use configurations?
arXiv Detail & Related papers (2020-08-22T21:15:39Z)