Reimagining City Configuration: Automated Urban Planning via Adversarial
Learning
- URL: http://arxiv.org/abs/2008.09912v2
- Date: Thu, 7 Jan 2021 16:52:19 GMT
- Title: Reimagining City Configuration: Automated Urban Planning via Adversarial
Learning
- Authors: Dongjie Wang, Yanjie Fu, Pengyang Wang, Bo Huang, Chang-Tien Lu
- Abstract summary: Urban planning refers to the efforts of designing land-use configurations.
Recent advances in deep learning motivate us to ask: can machines learn at a human-level capability to automatically and quickly calculate land-use configurations?
- Score: 28.930624100994514
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Urban planning refers to the efforts of designing land-use configurations.
Effective urban planning can help to mitigate the operational and social
vulnerability of an urban system, such as high taxes, crime, traffic congestion
and accidents, pollution, depression, and anxiety. Due to the high complexity
of urban systems, such tasks are mostly completed by professional planners,
but human planners take a long time. The recent advance of deep learning
motivates us to ask: can machines learn at a human-level capability to
automatically and quickly calculate land-use configurations, so that human
planners can finally adjust machine-generated plans for specific needs? To this
end, we formulate
the automated urban planning problem into a task of learning to configure
land-uses, given the surrounding spatial contexts. To set up the task, we
define a land-use configuration as a longitude-latitude-channel tensor, where
each channel is a category of POIs and the value of an entry is the number of
POIs. The objective is then to propose an adversarial learning framework that
can automatically generate such a tensor for an unplanned area. In particular, we
first characterize the contexts of surrounding areas of an unplanned area by
learning representations from spatial graphs using geographic and human
mobility data. Second, we combine each unplanned area and its surrounding
context representation as a tuple, and categorize all the tuples into positive
samples (well-planned areas) and negative samples (poorly-planned areas). Third, we
develop an adversarial land-use configuration approach, where the surrounding
context representation is fed into a generator to generate a land-use
configuration, and a discriminator learns to distinguish between positive and
negative samples.
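The tensor representation described above can be sketched concretely. The snippet below is a minimal, illustrative construction (grid size, channel count, and coordinate ranges are assumptions for the example, not the paper's actual settings): each channel is a POI category, and an entry counts the POIs of that category falling into a latitude-longitude grid cell.

```python
import numpy as np

# A land-use configuration is a longitude-latitude-channel tensor:
# entry (i, j, c) counts the POIs of category c in grid cell (i, j).
def build_configuration(pois, n_lat=10, n_lon=10, n_channels=5,
                        lat_range=(0.0, 1.0), lon_range=(0.0, 1.0)):
    """pois: iterable of (lat, lon, category) records for one area."""
    tensor = np.zeros((n_lat, n_lon, n_channels), dtype=np.int32)
    for lat, lon, cat in pois:
        # Bin the coordinates into grid-cell indices (clamped to the grid).
        i = min(int((lat - lat_range[0]) / (lat_range[1] - lat_range[0]) * n_lat), n_lat - 1)
        j = min(int((lon - lon_range[0]) / (lon_range[1] - lon_range[0]) * n_lon), n_lon - 1)
        tensor[i, j, cat] += 1
    return tensor

pois = [(0.15, 0.72, 0),   # e.g. a restaurant
        (0.15, 0.73, 0),   # another restaurant, same grid cell
        (0.90, 0.10, 3)]   # e.g. a school
config = build_configuration(pois)
print(config.shape)        # (10, 10, 5)
print(config[1, 7, 0])     # 2 -- two category-0 POIs in that cell
```

In the adversarial framework, a generator maps a surrounding-context representation to such a tensor, and a discriminator scores tensors against the positive (well-planned) and negative (poorly-planned) samples.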
Related papers
- Neural MP: A Generalist Neural Motion Planner [75.82675575009077]
We seek to do the same by applying data-driven learning at scale to the problem of motion planning.
Our approach builds a large number of complex scenes in simulation, collects expert data from a motion planner, then distills it into a reactive generalist policy.
We perform a thorough evaluation of our method on 64 motion planning tasks across four diverse environments.
arXiv Detail & Related papers (2024-09-09T17:59:45Z)
- Dual-stage Flows-based Generative Modeling for Traceable Urban Planning [33.03616838528995]
We propose a novel generative framework based on normalizing flows, namely Dual-stage Urban Flows framework.
We employ an Information Fusion Module to capture the relationship among functional zones and fuse the information of different aspects.
Our framework outperforms other generative models on the urban planning task.
arXiv Detail & Related papers (2023-10-03T21:49:49Z)
- Embodied Task Planning with Large Language Models [86.63533340293361]
We propose a TAsk Planing Agent (TaPA) in embodied tasks for grounded planning with physical scene constraint.
During inference, we discover the objects in the scene by extending open-vocabulary object detectors to multi-view RGB images collected in different achievable locations.
Experimental results show that the generated plans from our TaPA framework achieve a higher success rate than LLaVA and GPT-3.5 by a sizable margin.
arXiv Detail & Related papers (2023-07-04T17:58:25Z)
- Human-instructed Deep Hierarchical Generative Learning for Automated Urban Planning [57.91323079939641]
We develop a novel human-instructed deep hierarchical generative model to generate optimal urban plans.
The first stage is to label the grids of a target area with latent functionalities to discover functional zones.
The second stage is to perceive the planning requirements to form urban functionality projections.
The third stage is to leverage multi-attentions to model the zone-zone peer dependencies of the functionality projections to generate grid-level land-use configurations.
arXiv Detail & Related papers (2022-12-01T23:06:41Z)
- Automated Urban Planning aware Spatial Hierarchies and Human Instructions [33.06221365923015]
We propose a novel, deep, human-instructed urban planner based on generative adversarial networks (GANs).
GANs build urban functional zones based on information from human instructions and surrounding contexts.
We conduct extensive experiments to validate the efficacy of our work.
arXiv Detail & Related papers (2022-09-26T20:37:02Z)
- Automated Urban Planning for Reimagining City Configuration via Adversarial Learning: Quantification, Generation, and Evaluation [30.48671788567521]
Urban planning refers to the efforts of designing land-use configurations given a region.
To obtain effective urban plans, urban experts have to spend much time and effort analyzing sophisticated planning constraints.
We formulate the automated urban planning problem into a task of deep generative learning.
arXiv Detail & Related papers (2021-12-26T00:59:35Z)
- Differentiable Spatial Planning using Transformers [87.90709874369192]
We propose Spatial Planning Transformers (SPT), which given an obstacle map learns to generate actions by planning over long-range spatial dependencies.
In the setting where the ground truth map is not known to the agent, we leverage pre-trained SPTs in an end-to-end framework.
SPTs outperform prior state-of-the-art differentiable planners across all the setups for both manipulation and navigation tasks.
arXiv Detail & Related papers (2021-12-02T06:48:16Z)
- Deep Human-guided Conditional Variational Generative Modeling for Automated Urban Planning [30.614010268762115]
Urban planning designs land-use configurations and can benefit building livable, sustainable, safe communities.
Inspired by image generation, deep urban planning aims to leverage deep learning to generate land-use configurations.
This paper studies a novel deep human guided urban planning method to jointly solve the above challenges.
arXiv Detail & Related papers (2021-10-12T15:45:38Z)
- Methodological Foundation of a Numerical Taxonomy of Urban Form [62.997667081978825]
We present a method for numerical taxonomy of urban form derived from biological systematics.
We derive homogeneous urban tissue types and, by determining overall morphological similarity between them, generate a hierarchical classification of urban form.
After framing and presenting the method, we test it on two cities - Prague and Amsterdam.
arXiv Detail & Related papers (2021-04-30T12:47:52Z)
- Plan2Vec: Unsupervised Representation Learning by Latent Plans [106.37274654231659]
We introduce plan2vec, an unsupervised representation learning approach that is inspired by reinforcement learning.
Plan2vec constructs a weighted graph on an image dataset using near-neighbor distances, and then extrapolates this local metric to a global embedding by distilling path-integrals over planned paths.
We demonstrate the effectiveness of plan2vec on one simulated and two challenging real-world image datasets.
arXiv Detail & Related papers (2020-05-07T17:52:23Z)
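Plan2vec's core idea, extrapolating a local near-neighbor metric to a global one via planned paths, can be illustrated with a minimal NumPy sketch (illustrative assumptions only: Euclidean points stand in for images, and a plain shortest-path computation stands in for the learned embedding the paper distills into):

```python
import numpy as np

def global_metric(points, neighbor_radius):
    """Local metric: Euclidean distance, kept only between near neighbors.
    Global metric: shortest-path (path-integral) distance over that graph."""
    n = len(points)
    local = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    dist = np.where(local <= neighbor_radius, local, np.inf)  # sparse local graph
    np.fill_diagonal(dist, 0.0)
    # Floyd-Warshall: extrapolate the local metric to all pairs.
    for k in range(n):
        dist = np.minimum(dist, dist[:, k:k+1] + dist[k:k+1, :])
    return dist

# Points along a line: distant pairs are not locally connected, so the
# global distance accumulates over unit hops between neighbors.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
d = global_metric(pts, neighbor_radius=1.0)
print(d[0, 3])   # 3.0 -- three unit hops; the pair is not directly connected
```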
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.