Generative methods for Urban design and rapid solution space exploration
- URL: http://arxiv.org/abs/2212.06783v1
- Date: Tue, 13 Dec 2022 17:58:02 GMT
- Title: Generative methods for Urban design and rapid solution space exploration
- Authors: Yue Sun, Timur Dogan
- Abstract summary: This research introduces an implementation of a tensor-field-based generative urban modeling toolkit.
Our method encodes contextual constraints such as waterfront edges, terrain, view-axis, existing streets, landmarks, and non-geometric design inputs.
This allows users to generate many, diverse urban fabric configurations that resemble real-world cities with very few model inputs.
- Score: 13.222198221605701
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Rapid population growth and climate change drive urban renewal and
urbanization at massive scales. New computational methods are needed to better
support urban designers in developing sustainable, resilient, and livable urban
environments. Urban design space exploration and multi-objective optimization
of masterplans can be used to expedite planning while achieving better design
outcomes by incorporating generative parametric modeling considering different
stakeholder requirements and simulation-based performance feedback. However, a
lack of generalizable and integrative methods for urban form generation that
can be coupled with simulation and various design performance analyses
constrains the extensibility of such workflows. This research introduces an
implementation of a tensor-field-based generative urban modeling toolkit that
facilitates rapid design space exploration and multi-objective optimization by
integrating with the Rhino/Grasshopper ecosystem and its urban analysis and
environmental performance simulation tools. Our tensor-field modeling method
provides users with a generalized way to encode contextual constraints such as
waterfront edges, terrain, view-axis, existing streets, landmarks, and
non-geometric design inputs such as network directionality, desired densities
of streets, amenities, buildings, and people as forces that modelers can weigh.
This allows users to generate many, diverse urban fabric configurations that
resemble real-world cities with very few model inputs. We present a case study
to demonstrate the proposed framework's flexibility and applicability and show
how modelers can identify design and environmental performance synergies that
would be hard to find otherwise.
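The abstract describes the core mechanism only in prose: contextual constraints are encoded as weighted tensor fields whose blend guides street-network generation. The sketch below is a minimal, self-contained illustration of that general idea, not the authors' Rhino/Grasshopper toolkit: basis tensors for a grid aligned to an existing street axis and a radial pattern around a landmark are blended with hypothetical user weights and distance decay, and a street centerline is traced along the major eigenvector field. All function names, weights, and decay constants here are assumptions for illustration.

```python
"""Illustrative sketch of tensor-field-guided street generation.

Not the paper's toolkit; it only shows the general idea of blending
weighted basis tensor fields and tracing streamlines along them.
"""
import numpy as np

def grid_tensor(theta):
    """Symmetric, traceless 2x2 tensor whose major eigenvector points at angle theta."""
    return np.array([[np.cos(2 * theta), np.sin(2 * theta)],
                     [np.sin(2 * theta), -np.cos(2 * theta)]])

def radial_tensor(p, center):
    """Tensor aligned with the direction from `center` to point `p` (e.g. around a landmark)."""
    d = p - center
    return grid_tensor(np.arctan2(d[1], d[0]))

def blended_field(p, constraints):
    """Weighted sum of basis tensors; each constraint decays with distance from its anchor."""
    T = np.zeros((2, 2))
    for c in constraints:
        w = c["weight"] * np.exp(-c["decay"] * np.sum((p - c["anchor"]) ** 2))
        T += w * (radial_tensor(p, c["anchor"]) if c["kind"] == "radial"
                  else grid_tensor(c["theta"]))
    return T

def major_direction(T, prev=None):
    """Unit major eigenvector of T, sign-aligned with the previous step for smooth tracing."""
    vals, vecs = np.linalg.eigh(T)
    v = vecs[:, np.argmax(vals)]
    if prev is not None and np.dot(v, prev) < 0:
        v = -v
    return v

def trace_street(seed, constraints, step=2.0, n_steps=200):
    """Euler-integrate a streamline along the major eigenvector: one street centerline."""
    p = np.asarray(seed, float)
    pts, prev = [p], None
    for _ in range(n_steps):
        v = major_direction(blended_field(p, constraints), prev)
        p = p + step * v
        pts.append(p)
        prev = v
    return np.array(pts)

if __name__ == "__main__":
    # Hypothetical inputs: a grid aligned with an existing street axis plus a
    # radial pattern around a landmark, weighted the way a modeler might.
    constraints = [
        {"kind": "grid", "theta": np.radians(15), "anchor": np.array([0.0, 0.0]),
         "weight": 1.0, "decay": 1e-5},
        {"kind": "radial", "anchor": np.array([300.0, 200.0]),
         "weight": 2.0, "decay": 1e-4},
    ]
    street = trace_street(seed=(50.0, 50.0), constraints=constraints)
    print(street[:5])  # first few centerline points
```

In the actual toolkit, many such streamlines would be seeded and intersected to form a full street network inside Grasshopper and then coupled to the urban analysis and simulation tools mentioned above; here the output is simply a list of centerline points.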
Related papers
- CityX: Controllable Procedural Content Generation for Unbounded 3D Cities [55.737060358043536]
We propose a novel multi-modal controllable procedural content generation method, named CityX.
It enhances realistic, unbounded 3D city generation guided by multiple layout conditions, including OSM, semantic maps, and satellite images.
Through this effective framework, CityX shows the potential to build an innovative ecosystem for 3D scene generation.
arXiv Detail & Related papers (2024-07-24T18:05:13Z)
Leveraging Generative AI for Urban Digital Twins: A Scoping Review on the Autonomous Generation of Urban Data, Scenarios, Designs, and 3D City Models for Smart City Advancement [7.334114326621768]
Generative Artificial Intelligence (AI) models have demonstrated their unique values in data and code generation.
The survey starts with the introduction of popular generative AI models with their application areas, followed by a review of the existing urban science applications.
Based on the review, this survey discusses potential opportunities and technical strategies that integrate generative AI models into the next-generation urban digital twins.
arXiv Detail & Related papers (2024-05-29T19:23:07Z)
Advancing Transportation Mode Share Analysis with Built Environment: Deep Hybrid Models with Urban Road Network [12.349403667141559]
We propose deep hybrid models (DHM) which directly combine road networks and sociodemographic features as inputs for travel mode share analysis.
In experiments of mode share prediction in Chicago, results demonstrate that DHM can provide valuable spatial insights into the sociodemographic structure.
arXiv Detail & Related papers (2024-05-23T00:59:00Z)
Towards Invariant Time Series Forecasting in Smart Cities [21.697069894721448]
We propose a solution to derive invariant representations for more robust predictions under different urban environments.
Our method can be extended to diverse fields including climate modeling, urban planning, and smart city resource management.
arXiv Detail & Related papers (2024-05-08T21:23:01Z)
Compositional Generative Inverse Design [69.22782875567547]
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
arXiv Detail & Related papers (2024-01-24T01:33:39Z)
Unified Data Management and Comprehensive Performance Evaluation for Urban Spatial-Temporal Prediction [Experiment, Analysis & Benchmark] [78.05103666987655]
This work addresses challenges in accessing and utilizing diverse urban spatial-temporal datasets.
We introduce atomic files, a unified storage format designed for urban spatial-temporal big data, and validate its effectiveness on 40 diverse datasets.
We conduct extensive experiments using diverse models and datasets, establishing a performance leaderboard and identifying promising research directions.
arXiv Detail & Related papers (2023-08-24T16:20:00Z)
Deep Learning for Spatiotemporal Modeling of Urbanization [21.677957140614556]
Urbanization has a strong impact on the health and wellbeing of populations across the world.
Many spatial models have been developed using machine learning and numerical modeling techniques.
Here we explore the capacity of deep spatial learning for the predictive modeling of urbanization.
arXiv Detail & Related papers (2021-12-17T18:27:52Z)
Dynamically Grown Generative Adversarial Networks [111.43128389995341]
We propose a method to dynamically grow a GAN during training, optimizing the network architecture and its parameters together with automation.
The method embeds architecture search techniques as an interleaving step with gradient-based training to periodically seek the optimal architecture-growing strategy for the generator and discriminator.
arXiv Detail & Related papers (2021-06-16T01:25:51Z)
GANs for Urban Design [0.0]
The topic investigated in this paper is the application of Generative Adversarial Networks to the design of an urban block.
The research presents a flexible model able to adapt to the morphological characteristics of a city.
arXiv Detail & Related papers (2021-05-04T19:50:24Z)
Methodological Foundation of a Numerical Taxonomy of Urban Form [62.997667081978825]
We present a method for numerical taxonomy of urban form derived from biological systematics.
We derive homogeneous urban tissue types and, by determining overall morphological similarity between them, generate a hierarchical classification of urban form.
After framing and presenting the method, we test it on two cities - Prague and Amsterdam.
arXiv Detail & Related papers (2021-04-30T12:47:52Z)
Learning to Move with Affordance Maps [57.198806691838364]
The ability to autonomously explore and navigate a physical space is a fundamental requirement for virtually any mobile autonomous agent.
Traditional SLAM-based approaches for exploration and navigation largely focus on leveraging scene geometry.
We show that learned affordance maps can be used to augment traditional approaches for both exploration and navigation, providing significant improvements in performance.
arXiv Detail & Related papers (2020-01-08T04:05:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.