Modeling and Simulating Agent-Based City Migration Using Conway's Game of Life
- URL: http://arxiv.org/abs/2412.20691v1
- Date: Mon, 30 Dec 2024 03:59:30 GMT
- Title: Modeling and Simulating Agent-Based City Migration Using Conway's Game of Life
- Authors: Bruce Deng, Mayank Kejriwal
- Abstract summary: We propose and implement a novel GoL-based framework to simulate urban migration dynamics.
Using a grid-within-a-grid approach, our framework encodes probabilistic tendencies for out-migration due to densification and sparsification.
Our framework offers a versatile and computationally efficient tool for studying urban migration patterns, contributing to the broader application of ABMs in computational urban social science.
- Abstract: Agent-based modeling (ABM) has become a cornerstone of complexity science, enabling the study of heterogeneous agents interacting within dynamic environments. Among ABM frameworks, John Conway's Game of Life (GoL) stands out for its simplicity and ability to generate emergent macroscopic patterns from basic microscopic rules. In this paper, we propose and implement a novel GoL-based framework to simulate urban migration dynamics. Using a grid-within-a-grid approach, our framework encodes probabilistic tendencies for out-migration due to densification and sparsification, simulating the evolution of population centers. By initializing GoL grids with different distributions and parameterizing migration preferences, we explore how urban structures emerge and stabilize over time. Through a series of experiments, we demonstrate that even with simple rules, this framework shows promise for understanding emergent urban phenomena, providing insights into city growth and structure. Methodologically, our framework offers a versatile and computationally efficient tool for studying urban migration patterns, contributing to the broader application of ABMs in computational urban social science.
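The abstract describes probabilistic out-migration triggered by densification (overcrowding) and sparsification (isolation) on a GoL grid, but does not give the paper's exact update rules or parameter values. The following is a minimal sketch of that idea: the neighbor thresholds mirror classic GoL rules, and the probability parameters (`p_dense`, `p_sparse`) and the toroidal boundary are illustrative assumptions, not the authors' implementation.

```python
import random

def neighbor_count(grid, r, c):
    """Count occupied Moore neighbors of cell (r, c) on a toroidal grid."""
    n = len(grid)
    return sum(
        grid[(r + dr) % n][(c + dc) % n]
        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )

def migration_step(grid, p_dense=0.8, p_sparse=0.6, rng=random):
    """One synchronous update (illustrative rules, not the paper's).
    Occupied cells out-migrate with probability p_dense when overcrowded
    (>3 neighbors) or p_sparse when isolated (<2 neighbors); empty cells
    with exactly 3 neighbors attract a settler, as in classic GoL birth."""
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            k = neighbor_count(grid, r, c)
            if grid[r][c]:
                if k > 3:  # densification pressure
                    new[r][c] = 0 if rng.random() < p_dense else 1
                elif k < 2:  # sparsification pressure
                    new[r][c] = 0 if rng.random() < p_sparse else 1
                else:
                    new[r][c] = 1
            else:
                new[r][c] = 1 if k == 3 else 0
    return new

# Usage: a 2x2 "block" of settlers is stable under these thresholds,
# since every occupied cell has exactly 3 neighbors.
block = [[0] * 4 for _ in range(4)]
for r, c in [(1, 1), (1, 2), (2, 1), (2, 2)]:
    block[r][c] = 1
after = migration_step(block, rng=random.Random(0))
```

With probabilistic rather than deterministic death rules, repeated application of `migration_step` lets population centers drift and re-form, which is the kind of emergent dynamics the abstract attributes to the framework.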
Related papers
- Collaborative Imputation of Urban Time Series through Cross-city Meta-learning [54.438991949772145]
We propose a novel collaborative imputation paradigm leveraging meta-learned implicit neural representations (INRs)
We then introduce a cross-city collaborative learning scheme through model-agnostic meta learning.
Experiments on a diverse urban dataset from 20 global cities demonstrate our model's superior imputation performance and generalizability.
arXiv Detail & Related papers (2025-01-20T07:12:40Z) - Planning, Living and Judging: A Multi-agent LLM-based Framework for Cyclical Urban Planning [5.9423583597394325]
Urban regeneration presents significant challenges within the context of urbanization.
We propose Cyclical Urban Planning (CUP), a new paradigm that continuously generates, evaluates, and refines urban plans in a closed-loop.
Experiments on the real-world dataset demonstrate the effectiveness of our framework as a continuous and adaptive planning process.
arXiv Detail & Related papers (2024-12-29T15:43:25Z) - GenSim: A General Social Simulation Platform with Large Language Model based Agents [111.00666003559324]
We propose a novel large language model (LLM)-based simulation platform called GenSim.
Our platform supports one hundred thousand agents to better simulate large-scale populations in real-world contexts.
To our knowledge, GenSim represents an initial step toward a general, large-scale, and correctable social simulation platform.
arXiv Detail & Related papers (2024-10-06T05:02:23Z) - CityX: Controllable Procedural Content Generation for Unbounded 3D Cities [50.10101235281943]
Current generative methods fall short in either diversity, controllability, or fidelity.
In this work, we resort to the procedural content generation (PCG) technique for high-fidelity generation.
We develop a multi-agent framework to transform multi-modal instructions, including OSM, semantic maps, and satellite images, into executable programs.
Our method, named CityX, demonstrates its superiority in creating diverse, controllable, and realistic 3D urban scenes.
arXiv Detail & Related papers (2024-07-24T18:05:13Z) - LangSuitE: Planning, Controlling and Interacting with Large Language Models in Embodied Text Environments [70.91258869156353]
We introduce LangSuitE, a versatile and simulation-free testbed featuring 6 representative embodied tasks in textual embodied worlds.
Compared with previous LLM-based testbeds, LangSuitE offers adaptability to diverse environments without multiple simulation engines.
We devise a novel chain-of-thought (CoT) schema, EmMem, which summarizes embodied states w.r.t. history information.
arXiv Detail & Related papers (2024-06-24T03:36:29Z) - Generative methods for Urban design and rapid solution space exploration [13.222198221605701]
This research introduces an implementation of a tensor-field-based generative urban modeling toolkit.
Our method encodes contextual constraints such as waterfront edges, terrain, view-axis, existing streets, landmarks, and non-geometric design inputs.
This allows users to generate many, diverse urban fabric configurations that resemble real-world cities with very few model inputs.
arXiv Detail & Related papers (2022-12-13T17:58:02Z) - MetroGAN: Simulating Urban Morphology with Generative Adversarial Network [10.504296192020497]
We propose a GAN framework with geographical knowledge, namely Metropolitan GAN (MetroGAN) for urban morphology simulation.
Results show that MetroGAN outperforms the state-of-the-art urban simulation methods by over 20% in all metrics.
arXiv Detail & Related papers (2022-07-06T11:02:24Z) - Methodological Foundation of a Numerical Taxonomy of Urban Form [62.997667081978825]
We present a method for numerical taxonomy of urban form derived from biological systematics.
We derive homogeneous urban tissue types and, by determining overall morphological similarity between them, generate a hierarchical classification of urban form.
After framing and presenting the method, we test it on two cities - Prague and Amsterdam.
arXiv Detail & Related papers (2021-04-30T12:47:52Z) - Learning to Move with Affordance Maps [57.198806691838364]
The ability to autonomously explore and navigate a physical space is a fundamental requirement for virtually any mobile autonomous agent.
Traditional SLAM-based approaches for exploration and navigation largely focus on leveraging scene geometry.
We show that learned affordance maps can be used to augment traditional approaches for both exploration and navigation, providing significant improvements in performance.
arXiv Detail & Related papers (2020-01-08T04:05:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.