CrowdWeb: A Visualization Tool for Mobility Patterns in Smart Cities
- URL: http://arxiv.org/abs/2305.12935v1
- Date: Mon, 22 May 2023 11:30:00 GMT
- Title: CrowdWeb: A Visualization Tool for Mobility Patterns in Smart Cities
- Authors: Yisheng Alison Zheng, Abdallah Lakhdari, Amani Abusafia, Shing Tai
Tony Lui, Athman Bouguettaya
- Abstract summary: The accuracy of current mobility prediction models is less than 25%.
We propose a web platform to visualize human mobility patterns.
We extend the platform to visualize the mobility of multiple users from a city-scale perspective.
- Score: 0.39373541926236766
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human mobility patterns refer to the regularities and trends in the way
people move, travel, or navigate through different geographical locations over
time. Detecting human mobility patterns is essential for a variety of
applications, including smart cities, transportation management, and disaster
response. The accuracy of current mobility prediction models is less than 25%.
The low accuracy is mainly due to the fluid nature of human movement.
Typically, humans do not adhere to rigid patterns in their daily activities,
making it difficult to identify hidden regularities in their data. To address
this issue, we proposed a web platform to visualize human mobility patterns by
abstracting the locations into a set of places to detect more realistic
patterns. However, the platform was initially designed to detect individual
mobility patterns, making it unsuitable for representing crowds at a smart-city
scale. Therefore, we extend the platform to visualize the mobility of
multiple users from a city-scale perspective. Our platform allows users to
visualize a graph of visited places based on their historical records using a
modified PrefixSpan approach. Additionally, the platform synchronizes,
aggregates, and displays crowd mobility patterns across various time intervals
within a smart city. We showcase our platform using a real dataset.
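The abstract states only that a modified PrefixSpan approach is used to build the graph of visited places; the modification itself is not described here. As a minimal sketch of standard PrefixSpan over abstracted place sequences (the place labels, `min_support` threshold, and `prefixspan` function are illustrative, not from the paper):

```python
# Minimal PrefixSpan sketch: mines frequent sequential patterns from
# sequences of abstracted places (e.g. per-day visit histories).
from collections import defaultdict

def prefixspan(sequences, min_support):
    """Return {pattern: support} for all sequential patterns whose
    support (number of sequences containing them, gaps allowed) is
    at least min_support."""
    results = {}

    def mine(prefix, projected):
        # Count each item once per sequence suffix in the projected database.
        counts = defaultdict(int)
        for seq, start in projected:
            for item in set(seq[start:]):
                counts[item] += 1
        for item, support in counts.items():
            if support < min_support:
                continue
            pattern = prefix + (item,)
            results[pattern] = support
            # Project each sequence past the first occurrence of `item`.
            new_projected = []
            for seq, start in projected:
                try:
                    idx = seq.index(item, start)
                    new_projected.append((seq, idx + 1))
                except ValueError:
                    pass
            mine(pattern, new_projected)

    mine((), [(seq, 0) for seq in sequences])
    return results

# Hypothetical daily visit histories over abstracted places.
histories = [
    ["home", "cafe", "work", "gym", "home"],
    ["home", "work", "cafe", "home"],
    ["home", "work", "gym", "home"],
]
patterns = prefixspan(histories, min_support=2)
```

On these toy histories, recurring routines such as home → work → home surface as frequent patterns even though the exact visit times and intermediate stops differ across days, which is the flexibility the place-abstraction step is meant to capture.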
Related papers
- HUMOS: Human Motion Model Conditioned on Body Shape [54.20419874234214]
We introduce a new approach to develop a generative motion model based on body shape.
We show that it's possible to train this model using unpaired data.
The resulting model generates diverse, physically plausible, and dynamically stable human motions.
arXiv Detail & Related papers (2024-09-05T23:50:57Z)
- Deep Activity Model: A Generative Approach for Human Mobility Pattern Synthesis [11.90100976089832]
We develop a novel generative deep learning approach for human mobility modeling and synthesis.
It incorporates both activity patterns and location trajectories using open-source data.
The model can be fine-tuned with local data, allowing it to adapt to accurately represent mobility patterns across diverse regions.
arXiv Detail & Related papers (2024-05-24T02:04:10Z)
- Human Mobility in the Metaverse [0.03072340427031969]
We find that despite the absence of commuting costs, an individual's inclination to explore new locations diminishes over time.
We also find a lack of correlation between land prices and visitation, a deviation from the patterns characterizing the physical world.
Our ability to predict the characteristics of the emerging meta mobility network implies that the laws governing human mobility are rooted in fundamental patterns of human dynamics.
arXiv Detail & Related papers (2024-04-03T21:26:40Z)
- Social-Transmotion: Promptable Human Trajectory Prediction [65.80068316170613]
Social-Transmotion is a generic Transformer-based model that exploits diverse and numerous visual cues to predict human behavior.
Our approach is validated on multiple datasets, including JTA, JRDB, Pedestrians and Cyclists in Road Traffic, and ETH-UCY.
arXiv Detail & Related papers (2023-12-26T18:56:49Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- IMAP: Individual huMAn mobility Patterns visualizing platform [0.39373541926236766]
Existing models' accuracy in predicting users' mobility patterns is less than 25%.
We propose a novel perspective to study and analyze human mobility patterns and capture their flexibility.
Our platform enables users to visualize a graph of the places they visited based on their history records.
arXiv Detail & Related papers (2022-09-08T07:43:54Z)
- Mobility signatures: a tool for characterizing cities using intercity mobility flows [1.1602089225841632]
We introduce the mobility signature as a tool for understanding how a city is embedded in the wider mobility network.
We demonstrate the potential of the mobility signature approach through two applications that build on mobile-phone-based data from Finland.
arXiv Detail & Related papers (2021-12-03T08:53:58Z)
- Learning Perceptual Locomotion on Uneven Terrains using Sparse Visual Observations [75.60524561611008]
This work aims to exploit the use of sparse visual observations to achieve perceptual locomotion over a range of commonly seen bumps, ramps, and stairs in human-centred environments.
We first formulate the selection of minimal visual input that can represent the uneven surfaces of interest, and propose a learning framework that integrates such exteroceptive and proprioceptive data.
We validate the learned policy in tasks that require omnidirectional walking over flat ground and forward locomotion over terrains with obstacles, showing a high success rate.
arXiv Detail & Related papers (2021-09-28T20:25:10Z)
- A Data-Driven Analytical Framework of Estimating Multimodal Travel Demand Patterns using Mobile Device Location Data [5.902556437760098]
This paper presents a data-driven analytical framework to extract multimodal travel demand patterns from smartphone location data.
A jointly trained single-layer model and deep neural network for travel mode imputation is developed.
The framework also incorporates the multimodal transportation network in order to evaluate the closeness of trip routes to the nearby rail, metro, highway and bus lines.
arXiv Detail & Related papers (2020-12-08T22:49:44Z)
- Hidden Footprints: Learning Contextual Walkability from 3D Human Trails [70.01257397390361]
Current datasets only tell you where people are, not where they could be.
We first augment the set of valid, labeled walkable regions by propagating person observations between images, utilizing 3D information to create what we call hidden footprints.
We devise a training strategy designed for such sparse labels, combining a class-balanced classification loss with a contextual adversarial loss.
arXiv Detail & Related papers (2020-08-19T23:19:08Z)
- Learning to Move with Affordance Maps [57.198806691838364]
The ability to autonomously explore and navigate a physical space is a fundamental requirement for virtually any mobile autonomous agent.
Traditional SLAM-based approaches for exploration and navigation largely focus on leveraging scene geometry.
We show that learned affordance maps can be used to augment traditional approaches for both exploration and navigation, providing significant improvements in performance.
arXiv Detail & Related papers (2020-01-08T04:05:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.