LibCity: A Unified Library Towards Efficient and Comprehensive Urban
Spatial-Temporal Prediction
- URL: http://arxiv.org/abs/2304.14343v7
- Date: Thu, 7 Mar 2024 16:41:10 GMT
- Title: LibCity: A Unified Library Towards Efficient and Comprehensive Urban
Spatial-Temporal Prediction
- Authors: Jiawei Jiang, Chengkai Han, Wenjun Jiang, Wayne Xin Zhao, Jingyuan
Wang
- Abstract summary: There are limitations in the existing field, including open-source data being in various formats and difficult to use.
We propose LibCity, an open-source library that offers researchers a credible experimental tool and a convenient development framework.
- Score: 74.08181247675095
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As deep learning technology advances and more urban spatial-temporal data
accumulates, an increasing number of deep learning models are being proposed to
solve urban spatial-temporal prediction problems. However, there are
limitations in the existing field, including open-source data being in various
formats and difficult to use, few papers making their code and data openly
available, and open-source models often using different frameworks and
platforms, making comparisons challenging. A standardized framework is urgently
needed to implement and evaluate these methods. To address these issues, we
propose LibCity, an open-source library that offers researchers a credible
experimental tool and a convenient development framework. In this library, we
have reproduced 65 spatial-temporal prediction models and collected 55
spatial-temporal datasets, allowing researchers to conduct comprehensive
experiments conveniently. By enabling fair model comparisons, designing a
unified data storage format, and simplifying the process of developing new
models, LibCity is poised to make significant contributions to the
spatial-temporal prediction field.
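
To make the idea of a unified, config-driven experiment pipeline concrete, below is a minimal, self-contained Python sketch of a standardized runner in the spirit described above. It is not LibCity's actual API: the names ExperimentConfig, MODEL_REGISTRY, load_dataset, and the toy HistoricalAverage baseline are illustrative assumptions, and the data is synthetic.

    # Illustrative sketch only: a config-driven experiment runner in the spirit of
    # a unified spatial-temporal library. None of these names come from LibCity.
    from dataclasses import dataclass
    from typing import Callable, Dict
    import numpy as np

    @dataclass
    class ExperimentConfig:
        task: str          # e.g. "traffic_state_pred"
        model: str         # key into MODEL_REGISTRY
        dataset: str       # key understood by the data loader
        horizon: int = 3   # number of future steps to predict

    def historical_average(history: np.ndarray, horizon: int) -> np.ndarray:
        # Toy baseline: predict every future step as the mean of the history.
        return np.repeat(history.mean(axis=0, keepdims=True), horizon, axis=0)

    # A registry maps model names to callables so that every model runs through
    # the same entry point and is evaluated with the same metric.
    MODEL_REGISTRY: Dict[str, Callable[[np.ndarray, int], np.ndarray]] = {
        "HistoricalAverage": historical_average,
    }

    def load_dataset(name: str) -> np.ndarray:
        # Stand-in for a unified data loader; synthesizes a
        # (time_steps, num_sensors) array of traffic readings.
        rng = np.random.default_rng(0)
        return rng.normal(loc=50.0, scale=10.0, size=(288, 207))

    def run_experiment(cfg: ExperimentConfig) -> float:
        data = load_dataset(cfg.dataset)
        history, future = data[:-cfg.horizon], data[-cfg.horizon:]
        preds = MODEL_REGISTRY[cfg.model](history, cfg.horizon)
        return float(np.abs(preds - future).mean())  # shared MAE metric

    if __name__ == "__main__":
        cfg = ExperimentConfig(task="traffic_state_pred",
                               model="HistoricalAverage",
                               dataset="toy_random")
        print(f"{cfg.model} on {cfg.dataset}: MAE = {run_experiment(cfg):.3f}")

In a full library the registry, loaders, and evaluation metrics are driven by configuration files, which is what allows the 65 reproduced models to be compared under identical conditions.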
Related papers
- XXLTraffic: Expanding and Extremely Long Traffic Dataset for Ultra-Dynamic Forecasting Challenges [3.7509821052818118]
XXLTraffic is the largest available public traffic dataset, with the longest timespan and an increasing number of sensor nodes.
Our dataset supplements existing spatio-temporal data resources and leads to new research directions in this domain.
arXiv Detail & Related papers (2024-06-18T15:06:22Z) - A Survey of Generative Techniques for Spatial-Temporal Data Mining [93.55501980723974]
This paper focuses on the integration of generative techniques into spatial-temporal data mining.
The paper provides a comprehensive analysis of generative technique-based spatial-temporal methods.
It also introduces a standardized framework specifically designed for the spatial-temporal data mining pipeline.
arXiv Detail & Related papers (2024-05-15T12:07:43Z) - UrbanGPT: Spatio-Temporal Large Language Models [34.79169613947957]
We present UrbanGPT, which seamlessly integrates a spatio-temporal encoder with the instruction-tuning paradigm.
We conduct extensive experiments on various public datasets, covering different spatio-temporal prediction tasks.
The results demonstrate that UrbanGPT, with its carefully designed architecture, consistently outperforms state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-25T12:37:29Z) - Spatial-temporal Forecasting for Regions without Observations [13.805203053973772]
We study spatial-temporal forecasting for a region of interest without any historical observations.
We propose a model named STSM for the task.
Our key insight is to learn from the locations that resemble those in the region of interest.
arXiv Detail & Related papers (2024-01-19T06:26:05Z) - Large Models for Time Series and Spatio-Temporal Data: A Survey and
Outlook [95.32949323258251]
Temporal data, notably time series and spatio-temporal data, are prevalent in real-world applications.
Recent advances in large language and other foundational models have spurred their increased use in time series and spatio-temporal data mining.
arXiv Detail & Related papers (2023-10-16T09:06:00Z) - Pushing the Limits of Pre-training for Time Series Forecasting in the
CloudOps Domain [54.67888148566323]
We introduce three large-scale time series forecasting datasets from the cloud operations domain.
We show it is a strong zero-shot baseline and benefits from further scaling, both in model and dataset size.
Accompanying these datasets and results is a suite of comprehensive benchmark results comparing classical and deep learning baselines to our pre-trained method.
arXiv Detail & Related papers (2023-10-08T08:09:51Z) - Unified Data Management and Comprehensive Performance Evaluation for
Urban Spatial-Temporal Prediction [Experiment, Analysis & Benchmark] [78.05103666987655]
This work addresses challenges in accessing and utilizing diverse urban spatial-temporal datasets.
We introduce atomic files, a unified storage format designed for urban spatial-temporal big data, and validate its effectiveness on 40 diverse datasets (a short reading sketch follows this list).
We conduct extensive experiments using diverse models and datasets, establishing a performance leaderboard and identifying promising research directions.
arXiv Detail & Related papers (2023-08-24T16:20:00Z) - OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive
Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectories, human motion, driving scenes, traffic flow, and weather forecasting.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent-based models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z) - Survey of Federated Learning Models for Spatial-Temporal Mobility
Applications [9.896508514316812]
Federated learning (FL) can serve as an ideal candidate for training spatial temporal models.
There are unique challenges involved with transitioning existing spatial temporal models to decentralized learning.
arXiv Detail & Related papers (2023-05-09T08:26:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.