PTR: A Pre-trained Language Model for Trajectory Recovery
- URL: http://arxiv.org/abs/2410.14281v1
- Date: Fri, 18 Oct 2024 08:38:12 GMT
- Title: PTR: A Pre-trained Language Model for Trajectory Recovery
- Authors: Tonglong Wei, Yan Lin, Youfang Lin, Shengnan Guo, Jilin Hu, Gao Cong, Huaiyu Wan
- Abstract summary: We propose a framework called PTR to mitigate the issue of limited dense trajectory data.
PTR incorporates an explicit trajectory prompt and is trained on datasets with multiple sampling intervals.
We also present a trajectory embedder that encodes trajectory points and transforms the embeddings of both observed and missing points into a format comprehensible to PLMs.
- Score: 31.08861372332931
- Abstract: Spatiotemporal trajectory data is vital for web-of-things services and is extensively collected and analyzed by web-based hardware and platforms. However, issues such as service interruptions and network instability often lead to sparsely recorded trajectories, resulting in a loss of detailed movement data. As a result, recovering these trajectories to restore missing information becomes essential. Despite progress, several challenges remain unresolved. First, the lack of large-scale dense trajectory data hampers the performance of existing deep learning methods, which rely heavily on abundant data for supervised training. Second, current methods struggle to generalize across sparse trajectories with varying sampling intervals, necessitating separate re-training for each interval and increasing computational costs. Third, external factors crucial for the recovery of missing points are not fully incorporated. To address these challenges, we propose a framework called PTR. This framework mitigates the issue of limited dense trajectory data by leveraging the capabilities of pre-trained language models (PLMs). PTR incorporates an explicit trajectory prompt and is trained on datasets with multiple sampling intervals, enabling it to generalize effectively across different intervals in sparse trajectories. To capture external factors, we introduce an implicit trajectory prompt that models road conditions, providing richer information for recovering missing points. Additionally, we present a trajectory embedder that encodes trajectory points and transforms the embeddings of both observed and missing points into a format comprehensible to PLMs. Experimental results on two public trajectory datasets with three sampling intervals demonstrate the efficacy and scalability of PTR.
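To make the idea concrete, below is a minimal, hypothetical sketch of the recovery pipeline described in the abstract: sparse trajectory points are projected into a pre-trained language model's hidden space, missing points are replaced by a learned mask embedding, learnable prompt tokens are prepended, and a lightweight head regresses the coordinates of every point. This is not the authors' implementation; the GPT-2 backbone, the TrajectoryEmbedder class, and all hyperparameters are illustrative assumptions.

```python
# Illustrative sketch only (not the PTR code): wire a trajectory embedder
# and prompt tokens into a frozen pre-trained language model backbone.
import torch
import torch.nn as nn
from transformers import GPT2Model


class TrajectoryEmbedder(nn.Module):
    """Projects (lat, lon, t) points into the PLM's hidden space and
    substitutes a learned mask embedding for missing points."""

    def __init__(self, hidden_size: int, n_prompt: int = 4):
        super().__init__()
        self.point_proj = nn.Linear(3, hidden_size)                     # (lat, lon, t) -> hidden
        self.mask_embed = nn.Parameter(torch.randn(hidden_size))        # placeholder for missing points
        self.prompt = nn.Parameter(torch.randn(n_prompt, hidden_size))  # explicit trajectory prompt tokens

    def forward(self, points, observed_mask):
        # points: (B, L, 3); observed_mask: (B, L) with 1 = observed, 0 = missing
        x = self.point_proj(points)
        x = torch.where(observed_mask.unsqueeze(-1).bool(), x,
                        self.mask_embed.expand_as(x))
        prompt = self.prompt.unsqueeze(0).expand(points.size(0), -1, -1)
        return torch.cat([prompt, x], dim=1)                            # prepend prompt tokens


plm = GPT2Model.from_pretrained("gpt2")        # frozen pre-trained LM backbone (assumed choice)
for p in plm.parameters():
    p.requires_grad = False

embedder = TrajectoryEmbedder(plm.config.hidden_size)
head = nn.Linear(plm.config.hidden_size, 2)    # regress (lat, lon) of each trajectory point

points = torch.randn(2, 16, 3)                 # toy batch: 2 trajectories, 16 points each
observed = (torch.rand(2, 16) > 0.5).float()   # random sparsity pattern

hidden = plm(inputs_embeds=embedder(points, observed)).last_hidden_state
recovered = head(hidden[:, embedder.prompt.size(0):])   # drop prompt positions, predict coordinates
print(recovered.shape)                                  # torch.Size([2, 16, 2])
```

In this sketch only the embedder and the regression head would be trained, which mirrors the paper's motivation of reusing a PLM's capabilities when dense trajectory data is scarce; how PTR actually formulates its explicit and implicit prompts is described in the paper itself.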
Related papers
- Towards Stable and Storage-efficient Dataset Distillation: Matching Convexified Trajectory [53.37473225728298]
The rapid evolution of deep learning and large language models has led to an exponential growth in the demand for training data.
Matching Training Trajectories (MTT) has been a prominent approach, which replicates the training trajectory of an expert network on real data with a synthetic dataset.
We introduce a novel method called Matching Convexified Trajectory (MCT), which aims to provide better guidance for the student trajectory.
arXiv Detail & Related papers (2024-06-28T11:06:46Z) - TraceMesh: Scalable and Streaming Sampling for Distributed Traces [51.08892669409318]
TraceMesh is a scalable and streaming sampler for distributed traces.
It accommodates previously unseen trace features in a unified and streamlined way.
TraceMesh outperforms state-of-the-art methods by a significant margin in both sampling accuracy and efficiency.
arXiv Detail & Related papers (2024-06-11T06:13:58Z) - Deep Learning for Trajectory Data Management and Mining: A Survey and Beyond [58.63558696061679]
Trajectory computing is crucial in various practical applications such as location services, urban traffic, and public safety.
We present a review of the development and recent advances in deep learning for trajectory computing (DL4Traj).
Notably, we encapsulate recent advancements in Large Language Models (LLMs) that hold potential to augment trajectory computing.
arXiv Detail & Related papers (2024-03-21T05:57:27Z) - UVTM: Universal Vehicle Trajectory Modeling with ST Feature Domain Generation [34.918489559139715]
The Universal Vehicle Trajectory Model (UVTM) is designed to support different tasks based on incomplete or sparse trajectories.
To handle sparse trajectories effectively, UVTM is pre-trained by reconstructing densely sampled trajectories from sparsely sampled ones.
arXiv Detail & Related papers (2024-02-11T15:49:50Z) - Improving Transferability for Cross-domain Trajectory Prediction via
Neural Stochastic Differential Equation [41.09061877498741]
Discrepancies exist among datasets due to external factors and data acquisition strategies.
Models that perform proficiently on large-scale datasets show limited transferability to other, smaller datasets.
We propose a method based on the continuous and stochastic representations of Neural Stochastic Differential Equations (NSDE) to alleviate these discrepancies.
The effectiveness of our method is validated against state-of-the-art trajectory prediction models on the popular benchmark datasets: nuScenes, Argoverse, Lyft, INTERACTION, and Open Motion dataset.
arXiv Detail & Related papers (2023-12-26T06:50:29Z) - A Critical Perceptual Pre-trained Model for Complex Trajectory Recovery [27.347708962204713]
This work is dedicated to offering a more robust trajectory recovery for complex trajectories.
We propose a Multi-view Graph and Complexity Aware Transformer (MGCAT) model to encode these semantics in trajectory pre-training.
The results prove that our method learns better representations for trajectory recovery, with 5.22% higher F1-score overall and 8.16% higher F1-score for complex trajectories particularly.
arXiv Detail & Related papers (2023-11-05T12:20:39Z) - SPOT: Scalable 3D Pre-training via Occupancy Prediction for Learning Transferable 3D Representations [76.45009891152178]
The pretraining-finetuning approach can alleviate the labeling burden by fine-tuning a pre-trained backbone across various downstream datasets and tasks.
We show, for the first time, that general representation learning can be achieved through the task of occupancy prediction.
Our findings will facilitate the understanding of LiDAR points and pave the way for future advancements in LiDAR pre-training.
arXiv Detail & Related papers (2023-09-19T11:13:01Z) - Spatio-Temporal Contrastive Self-Supervised Learning for POI-level Crowd
Flow Inference [23.8192952068949]
We present a novel Contrastive Self-learning framework for Spatio-Temporal data (CSST).
Our approach initiates with the construction of a spatial adjacency graph founded on the Points of Interest (POIs) and their respective distances.
We adopt a swapped prediction approach to anticipate the representation of the target subgraph from similar instances.
Our experiments, conducted on two real-world datasets, demonstrate that the CSST pre-trained on extensive noisy data consistently outperforms models trained from scratch.
arXiv Detail & Related papers (2023-09-06T02:51:24Z) - Pre-training General Trajectory Embeddings with Maximum Multi-view
Entropy Coding [36.18788551389281]
Trajectory embeddings can improve task performance but may incur high computational costs and face limited training data availability.
Existing trajectory embedding methods face difficulties in learning general embeddings due to biases towards certain downstream tasks.
We propose Maximum Multi-view Trajectory Entropy Coding (MMTEC) for learning general, comprehensive trajectory embeddings.
arXiv Detail & Related papers (2022-07-29T08:16:20Z) - PreTraM: Self-Supervised Pre-training via Connecting Trajectory and Map [58.53373202647576]
We propose PreTraM, a self-supervised pre-training scheme for trajectory forecasting.
It consists of two parts: 1) Trajectory-Map Contrastive Learning, where we project trajectories and maps to a shared embedding space with cross-modal contrastive learning, and 2) Map Contrastive Learning, where we enhance map representation with contrastive learning on large quantities of HD-maps.
On top of popular baselines such as AgentFormer and Trajectron++, PreTraM boosts their performance by 5.5% and 6.9% relatively in FDE-10 on the challenging nuScenes dataset.
arXiv Detail & Related papers (2022-04-21T23:01:21Z) - PnPNet: End-to-End Perception and Prediction with Tracking in the Loop [82.97006521937101]
We tackle the problem of joint perception and motion forecasting in the context of self-driving vehicles.
We propose PnPNet, an end-to-end model that takes sensor data as input and outputs, at each time step, object tracks and their future trajectories.
arXiv Detail & Related papers (2020-05-29T17:57:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.