TrackGPT -- A generative pre-trained transformer for cross-domain entity
trajectory forecasting
- URL: http://arxiv.org/abs/2402.00066v1
- Date: Mon, 29 Jan 2024 20:05:14 GMT
- Title: TrackGPT -- A generative pre-trained transformer for cross-domain entity
trajectory forecasting
- Authors: Nicholas Stroh
- Abstract summary: We introduce TrackGPT, a Generative Pre-trained Transformer (GPT)-based model for entity trajectory forecasting.
TrackGPT stands as a pioneering GPT model capable of producing accurate predictions across diverse entity time series datasets.
We present benchmarks against state-of-the-art deep learning techniques, showing that TrackGPT's forecasting capability excels in terms of accuracy, reliability, and modularity.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The forecasting of entity trajectories at future points in time is a critical
capability gap in applications across both Commercial and Defense sectors.
Transformers, and specifically Generative Pre-trained Transformer (GPT)
networks have recently revolutionized several fields of Artificial
Intelligence, most notably Natural Language Processing (NLP) with the advent of
Large Language Models (LLM) like OpenAI's ChatGPT. In this research paper, we
introduce TrackGPT, a GPT-based model for entity trajectory forecasting that
has shown utility across both maritime and air domains and that we expect to
perform well in others. TrackGPT stands as a pioneering GPT model capable of
producing accurate predictions across diverse entity time series datasets,
demonstrating proficiency in generating both long-term forecasts with sustained
accuracy and short-term forecasts with high precision. We present benchmarks
against state-of-the-art deep learning techniques, showing that TrackGPT's
forecasting capability excels in terms of accuracy, reliability, and
modularity. Importantly, TrackGPT achieves these results while remaining
domain-agnostic and requiring minimal data features (only location and time)
compared to models achieving similar performance. In conclusion, our findings
underscore the immense potential of applying GPT architectures to the task of
entity trajectory forecasting, exemplified by the innovative TrackGPT model.
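The abstract's central idea, forecasting from only location and time, can be illustrated with a minimal sketch of how trajectory points might be discretized into tokens for GPT-style autoregressive prediction. The grid size, encoding scheme, and function names below are illustrative assumptions, not details taken from the TrackGPT paper.

```python
# Hypothetical sketch: quantize (lat, lon) points into integer tokens so a
# GPT-style model can treat a trajectory as a token sequence. The grid
# resolution and encoding are illustrative choices, not TrackGPT's method.

GRID = 100  # assumed resolution: each coordinate quantized into 100 bins


def encode_point(lat, lon, grid=GRID):
    """Map a (lat, lon) pair to one integer token on a grid x grid lattice."""
    row = int((lat + 90.0) / 180.0 * (grid - 1))   # latitude in [-90, 90]
    col = int((lon + 180.0) / 360.0 * (grid - 1))  # longitude in [-180, 180]
    return row * grid + col


def decode_token(token, grid=GRID):
    """Invert encode_point back to the approximate bin-center (lat, lon)."""
    row, col = divmod(token, grid)
    lat = row / (grid - 1) * 180.0 - 90.0
    lon = col / (grid - 1) * 360.0 - 180.0
    return lat, lon


# A short maritime-style track: in training, the model would see tokens[:-1]
# and learn to predict tokens[1:], exactly like next-token prediction in NLP.
track = [(36.8, -76.3), (36.9, -76.1), (37.0, -75.9)]
tokens = [encode_point(lat, lon) for lat, lon in track]
```

In a setup like this, forecast horizon is just the number of tokens generated autoregressively, which is one way a single model could serve both short- and long-term prediction; time deltas would need their own tokens or embeddings, which this sketch omits.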
Related papers
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z)
- Generalizing Weather Forecast to Fine-grained Temporal Scales via Physics-AI Hybrid Modeling [55.13352174687475]
This paper proposes a physics-AI hybrid model (i.e., WeatherGFT) which Generalizes weather forecasts to Finer-grained Temporal scales.
Specifically, we employ a carefully designed PDE kernel to simulate physical evolution on a small time scale.
We introduce a lead time-aware training framework to promote the generalization of the model at different lead times.
arXiv Detail & Related papers (2024-05-22T16:21:02Z)
- TimeGPT in Load Forecasting: A Large Time Series Model Perspective [38.92798207166188]
Machine learning models have made significant progress in load forecasting, but their forecast accuracy is limited in cases where historical load data is scarce.
This paper aims to discuss the potential of large time series models in load forecasting with scarce historical data.
arXiv Detail & Related papers (2024-04-07T09:05:09Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- JRDB-Traj: A Dataset and Benchmark for Trajectory Forecasting in Crowds [79.00975648564483]
Trajectory forecasting models, employed in fields such as robotics, autonomous vehicles, and navigation, face challenges in real-world scenarios.
The JRDB-Traj dataset provides comprehensive data, including the locations of all agents, scene images, and point clouds, all from the robot's perspective.
The objective is to predict the future positions of agents relative to the robot using raw sensory input data.
arXiv Detail & Related papers (2023-11-05T18:59:31Z)
- SentimentGPT: Exploiting GPT for Advanced Sentiment Analysis and its Departure from Current Machine Learning [5.177947445379688]
This study presents a thorough examination of various Generative Pretrained Transformer (GPT) methodologies in sentiment analysis.
Three primary strategies are employed: 1) prompt engineering using the advanced GPT-3.5 Turbo, 2) fine-tuning GPT models, and 3) an inventive approach to embedding classification.
The research yields detailed comparative insights among these strategies and individual GPT models, revealing their unique strengths and potential limitations.
arXiv Detail & Related papers (2023-07-16T05:33:35Z)
- Event Stream GPT: A Data Pre-processing and Modeling Library for Generative, Pre-trained Transformers over Continuous-time Sequences of Complex Events [2.9330609943398525]
Event Stream GPT (ESGPT) is an open-source library designed to streamline the end-to-end process for building GPTs for continuous-time event sequences.
ESGPT allows users to build flexible, foundation-model scale input datasets by specifying only a minimal configuration file.
arXiv Detail & Related papers (2023-06-20T14:01:29Z)
- GPT-FL: Generative Pre-trained Model-Assisted Federated Learning [40.522864349440674]
GPT-FL is a generative pre-trained model-assisted federated learning framework.
It consistently outperforms state-of-the-art FL methods in terms of model test accuracy, communication efficiency, and client sampling efficiency.
arXiv Detail & Related papers (2023-06-03T22:57:59Z)
- Transforming Model Prediction for Tracking [109.08417327309937]
Transformers capture global relations with little inductive bias, allowing them to learn the prediction of more powerful target models.
We train the proposed tracker end-to-end and validate its performance by conducting comprehensive experiments on multiple tracking datasets.
Our tracker sets a new state of the art on three benchmarks, achieving an AUC of 68.5% on the challenging LaSOT dataset.
arXiv Detail & Related papers (2022-03-21T17:59:40Z)
- Pre-Trained Models: Past, Present and Future [126.21572378910746]
Large-scale pre-trained models (PTMs) have recently achieved great success and become a milestone in the field of artificial intelligence (AI).
By storing knowledge into huge parameters and fine-tuning on specific tasks, the rich knowledge implicitly encoded in huge parameters can benefit a variety of downstream tasks.
It is now the consensus of the AI community to adopt PTMs as backbone for downstream tasks rather than learning models from scratch.
arXiv Detail & Related papers (2021-06-14T02:40:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.