Flight Demand Forecasting with Transformers
- URL: http://arxiv.org/abs/2111.04471v1
- Date: Thu, 4 Nov 2021 22:00:12 GMT
- Title: Flight Demand Forecasting with Transformers
- Authors: Liya Wang, Amy Mykityshyn, Craig Johnson, Jillian Cheng
- Abstract summary: This research strives to improve prediction accuracy from two key aspects: better data sources and robust forecasting algorithms.
Inspired by the success of transformers, we adopted this technique to predict strategic flight departure demand in multiple horizons.
Case studies show that TFTs can perform better than traditional forecasting methods by large margins.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Transformers have become the de facto standard in the natural language
processing (NLP) field. They have also gained momentum in computer vision and
other domains. Transformers can enable artificial intelligence (AI) models to
dynamically focus on certain parts of their input and thus reason more
effectively. Inspired by the success of transformers, we adopted this technique
to predict strategic flight departure demand in multiple horizons. This work
was conducted in support of a MITRE-developed mobile application, Pacer, which
displays predicted departure demand to general aviation (GA) flight operators
so they can have better situation awareness of the potential for departure
delays during busy periods. Field demonstrations involving Pacer's previously
designed rule-based prediction method showed that the prediction accuracy of
departure demand still has room for improvement. This research strives to
improve prediction accuracy from two key aspects: better data sources and
robust forecasting algorithms. We leveraged two data sources, Aviation System
Performance Metrics (ASPM) and System Wide Information Management (SWIM), as
our input. We then trained forecasting models with temporal fusion transformer
(TFT) for five different airports. Case studies show that TFTs can perform
better than traditional forecasting methods by large margins, and they can
result in better prediction across diverse airports and with better
interpretability.
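The abstract frames departure demand as a multi-horizon forecasting problem: from a history window of observed counts, a model predicts demand for several future steps at once. A minimal sketch of that windowing in plain Python, with hypothetical window sizes and toy counts (the paper's actual TFT pipeline and ASPM/SWIM features are not reproduced here):

```python
def make_windows(series, history, horizons):
    """Slice a demand series into (encoder history, multi-horizon target) pairs."""
    pairs = []
    for t in range(history, len(series) - horizons + 1):
        past = series[t - history:t]       # what the model observes
        future = series[t:t + horizons]    # what it must predict
        pairs.append((past, future))
    return pairs

# Toy hourly departure counts for one airport (illustrative only).
demand = [4, 6, 9, 12, 10, 7, 5, 8, 11, 13]
pairs = make_windows(demand, history=4, horizons=2)
print(len(pairs))   # → 5 training examples
print(pairs[0])     # → ([4, 6, 9, 12], [10, 7])
```

Each pair mimics one training example for a multi-horizon forecaster such as a TFT: the encoder consumes the history window, and the decoder emits one prediction per horizon step.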
Related papers
- Airport Delay Prediction with Temporal Fusion Transformers [23.20853131797729]
This study proposes to apply the novel Temporal Fusion Transformer model and predict numerical airport arrival delays at quarter hour level for U.S. top 30 airports.
Inputs to our model include airport demand and capacity forecasts, historic airport operation efficiency information, airport wind and visibility conditions, as well as enroute weather and traffic conditions.
arXiv Detail & Related papers (2024-05-14T03:27:15Z)
- AMP: Autoregressive Motion Prediction Revisited with Next Token Prediction for Autonomous Driving [59.94343412438211]
We introduce the GPT style next token motion prediction into motion prediction.
Different from language data which is composed of homogeneous units -words, the elements in the driving scene could have complex spatial-temporal and semantic relations.
We propose to adopt three factorized attention modules with different neighbors for information aggregation and different position encoding styles to capture their relations.
arXiv Detail & Related papers (2024-03-20T06:22:37Z)
- Humanoid Locomotion as Next Token Prediction [84.21335675130021]
Our model is a causal transformer trained via autoregressive prediction of sensorimotor trajectories.
We show that our model enables a full-sized humanoid to walk in San Francisco zero-shot.
Our model can transfer to the real world even when trained on only 27 hours of walking data, and can generalize commands not seen during training like walking backward.
arXiv Detail & Related papers (2024-02-29T18:57:37Z)
- Multi-Agent Based Transfer Learning for Data-Driven Air Traffic Applications [1.588400000775528]
This paper proposes a Multi-Agent Bidirectional Representations from Transformers (MA-BERT) model that fully considers the multi-agent characteristic of the ATM system and learns air traffic controllers' decisions.
By pre-training the MA-BERT on a large dataset from a major airport and then fine-tuning it to other airports and specific air traffic applications, a large amount of the total training time can be saved.
arXiv Detail & Related papers (2024-01-23T22:21:07Z)
- Emergent Agentic Transformer from Chain of Hindsight Experience [96.56164427726203]
We show, for the first time, that a simple transformer-based model performs competitively with both temporal-difference and imitation-learning-based approaches.
arXiv Detail & Related papers (2023-05-26T00:43:02Z)
- Multi-Airport Delay Prediction with Transformers [0.0]
Temporal Fusion Transformer (TFT) was proposed to predict departure and arrival delays simultaneously for multiple airports.
This approach can capture complex temporal dynamics of the inputs known at the time of prediction and then forecast selected delay metrics up to four hours into the future.
arXiv Detail & Related papers (2021-11-04T21:58:11Z)
- An Empirical Study of Training End-to-End Vision-and-Language Transformers [50.23532518166621]
We present METER (Multimodal End-to-end TransformER), through which we investigate how to design and pre-train a fully transformer-based VL model.
Specifically, we dissect the model designs along multiple dimensions: vision encoders (e.g., CLIP-ViT, Swin transformer), text encoders (e.g., RoBERTa, DeBERTa), and multimodal fusion (e.g., merged attention vs. co-attention).
arXiv Detail & Related papers (2021-11-03T17:55:36Z)
- Transformers for prompt-level EMA non-response prediction [62.41658786277712]
Ecological Momentary Assessments (EMAs) are an important psychological data source for measuring cognitive states, affect, behavior, and environmental factors.
Non-response, in which participants fail to respond to EMA prompts, is an endemic problem.
The ability to accurately predict non-response could be utilized to improve EMA delivery and develop compliance interventions.
arXiv Detail & Related papers (2021-11-01T18:38:47Z)
- Efficient pre-training objectives for Transformers [84.64393460397471]
We study several efficient pre-training objectives for Transformers-based models.
We prove that eliminating the MASK token and computing the loss over the whole output are essential choices for improving performance.
arXiv Detail & Related papers (2021-04-20T00:09:37Z)
- Spatio-Temporal Data Mining for Aviation Delay Prediction [15.621546618044173]
We present a novel aviation delay prediction system based on stacked Long Short-Term Memory (LSTM) networks for commercial flights.
The system learns from historical trajectories from automatic dependent surveillance-broadcast (ADS-B) messages.
Compared with previous schemes, our approach is demonstrated to be more robust and accurate for large hub airports.
arXiv Detail & Related papers (2021-03-20T18:37:06Z)
- Deep Learning for Flight Demand Forecasting [0.0]
This research strives to improve prediction accuracy from two key aspects: better data sources and robust forecasting algorithms.
We trained forecasting models with DL techniques of sequence to sequence (seq2seq) and seq2seq with attention.
With better data sources, seq2seq with attention can reduce mean squared error (MSE) by over 60% compared to the classical autoregressive (AR) forecasting method.
arXiv Detail & Related papers (2020-11-06T16:46:19Z)
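The "over 60%" figure in the entry above is a relative MSE reduction against the AR baseline. A small helper makes the metric concrete; the residuals below are hypothetical, not the paper's data:

```python
def mse(errors):
    """Mean squared error from a list of forecast residuals."""
    return sum(e * e for e in errors) / len(errors)

def mse_reduction_pct(baseline_errors, model_errors):
    """Percentage reduction in MSE of a model relative to a baseline."""
    base, model = mse(baseline_errors), mse(model_errors)
    return 100.0 * (base - model) / base

# Illustrative residuals: an AR baseline vs. a seq2seq-with-attention model.
ar_resid = [3.0, -2.0, 4.0, -3.0]
attn_resid = [1.0, -1.0, 2.0, -1.0]
print(round(mse_reduction_pct(ar_resid, attn_resid), 1))  # → 81.6
```

Any reduction above 60.0 on this metric would match the kind of improvement the entry reports.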
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed and is not responsible for any consequences of its use.