Flight Demand Forecasting with Transformers
- URL: http://arxiv.org/abs/2111.04471v1
- Date: Thu, 4 Nov 2021 22:00:12 GMT
- Title: Flight Demand Forecasting with Transformers
- Authors: Liya Wang, Amy Mykityshyn, Craig Johnson, Jillian Cheng
- Abstract summary: This research strives to improve prediction accuracy from two key aspects: better data sources and robust forecasting algorithms.
Inspired by the success of transformers, we adopted this technique to predict strategic flight departure demand in multiple horizons.
Case studies show that TFTs can perform better than traditional forecasting methods by large margins.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Transformers have become the de facto standard in the natural language
processing (NLP) field. They have also gained momentum in computer vision and
other domains. Transformers can enable artificial intelligence (AI) models to
dynamically focus on certain parts of their input and thus reason more
effectively. Inspired by the success of transformers, we adopted this technique
to predict strategic flight departure demand in multiple horizons. This work
was conducted in support of a MITRE-developed mobile application, Pacer, which
displays predicted departure demand to general aviation (GA) flight operators
so they can have better situation awareness of the potential for departure
delays during busy periods. Field demonstrations involving Pacer's previously
designed rule-based prediction method showed that the prediction accuracy of
departure demand still has room for improvement. This research strives to
improve prediction accuracy from two key aspects: better data sources and
robust forecasting algorithms. We leveraged two data sources, Aviation System
Performance Metrics (ASPM) and System Wide Information Management (SWIM), as
our input. We then trained forecasting models with temporal fusion transformer
(TFT) for five different airports. Case studies show that TFTs can perform
better than traditional forecasting methods by large margins, and they can
result in better prediction across diverse airports and with better
interpretability.
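The abstract's core task, multi-horizon prediction of quarter-hour departure demand, can be illustrated with a minimal sketch. The synthetic series, the 96-bin daily season, and the seasonal-naive baseline below are all illustrative assumptions; the paper itself trains temporal fusion transformer (TFT) models on ASPM and SWIM data, not this toy method.

```python
import math

def seasonal_naive_forecast(history, horizon, season=96):
    """Predict each future step by repeating the value one season
    (96 quarter-hour bins = one day) earlier in the history."""
    return [history[len(history) - season + h] for h in range(horizon)]

def mse(actual, predicted):
    """Mean squared error between two equal-length sequences."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

# Two days of hypothetical quarter-hour departure counts with a daily cycle.
series = [max(0, round(5 + 4 * math.sin(2 * math.pi * t / 96)))
          for t in range(192)]
history, future = series[:96], series[96:]

for horizon in (4, 8, 16):  # 1, 2, and 4 hours ahead
    preds = seasonal_naive_forecast(history, horizon)
    print(f"horizon={horizon}: mse={mse(future[:horizon], preds):.2f}")
```

Because the toy series is exactly daily-periodic, the seasonal baseline scores an MSE of zero here; real demand data would not, which is the gap the TFT models aim to close.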
Related papers
- Unveil Benign Overfitting for Transformer in Vision: Training Dynamics, Convergence, and Generalization
We study the optimization of a Transformer composed of a self-attention layer with softmax followed by a fully connected layer under gradient descent on a certain data distribution model.
Our results establish a sharp condition that can distinguish between the small test error phase and the large test error regime, based on the signal-to-noise ratio in the data model.
arXiv Detail & Related papers (2024-09-28T13:24:11Z)
- Amelia: A Large Model and Dataset for Airport Surface Movement Forecasting
Amelia-48 is a large surface movement dataset collected using the System Wide Information Management (SWIM) Surface Movement Event Service (SMES).
Amelia-TF is a transformer-based next-token-prediction large multi-agent multi-airport trajectory forecasting model trained on 292 days.
It is validated on unseen airports with experiments showcasing the different prediction horizon lengths, ego-agent selection strategies, and training recipes.
arXiv Detail & Related papers (2024-07-30T20:50:48Z)
- Forecast-PEFT: Parameter-Efficient Fine-Tuning for Pre-trained Motion Forecasting Models
Forecast-PEFT is a fine-tuning strategy that freezes the majority of the model's parameters, focusing adjustments on newly introduced prompts and adapters.
Our experiments show that Forecast-PEFT outperforms traditional full fine-tuning methods in motion prediction tasks.
Forecast-FT further improves prediction performance, achieving up to a 9.6% improvement over conventional baseline methods.
arXiv Detail & Related papers (2024-07-28T19:18:59Z)
- Airport Delay Prediction with Temporal Fusion Transformers
This study proposes to apply the novel Temporal Fusion Transformer model to predict numerical airport arrival delays at the quarter-hour level for the top 30 U.S. airports.
Inputs to our model include airport demand and capacity forecasts, historical airport operational efficiency information, airport wind and visibility conditions, as well as en route weather and traffic conditions.
arXiv Detail & Related papers (2024-05-14T03:27:15Z)
- Emergent Agentic Transformer from Chain of Hindsight Experience
We show that a simple transformer-based model performs competitively with both temporal-difference and imitation-learning-based approaches, the first time this has been demonstrated.
arXiv Detail & Related papers (2023-05-26T00:43:02Z)
- Multi-Airport Delay Prediction with Transformers
A Temporal Fusion Transformer (TFT) was proposed to predict departure and arrival delays simultaneously for multiple airports.
This approach can capture complex temporal dynamics of the inputs known at the time of prediction and then forecast selected delay metrics up to four hours into the future.
arXiv Detail & Related papers (2021-11-04T21:58:11Z)
- An Empirical Study of Training End-to-End Vision-and-Language Transformers
We present METER (Multimodal End-to-end TransformER), through which we investigate how to design and pre-train a fully transformer-based VL model.
Specifically, we dissect the model designs along multiple dimensions: vision encoders (e.g., CLIP-ViT, Swin transformer), text encoders (e.g., RoBERTa, DeBERTa), and multimodal fusion (e.g., merged attention vs. co-attention).
arXiv Detail & Related papers (2021-11-03T17:55:36Z)
- Transformers for prompt-level EMA non-response prediction
Ecological Momentary Assessments (EMAs) are an important psychological data source for measuring cognitive states, affect, behavior, and environmental factors.
Non-response, in which participants fail to respond to EMA prompts, is an endemic problem.
The ability to accurately predict non-response could be utilized to improve EMA delivery and develop compliance interventions.
arXiv Detail & Related papers (2021-11-01T18:38:47Z)
- Efficient pre-training objectives for Transformers
We study several efficient pre-training objectives for Transformers-based models.
We prove that eliminating the MASK token and computing the loss over the whole output are essential choices for improving performance.
arXiv Detail & Related papers (2021-04-20T00:09:37Z)
- Spatio-Temporal Data Mining for Aviation Delay Prediction
We present a novel aviation delay prediction system based on stacked Long Short-Term Memory (LSTM) networks for commercial flights.
The system learns from historical trajectories from automatic dependent surveillance-broadcast (ADS-B) messages.
Compared with previous schemes, our approach is demonstrated to be more robust and accurate for large hub airports.
arXiv Detail & Related papers (2021-03-20T18:37:06Z)
- Deep Learning for Flight Demand Forecasting
This research strives to improve prediction accuracy from two key aspects: better data sources and robust forecasting algorithms.
We trained forecasting models with DL techniques of sequence to sequence (seq2seq) and seq2seq with attention.
With better data sources, seq2seq with attention can reduce mean squared error (MSE) by over 60% compared to the classical autoregressive (AR) forecasting method.
arXiv Detail & Related papers (2020-11-06T16:46:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.