Short-Term Multi-Horizon Line Loss Rate Forecasting of a Distribution
Network Using Attention-GCN-LSTM
- URL: http://arxiv.org/abs/2312.11898v1
- Date: Tue, 19 Dec 2023 06:47:22 GMT
- Title: Short-Term Multi-Horizon Line Loss Rate Forecasting of a Distribution
Network Using Attention-GCN-LSTM
- Authors: Jie Liu, Yijia Cao, Yong Li, Yixiu Guo, and Wei Deng
- Abstract summary: We propose Attention-GCN-LSTM, a novel method that combines Graph Convolutional Networks (GCN), Long Short-Term Memory (LSTM) and a three-level attention mechanism.
Our model enables accurate forecasting of line loss rates across multiple horizons.
- Score: 9.460123100630158
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Accurately predicting line loss rates is vital for effective line loss
management in distribution networks, especially over short-term multi-horizons
ranging from one hour to one week. In this study, we propose
Attention-GCN-LSTM, a novel method that combines Graph Convolutional Networks
(GCN), Long Short-Term Memory (LSTM), and a three-level attention mechanism to
address this challenge. By capturing spatial and temporal dependencies, our
model enables accurate forecasting of line loss rates across multiple horizons.
Through comprehensive evaluation on real-world data from 10 kV feeders, our
Attention-GCN-LSTM model consistently outperforms existing algorithms,
exhibiting superior performance in terms of prediction accuracy and
multi-horizon forecasting. This model holds significant promise for enhancing
line loss management in distribution networks.
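The overall architecture described in the abstract, a graph convolution over the feeder topology at each time step, an LSTM over the resulting sequence, and attention over the hidden states, can be sketched in NumPy. This is a minimal illustrative sketch with toy shapes, random weights, and a toy attention score; it is not the authors' model, which uses a trained three-level attention mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A, X, W):
    """One graph-convolution step: symmetrically normalised
    adjacency (with self-loops) times node features times weights."""
    A_hat = A + np.eye(A.shape[0])                # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.tanh(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W)

def lstm_step(x, h, c, Wx, Wh, b):
    """A single LSTM cell step, gates stacked as [i, f, o, g]."""
    z = x @ Wx + h @ Wh + b
    H = h.shape[-1]
    i, f, o = (1.0 / (1.0 + np.exp(-z[..., k*H:(k+1)*H])) for k in range(3))
    g = np.tanh(z[..., 3*H:])
    c = f * c + i * g
    return o * np.tanh(c), c

def temporal_attention(H_seq):
    """Score each time step, softmax the scores, return weighted sum."""
    scores = H_seq.sum(axis=-1)                   # toy scoring function
    w = np.exp(scores - scores.max())
    w = w / w.sum()
    return (w[:, None] * H_seq).sum(axis=0)

# Toy data: 4 feeder nodes, 6 hourly steps, 3 features per node.
N, T, F, H = 4, 6, 3, 8
A = (rng.random((N, N)) > 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T                    # symmetric, no self-loops
X_seq = rng.standard_normal((T, N, F))

Wg = rng.standard_normal((F, H)) * 0.1            # GCN weights
Wx = rng.standard_normal((H, 4 * H)) * 0.1        # LSTM input weights
Wh = rng.standard_normal((H, 4 * H)) * 0.1        # LSTM recurrent weights
b = np.zeros(4 * H)
w_out = rng.standard_normal(H) * 0.1              # readout weights

h = np.zeros((N, H)); c = np.zeros((N, H))
hs = []
for t in range(T):
    s = gcn_layer(A, X_seq[t], Wg)                # spatial step
    h, c = lstm_step(s, h, c, Wx, Wh, b)          # temporal step
    hs.append(h)
ctx = temporal_attention(np.stack(hs)[:, 0, :])   # attend over node 0's history
print(ctx @ w_out)                                # one-step line-loss-rate score
```

With trained weights and the paper's three attention levels in place of the toy scorer, the same loop structure yields multi-horizon forecasts by reading out several steps ahead.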
Related papers
- Satellite Federated Edge Learning: Architecture Design and Convergence Analysis [47.057886812985984]
This paper introduces a novel FEEL algorithm, named FEDMEGA, tailored to mega-constellation networks.
By integrating inter-satellite links (ISL) for intra-orbit model aggregation, the proposed algorithm significantly reduces the usage of low data rate and intermittent GSL.
Our proposed method includes a ring all-reduce based intra-orbit aggregation mechanism, coupled with a network flow-based transmission scheme for global model aggregation.
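The ring all-reduce primitive underlying the intra-orbit aggregation is a standard collective: each of k nodes ends up with the element-wise sum of all vectors using 2(k-1) chunk transfers per node instead of shipping whole models to one aggregator. A plain-Python sketch of the generic algorithm, not the authors' FEDMEGA implementation:

```python
def ring_all_reduce(vectors):
    """Ring all-reduce over k nodes arranged in a cycle.
    Returns each node's buffer; all end up equal to the sum."""
    k = len(vectors)
    n = len(vectors[0])
    assert n % k == 0, "pad vectors so chunks divide evenly"
    chunk = n // k
    buf = [list(v) for v in vectors]          # each node's local buffer

    # Phase 1: reduce-scatter. At step s, node i forwards chunk
    # (i - s) mod k; after k-1 steps node i holds the full sum
    # for chunk (i + 1) mod k.
    for step in range(k - 1):
        sends = []                            # snapshot before applying
        for i in range(k):
            c = (i - step) % k
            lo = c * chunk
            sends.append((i, lo, buf[i][lo:lo + chunk]))
        for i, lo, data in sends:
            dst = (i + 1) % k
            for j, v in enumerate(data):
                buf[dst][lo + j] += v

    # Phase 2: all-gather. Circulate each completed chunk around
    # the ring, overwriting stale partial sums.
    for step in range(k - 1):
        sends = []
        for i in range(k):
            c = (i + 1 - step) % k
            lo = c * chunk
            sends.append((i, lo, buf[i][lo:lo + chunk]))
        for i, lo, data in sends:
            dst = (i + 1) % k
            buf[dst][lo:lo + chunk] = data
    return buf

out = ring_all_reduce([[1, 2, 3, 4], [10, 20, 30, 40]])
print(out)   # every node holds [11, 22, 33, 44]
```

Within an orbit, the satellites naturally form such a ring over their inter-satellite links, which is what makes this collective a good fit for intra-orbit model aggregation.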
arXiv Detail & Related papers (2024-04-02T11:59:58Z)
- PreRoutGNN for Timing Prediction with Order Preserving Partition: Global Circuit Pre-training, Local Delay Learning and Attentional Cell Modeling [84.34811206119619]
We propose a two-stage approach to pre-routing timing prediction.
First, we propose global circuit training to pre-train a graph auto-encoder that learns the global graph embedding from circuit netlist.
Second, we use a novel node updating scheme for message passing on GCN, following the topological sorting sequence of the learned graph embedding and circuit graph.
Experiments on 21 real-world circuits achieve a new SOTA R2 of 0.93 for slack prediction, significantly surpassing the previous SOTA of 0.59.
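Updating nodes "following the topological sorting sequence" means a node is touched only after all of its fan-in nodes are final, which mirrors how signal delay accumulates through a circuit. A hypothetical stdlib-only sketch of that scheduling idea (using a max-aggregation stand-in for the paper's learned message-passing update):

```python
from collections import deque

def topo_order(n, edges):
    """Kahn's algorithm: return nodes in topological order."""
    indeg = [0] * n
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    q = deque(i for i in range(n) if indeg[i] == 0)
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    return order

def propagate(n, edges, feat):
    """Update each node only after all of its predecessors are
    finalised, so information flows once, input to output."""
    preds = [[] for _ in range(n)]
    for u, v in edges:
        preds[v].append(u)
    h = dict(feat)                     # node -> scalar embedding
    for v in topo_order(n, edges):
        if preds[v]:                   # aggregate finalised predecessors
            h[v] = h[v] + max(h[u] for u in preds[v])
    return h

# Tiny "circuit": gates 0 and 1 feed 2, which feeds 3.
edges = [(0, 2), (1, 2), (2, 3)]
h = propagate(4, edges, {0: 1.0, 1: 3.0, 2: 0.5, 3: 0.2})
print(h[3])   # 0.2 + (0.5 + 3.0): delay accumulated along the worst path
```

The max-over-predecessors update makes the toy compute a longest-path delay; the paper replaces this with a learned GCN update while keeping the same topological schedule.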
arXiv Detail & Related papers (2024-02-27T02:23:07Z)
- 1D-CapsNet-LSTM: A Deep Learning-Based Model for Multi-Step Stock Index Forecasting [6.05458608266581]
This study investigates the potential of integrating a 1D CapsNet with an LSTM network for multi-step stock index forecasting.
To this end, a hybrid 1D-CapsNet-LSTM model is introduced, which utilizes a 1D CapsNet to generate high-level capsules.
The proposed 1D-CapsNet-LSTM model consistently outperforms baseline models in two key aspects.
arXiv Detail & Related papers (2023-10-03T14:33:34Z)
- Optimization Guarantees of Unfolded ISTA and ADMM Networks With Smooth Soft-Thresholding [57.71603937699949]
We study optimization guarantees, i.e., achieving near-zero training loss as the number of training epochs increases.
We show that the threshold on the number of training samples increases with the increase in the network width.
arXiv Detail & Related papers (2023-09-12T13:03:47Z)
- STG-GAN: A spatiotemporal graph generative adversarial networks for short-term passenger flow prediction in urban rail transit systems [11.167132464665578]
Short-term passenger flow prediction is an important but challenging task for better managing urban rail transit systems.
We propose a novel deep learning-based spatiotemporal graph generative adversarial network (STG-GAN) model with higher prediction accuracy, higher efficiency, and lower memory occupancy.
This study can provide critical experience in conducting short-term passenger flow predictions, especially from the perspective of real-world applications.
arXiv Detail & Related papers (2022-02-10T13:18:11Z)
- Long Short-Term Memory Neural Network for Financial Time Series [0.0]
We present an ensemble of independent and parallel long short-term memory neural networks for the prediction of stock price movement.
With a straightforward trading strategy, comparisons with a randomly chosen portfolio and a portfolio containing all the stocks in the index show that the portfolio resulting from the LSTM ensemble provides better average daily returns and higher cumulative returns over time.
arXiv Detail & Related papers (2022-01-20T15:17:26Z)
- Semi-supervised Network Embedding with Differentiable Deep Quantisation [81.49184987430333]
We develop d-SNEQ, a differentiable quantisation method for network embedding.
d-SNEQ incorporates a rank loss to equip the learned quantisation codes with rich high-order information.
It is able to substantially compress the size of trained embeddings, thus reducing storage footprint and accelerating retrieval speed.
arXiv Detail & Related papers (2021-08-20T11:53:05Z)
- Short-Term Electricity Price Forecasting based on Graph Convolution Network and Attention Mechanism [5.331757100806177]
This paper tailors a spectral graph convolutional network (GCN) to greatly improve the accuracy of short-term LMP forecasting.
A three-branch network structure is then designed to match the structure of LMPs' compositions.
Case studies based on the IEEE-118 test system and real-world data from the PJM validate that the proposed model outperforms existing forecasting models in accuracy.
arXiv Detail & Related papers (2021-07-26T15:44:07Z)
- Low-Rank Temporal Attention-Augmented Bilinear Network for financial time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
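The low-rank idea can be illustrated generically: factor a dense weight matrix W into two thin factors of rank r, cutting parameters from m·n to r·(m+n) and speeding up the matrix-vector product. A NumPy sketch of this general technique, not the paper's TABL model:

```python
import numpy as np

rng = np.random.default_rng(1)

# A dense "weight matrix" constructed to have intrinsic rank 3.
m, n, r = 40, 30, 3
W = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Truncated SVD gives the best rank-r approximation (Eckart-Young).
U, s, Vt = np.linalg.svd(W, full_matrices=False)
U_r = U[:, :r] * s[:r]         # (m, r) factor, singular values folded in
V_r = Vt[:r, :]                # (r, n) factor

x = rng.standard_normal(n)
full = W @ x                   # m*n multiply-adds
low = U_r @ (V_r @ x)          # r*(m + n) multiply-adds

params_full = m * n
params_low = r * (m + n)
err = np.linalg.norm(full - low) / np.linalg.norm(full)
print(params_full, params_low, err)   # 1200 vs 210 parameters; err ~ 0 here
```

In practice the factors are trained directly rather than obtained by SVD, and the error depends on how well the true weights are approximated by a rank-r matrix; here W is exactly rank 3, so the approximation is essentially exact.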
arXiv Detail & Related papers (2021-07-05T10:15:23Z)
- Deep Stock Predictions [58.720142291102135]
We consider the design of a trading strategy that performs portfolio optimization using Long Short Term Memory (LSTM) neural networks.
We then customize the loss function used to train the LSTM to increase the profit earned.
We find the LSTM model with the customized loss function to have an improved performance in the training bot over a regressive baseline such as ARIMA.
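One common way to customize a loss for trading, penalising errors where the predicted return has the wrong sign more than errors of magnitude, can be sketched as follows. This is an illustrative loss, not the authors' exact formulation:

```python
import numpy as np

def directional_loss(y_true, y_pred, penalty=2.0):
    """MSE, but squared errors where the predicted return has the
    wrong sign are scaled by `penalty`, biasing the model toward
    getting the direction of the move right."""
    err = (y_true - y_pred) ** 2
    wrong_sign = np.sign(y_true) != np.sign(y_pred)
    weights = np.where(wrong_sign, penalty, 1.0)
    return float(np.mean(weights * err))

y_true = np.array([0.01, -0.02, 0.03])     # realised returns
y_pred = np.array([0.02, 0.01, 0.02])      # second prediction: wrong direction
mse = float(np.mean((y_true - y_pred) ** 2))
print(directional_loss(y_true, y_pred) > mse)   # True: wrong sign penalised
```

With `penalty=1.0` the loss reduces to plain MSE, so the weighting is the only change; any such smooth reweighting can be dropped into gradient-based LSTM training directly.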
arXiv Detail & Related papers (2020-06-08T23:37:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.