Analytical Verification of Performance of Deep Neural Network Based
Time-Synchronized Distribution System State Estimation
- URL: http://arxiv.org/abs/2311.06973v4
- Date: Thu, 22 Feb 2024 16:33:10 GMT
- Title: Analytical Verification of Performance of Deep Neural Network Based
Time-Synchronized Distribution System State Estimation
- Authors: Behrouz Azimian, Shiva Moshtagh, Anamitra Pal, Shanshan Ma
- Abstract summary: Recently, we demonstrated the success of a time-synchronized state estimator using deep neural networks (DNNs).
In this letter, we provide analytical bounds on the performance of that state estimator as a function of perturbations in the input measurements.
- Score: 0.18726646412385334
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, we demonstrated the success of a time-synchronized state estimator
using deep neural networks (DNNs) for real-time unobservable distribution
systems. In this letter, we provide analytical bounds on the performance of
that state estimator as a function of perturbations in the input measurements.
It has already been shown that evaluating performance based only on the test
dataset might not effectively indicate a trained DNN's ability to handle input
perturbations. As such, we analytically verify robustness and trustworthiness
of DNNs to input perturbations by treating them as mixed-integer linear
programming (MILP) problems. The ability of batch normalization to address the
scalability limitations of the MILP formulation is also highlighted. The
framework is validated by performing time-synchronized distribution system
state estimation for a modified IEEE 34-node system and a real-world large
distribution system, both of which are incompletely observed by micro-phasor
measurement units.
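As a rough illustration of this kind of verification (not the authors' exact formulation), the sketch below uses the standard big-M mixed-integer encoding of ReLU units to bound the output of a tiny network over an l-infinity box of input perturbations; the weights, perturbation budget, and big-M constant are assumed for illustration, and in practice batch normalization can be folded into the preceding affine layer so it adds no integer variables to the encoding.

```python
# Hypothetical sketch (assumed weights and bounds), not the authors' code:
# big-M MILP encoding of a 2-2-1 ReLU network, maximizing its output over an
# l_inf perturbation box around a nominal measurement vector.
from pulp import LpProblem, LpMaximize, LpVariable, LpBinary, value, PULP_CBC_CMD

W1 = [[0.6, -0.4], [0.3, 0.8]]   # hidden-layer weights (illustrative)
b1 = [0.1, -0.2]                 # hidden-layer biases  (illustrative)
W2 = [0.5, -0.7]                 # output-layer weights (illustrative)
b2 = 0.05
x0 = [1.0, 0.5]                  # nominal input measurements
eps = 0.05                       # l_inf perturbation budget
M = 100.0                        # big-M constant; must dominate pre-activations

prob = LpProblem("dnn_output_upper_bound", LpMaximize)

# Perturbed inputs live in the epsilon-box around the nominal point.
x = [LpVariable(f"x{i}", lowBound=x0[i] - eps, upBound=x0[i] + eps)
     for i in range(2)]

h = []  # post-ReLU hidden activations
for j in range(2):
    z = LpVariable(f"z{j}", lowBound=-M, upBound=M)  # pre-activation
    a = LpVariable(f"a{j}", lowBound=0, upBound=M)   # ReLU output
    d = LpVariable(f"d{j}", cat=LpBinary)            # ReLU on/off indicator
    prob += z == W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]
    # Big-M encoding of a = max(0, z)
    prob += a >= z
    prob += a <= z + M * (1 - d)
    prob += a <= M * d
    h.append(a)

# Objective: worst-case (largest) network output over the perturbation box.
prob += W2[0] * h[0] + W2[1] * h[1] + b2

prob.solve(PULP_CBC_CMD(msg=0))
print("worst-case output:", value(prob.objective))
```

Solving the same program with LpMinimize gives the lower bound, so the pair of solves brackets the estimate under the assumed perturbation budget.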
Related papers
- Uncertainty-Aware Deep Attention Recurrent Neural Network for
Heterogeneous Time Series Imputation [0.25112747242081457]
Missingness is ubiquitous in multivariate time series and poses an obstacle to reliable downstream analysis.
We propose DEep Attention Recurrent Imputation (DEARI), which jointly estimates missing values and their associated uncertainty.
Experiments show that DEARI surpasses the SOTA in diverse imputation tasks using real-world datasets.
arXiv Detail & Related papers (2024-01-04T13:21:11Z) - Uncovering the Missing Pattern: Unified Framework Towards Trajectory
Imputation and Prediction [60.60223171143206]
Trajectory prediction is a crucial undertaking in understanding entity movement or human behavior from observed sequences.
Current methods often assume that the observed sequences are complete while ignoring the potential for missing values.
This paper presents a unified framework, the Graph-based Conditional Variational Recurrent Neural Network (GC-VRNN), which can perform trajectory imputation and prediction simultaneously.
arXiv Detail & Related papers (2023-03-28T14:27:27Z) - Scalability and Sample Efficiency Analysis of Graph Neural Networks for
Power System State Estimation [1.0499611180329804]
This paper thoroughly evaluates a phasor measurement unit-only state estimator based on graph neural networks (GNNs) applied over factor graphs.
Our results show that the GNN-based state estimator exhibits high accuracy and efficient use of data.
arXiv Detail & Related papers (2023-02-28T22:09:12Z) - Batch-Ensemble Stochastic Neural Networks for Out-of-Distribution
Detection [55.028065567756066]
Out-of-distribution (OOD) detection has recently received much attention from the machine learning community due to its importance in deploying machine learning models in real-world applications.
In this paper we propose an uncertainty quantification approach by modelling the distribution of features.
We incorporate an efficient ensemble mechanism, namely batch-ensemble, to construct the batch-ensemble neural networks (BE-SNNs) and overcome the feature collapse problem.
We show that BE-SNNs yield superior performance on several OOD benchmarks, such as the Two-Moons dataset and the FashionMNIST vs MNIST dataset.
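For intuition only, the numpy sketch below (made-up dimensions and random weights, not the BE-SNN architecture itself) shows the rank-1 batch-ensemble trick these networks build on: every ensemble member shares one slow weight matrix and stores only cheap per-member fast vectors.

```python
# Illustrative sketch of a batch-ensemble linear layer; all shapes and weights
# are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
in_dim, out_dim, members = 4, 3, 5

W = rng.normal(size=(out_dim, in_dim))   # shared "slow" weight matrix
R = rng.normal(size=(members, out_dim))  # per-member fast vectors r_k
S = rng.normal(size=(members, in_dim))   # per-member fast vectors s_k
b = np.zeros((members, out_dim))         # per-member biases

def batch_ensemble_forward(x):
    """x: (batch, in_dim) -> stacked member outputs (members, batch, out_dim)."""
    outs = []
    for k in range(members):
        # Equivalent to applying the member-specific weight W * outer(r_k, s_k).
        outs.append(((x * S[k]) @ W.T) * R[k] + b[k])
    return np.stack(outs)

x = rng.normal(size=(2, in_dim))
preds = batch_ensemble_forward(x)
print(preds.mean(axis=0).shape, preds.var(axis=0).shape)  # ensemble mean / spread
```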
arXiv Detail & Related papers (2022-06-26T16:00:22Z) - State and Topology Estimation for Unobservable Distribution Systems
using Deep Neural Networks [8.673621107750652]
Time-synchronized state estimation for reconfigurable distribution networks is challenging because of limited real-time observability.
This paper formulates a deep learning (DL)-based approach for topology identification (TI) and unbalanced three-phase distribution system state estimation (DSSE).
Two deep neural networks (DNNs) are trained to operate in a sequential manner for implementing TI and DSSE for systems that are incompletely observed by synchrophasor measurement devices (SMDs).
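A minimal sketch of such a sequential pipeline, with hypothetical layer sizes and measurement/state dimensions (not the paper's actual architecture): a first DNN classifies the topology from the SMD measurements, and a second DNN estimates the three-phase states from the measurements concatenated with the predicted topology.

```python
# Hypothetical sequential TI -> DSSE pipeline; all dimensions are assumptions.
import torch
import torch.nn as nn

n_meas, n_topologies, n_states = 24, 4, 60   # assumed sizes

ti_dnn = nn.Sequential(                      # topology identification (TI)
    nn.Linear(n_meas, 128), nn.ReLU(),
    nn.Linear(128, n_topologies),
)
dsse_dnn = nn.Sequential(                    # state estimation (DSSE)
    nn.Linear(n_meas + n_topologies, 256), nn.ReLU(),
    nn.Linear(256, n_states),
)

def estimate(measurements: torch.Tensor) -> torch.Tensor:
    """measurements: (batch, n_meas) -> estimated states (batch, n_states)."""
    topo_logits = ti_dnn(measurements)
    topo_onehot = torch.nn.functional.one_hot(
        topo_logits.argmax(dim=1), n_topologies
    ).float()
    return dsse_dnn(torch.cat([measurements, topo_onehot], dim=1))

states = estimate(torch.randn(8, n_meas))
print(states.shape)  # torch.Size([8, 60])
```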
arXiv Detail & Related papers (2021-04-15T02:46:50Z) - Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
However, they are often overconfident, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
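As a toy illustration only (not the paper's algorithm), one simple way to raise the entropy of an overconfident prediction toward the label prior is to mix the predicted class distribution with that prior; in the paper the adjustment is conditional on the confidence being unjustified.

```python
# Toy sketch: blending a predicted distribution with an assumed label prior
# raises its entropy; the mixing weight lam is illustrative.
import numpy as np

def temper_toward_prior(p, prior, lam):
    """Mix a predicted distribution p with the label prior by weight lam."""
    return (1.0 - lam) * p + lam * prior

p = np.array([0.97, 0.02, 0.01])   # overconfident prediction (assumed)
prior = np.array([0.5, 0.3, 0.2])  # empirical label prior (assumed)
for lam in (0.0, 0.5, 1.0):
    q = temper_toward_prior(p, prior, lam)
    print(lam, q.round(3), round(-(q * np.log(q)).sum(), 3))  # entropy grows with lam
```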
arXiv Detail & Related papers (2021-02-22T07:02:37Z) - Time Synchronized State Estimation for Incompletely Observed
Distribution Systems Using Deep Learning Considering Realistic Measurement
Noise [1.7587442088965226]
Time-synchronized state estimation is a challenge for distribution systems because of limited real-time observability.
This paper formulates a deep learning (DL)-based approach to perform unbalanced three-phase distribution system state estimation.
arXiv Detail & Related papers (2020-11-09T09:45:30Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z) - Frequentist Uncertainty in Recurrent Neural Networks via Blockwise
Influence Functions [121.10450359856242]
Recurrent neural networks (RNNs) are instrumental in modelling sequential and time-series data.
Existing approaches for uncertainty quantification in RNNs are based predominantly on Bayesian methods.
We develop a frequentist alternative that: (a) does not interfere with model training or compromise its accuracy, (b) applies to any RNN architecture, and (c) provides theoretical coverage guarantees on the estimated uncertainty intervals.
arXiv Detail & Related papers (2020-06-20T22:45:32Z) - Interval Neural Networks: Uncertainty Scores [11.74565957328407]
We propose a fast, non-Bayesian method for producing uncertainty scores in the output of pre-trained deep neural networks (DNNs).
This interval neural network (INN) has interval valued parameters and propagates its input using interval arithmetic.
In numerical experiments on an image reconstruction task, we demonstrate the practical utility of INNs as a proxy for the prediction error.
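A simplified sketch of the interval propagation involved, with assumed point-valued weights (the actual INN additionally makes the parameters themselves intervals): an input box is pushed through an affine layer and a ReLU, and the width of the output interval acts as an uncertainty score.

```python
# Interval-arithmetic propagation through one affine + ReLU layer; weights and
# input bounds are illustrative assumptions.
import numpy as np

def affine_interval(lo, hi, W, b):
    """Propagate the box [lo, hi] through y = W x + b with interval arithmetic."""
    W_pos, W_neg = np.clip(W, 0, None), np.clip(W, None, 0)
    y_lo = W_pos @ lo + W_neg @ hi + b
    y_hi = W_pos @ hi + W_neg @ lo + b
    return y_lo, y_hi

def relu_interval(lo, hi):
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

W1, b1 = np.array([[0.5, -1.0], [2.0, 0.3]]), np.array([0.1, -0.2])
lo, hi = np.array([0.9, 0.4]), np.array([1.1, 0.6])   # input uncertainty box

lo, hi = relu_interval(*affine_interval(lo, hi, W1, b1))
print("output interval widths:", hi - lo)  # wider interval -> less trustworthy output
```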
arXiv Detail & Related papers (2020-03-25T18:03:51Z) - Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out of distribution data points at test time with a single forward pass.
We scale training of such models with a novel loss function and centroid updating scheme, and match the accuracy of softmax models.
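A rough numpy sketch of the centroid idea under stated assumptions (random features and centroids, not the paper's exact loss or update rule): confidence is read off as an RBF similarity to the closest class centroid, and centroids are refreshed with an exponential moving average of training features.

```python
# Toy distance-to-centroid uncertainty; all values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_classes, feat_dim, sigma, gamma = 3, 8, 0.5, 0.99

centroids = rng.normal(size=(n_classes, feat_dim))  # one centroid per class

def rbf_scores(z):
    """Per-class kernel similarity; a low maximum flags out-of-distribution input."""
    d2 = ((z - centroids) ** 2).sum(axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def update_centroid(label, z):
    """Exponential-moving-average centroid update from a training feature z."""
    centroids[label] = gamma * centroids[label] + (1.0 - gamma) * z

z_test = rng.normal(size=feat_dim)                  # feature from the encoder
scores = rbf_scores(z_test)
print("predicted class:", scores.argmax(), "confidence:", scores.max().round(3))
```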
arXiv Detail & Related papers (2020-03-04T12:27:36Z)