Coherent Load Profile Synthesis with Conditional Diffusion for LV Distribution Network Scenario Generation
- URL: http://arxiv.org/abs/2510.12832v1
- Date: Mon, 13 Oct 2025 08:40:39 GMT
- Title: Coherent Load Profile Synthesis with Conditional Diffusion for LV Distribution Network Scenario Generation
- Authors: Alistair Brash, Junyi Lu, Bruce Stephen, Blair Brown, Robert Atkinson, Craig Michie, Fraser MacIntyre, Christos Tachtatzis
- Abstract summary: Load profiling approaches often rely on summarising demand through typical profiles. Co-behaviour between substations, which ultimately impacts higher voltage level network operation, is often overlooked. A Conditional Diffusion model for synthesising daily active and reactive power profiles at the low voltage distribution substation level is proposed.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Limited visibility of power distribution network power flows at the low voltage level presents challenges to both distribution network operators from a planning perspective and distribution system operators from a congestion management perspective. Forestalling these challenges through scenario analysis is confounded by the lack of realistic and coherent load data across representative distribution feeders. Load profiling approaches often rely on summarising demand through typical profiles, which oversimplifies the complexity of substation-level operations and limits their applicability in specific power system studies. Sampling methods, and more recently generative models, have attempted to address this through synthesising representative loads from historical exemplars; however, while these approaches can approximate load shapes to a convincing degree of fidelity, the co-behaviour between substations, which ultimately impacts higher voltage level network operation, is often overlooked. This limitation will become even more pronounced with the increasing integration of low-carbon technologies, as estimates of base loads fail to capture load diversity. To address this gap, a Conditional Diffusion model for synthesising daily active and reactive power profiles at the low voltage distribution substation level is proposed. The evaluation of fidelity is demonstrated through conventional metrics capturing temporal and statistical realism, as well as power flow modelling. The results show synthesised load profiles are plausible both independently and as a cohort in a wider power systems context. The Conditional Diffusion model is benchmarked against both naive and state-of-the-art models to demonstrate its effectiveness in producing realistic scenarios on which to base sub-regional power distribution network planning and operations.
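The paper itself does not include code; as a rough, hedged illustration of the conditional diffusion sampling the abstract describes, the sketch below runs a minimal DDPM-style reverse (denoising) loop over a daily profile of 48 half-hourly active/reactive power values. The noise predictor, conditioning vector, and all parameter choices here are placeholder assumptions, not the authors' model — in the paper the denoiser would be a trained neural network conditioned on substation-level features.

```python
import numpy as np

def make_schedule(T=200, beta_start=1e-4, beta_end=0.02):
    """Linear variance schedule, a common DDPM default (assumed, not from the paper)."""
    betas = np.linspace(beta_start, beta_end, T)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    return betas, alphas, alpha_bars

def toy_eps_model(x_t, t, cond):
    # Placeholder for the trained conditional denoiser: in the paper this
    # would be a neural network taking the noisy profile, the diffusion
    # timestep, and conditioning features (e.g. substation identity or
    # day type). Here we return deterministic pseudo-noise so the loop runs.
    rng = np.random.default_rng(t)
    return 0.1 * rng.standard_normal(x_t.shape) + 0.01 * cond.sum()

def sample_profile(eps_model, cond, steps=48, channels=2, T=200, seed=0):
    """DDPM ancestral sampling: start from Gaussian noise, iteratively denoise."""
    betas, alphas, alpha_bars = make_schedule(T)
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((steps, channels))  # (time, [P, Q])
    for t in reversed(range(T)):
        eps = eps_model(x, t, cond)
        coef = (1 - alphas[t]) / np.sqrt(1 - alpha_bars[t])
        x = (x - coef * eps) / np.sqrt(alphas[t])
        if t > 0:  # add noise except at the final step
            x += np.sqrt(betas[t]) * rng.standard_normal(x.shape)
    return x

cond = np.array([1.0, 0.0, 0.5])  # hypothetical conditioning vector
profile = sample_profile(toy_eps_model, cond)
print(profile.shape)  # (48, 2): 48 half-hourly steps of active/reactive power
```

The sampled array would then feed into power flow studies as a coherent daily (P, Q) scenario; the paper's actual conditioning scheme and network architecture differ from this toy setup.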
Related papers
- Learning a Generalized Model for Substation Level Voltage Estimation in Distribution Networks [0.0]
This paper presents a hierarchical graph neural network for substation-level voltage estimation. The model is trained and evaluated on thousands of buses across multiple substations and DER penetration scenarios. Experiments demonstrate that the proposed method achieves up to 2 times lower RMSE than alternative data-driven models.
arXiv Detail & Related papers (2025-10-17T02:44:25Z) - PowerGrow: Feasible Co-Growth of Structures and Dynamics for Power Grid Synthesis [75.14189839277928]
We present PowerGrow, a co-generative framework that significantly reduces computational overhead while maintaining operational validity. Experiments across benchmark settings show that PowerGrow outperforms prior diffusion models in fidelity and diversity. This demonstrates its ability to generate operationally valid and realistic power grid scenarios.
arXiv Detail & Related papers (2025-08-29T01:47:27Z) - Constrained Diffusion Models for Synthesizing Representative Power Flow Datasets [0.0]
High-quality power flow datasets are essential for training machine learning models in power systems. Security and privacy concerns restrict access to real-world data. We develop a diffusion model for generating synthetic power flow datasets from real-world power grids.
arXiv Detail & Related papers (2025-06-12T20:39:28Z) - Consistent World Models via Foresight Diffusion [56.45012929930605]
We argue that a key bottleneck in learning consistent diffusion-based world models lies in the suboptimal predictive ability. We propose Foresight Diffusion (ForeDiff), a diffusion-based world modeling framework that enhances consistency by decoupling condition understanding from target denoising.
arXiv Detail & Related papers (2025-05-22T10:01:59Z) - Parallelly Tempered Generative Adversarial Nets: Toward Stabilized Gradients [7.94957965474334]
A generative adversarial network (GAN) has been a representative backbone model in generative artificial intelligence (AI). This work analyzes the training instability and inefficiency in the presence of mode collapse by linking it to multimodality in the target distribution. With our newly developed GAN objective function, the generator can learn all the tempered distributions simultaneously.
arXiv Detail & Related papers (2024-11-18T18:01:13Z) - Predicting Cascading Failures with a Hyperparametric Diffusion Model [66.89499978864741]
We study cascading failures in power grids through the lens of diffusion models.
Our model integrates viral diffusion principles with physics-based concepts.
We show that this diffusion model can be learned from traces of cascading failures.
arXiv Detail & Related papers (2024-06-12T02:34:24Z) - Distributional Refinement Network: Distributional Forecasting via Deep Learning [0.8142555609235358]
A key task in actuarial modelling involves modelling the distributional properties of losses.
We propose a Distributional Refinement Network (DRN), which combines an inherently interpretable baseline model with a flexible neural network.
DRN captures varying effects of features across all quantiles, improving predictive performance while maintaining adequate interpretability.
arXiv Detail & Related papers (2024-06-03T05:14:32Z) - Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Efficient Generative Modeling via Penalized Optimal Transport Network [1.8079016557290342]
We propose a versatile deep generative model based on the marginally-penalized Wasserstein (MPW) distance. Through the MPW distance, POTNet effectively leverages low-dimensional marginal information to guide the overall alignment of joint distributions. We derive a non-asymptotic bound on the generalization error of the MPW loss and establish convergence rates of the generative distribution learned by POTNet.
arXiv Detail & Related papers (2024-02-16T05:27:05Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, though the self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - Probabilistic electric load forecasting through Bayesian Mixture Density Networks [70.50488907591463]
Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both Mean Field variational inference and deep ensembles are integrated.
arXiv Detail & Related papers (2020-12-23T16:21:34Z) - Generalization Properties of Optimal Transport GANs with Latent Distribution Learning [52.25145141639159]
We study how the interplay between the latent distribution and the complexity of the pushforward map affects performance.
Motivated by our analysis, we advocate learning the latent distribution as well as the pushforward map within the GAN paradigm.
arXiv Detail & Related papers (2020-07-29T07:31:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.