Principal Component Density Estimation for Scenario Generation Using
Normalizing Flows
- URL: http://arxiv.org/abs/2104.10410v1
- Date: Wed, 21 Apr 2021 08:42:54 GMT
- Title: Principal Component Density Estimation for Scenario Generation Using
Normalizing Flows
- Authors: Eike Cramer, Alexander Mitsos, Raul Tempone, Manuel Dahmen
- Abstract summary: We propose a dimensionality-reducing flow layer based on the linear principal component analysis (PCA) that sets up the normalizing flow in a lower-dimensional space.
We train the resulting principal component flow (PCF) on data of PV and wind power generation as well as load demand in Germany in the years 2013 to 2015.
- Score: 62.997667081978825
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural network-based learning of the distribution of non-dispatchable
renewable electricity generation from sources such as photovoltaics (PV) and
wind as well as load demands has recently gained attention. Normalizing flow
density models have performed particularly well in this task due to the
training through direct log-likelihood maximization. However, research from the
field of image generation has shown that standard normalizing flows can only
learn smeared-out versions of manifold distributions and can result in the
generation of noisy data. To avoid the generation of time series data with
unrealistic noise, we propose a dimensionality-reducing flow layer based on the
linear principal component analysis (PCA) that sets up the normalizing flow in
a lower-dimensional space. We train the resulting principal component flow
(PCF) on data of PV and wind power generation as well as load demand in Germany
in the years 2013 to 2015. The results of this investigation show that the PCF
preserves critical features of the original distributions, such as the
probability density and frequency behavior of the time series. The application
of the PCF is, however, not limited to renewable power generation but rather
extends to any data set, time series, or otherwise, which can be efficiently
reduced using PCA.
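The dimensionality-reducing PCA layer described in the abstract can be sketched in a few lines: data is projected onto the leading principal components, the normalizing flow (omitted here) is trained in that reduced space, and samples are mapped back through the inverse projection. The class and variable names below are illustrative, not the authors' implementation:

```python
import numpy as np

class PCALayer:
    """Minimal sketch of a dimensionality-reducing PCA layer.

    Projects D-dimensional data onto the first k principal components;
    a normalizing flow would then be set up in the k-dimensional space.
    """

    def __init__(self, n_components):
        self.k = n_components

    def fit(self, X):
        self.mean_ = X.mean(axis=0)
        # SVD of the centered data yields the principal directions
        _, _, Vt = np.linalg.svd(X - self.mean_, full_matrices=False)
        self.W_ = Vt[: self.k]  # (k, D) projection with orthonormal rows
        return self

    def forward(self, X):
        # D-dim sample -> k-dim latent where the flow is trained
        return (X - self.mean_) @ self.W_.T

    def inverse(self, Z):
        # k-dim flow sample -> D-dim scenario on the PCA subspace
        return Z @ self.W_ + self.mean_

# usage: synthetic data concentrated on a 2-dim subspace of R^5
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
B = rng.normal(size=(2, 5))
X = latent @ B

pca = PCALayer(n_components=2).fit(X)
Z = pca.forward(X)       # reduced representation, shape (200, 2)
X_rec = pca.inverse(Z)   # reconstruction is exact for data on the subspace
print(np.allclose(X, X_rec, atol=1e-8))
```

Because the projection is linear and orthonormal, samples generated in the reduced space map back onto the data manifold rather than spreading noise across all D dimensions, which is the motivation the abstract gives for avoiding noisy time-series samples.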
Related papers
- A Flow-Based Model for Conditional and Probabilistic Electricity Consumption Profile Generation and Prediction [3.121498988556879]
This paper introduces a novel flow-based generative model, termed Full Convolutional Profile Flow (FCPFlow).
FCPFlow is designed for both conditional and unconditional residential load profile (RLP) generation, and for probabilistic load forecasting.
arXiv Detail & Related papers (2024-05-03T15:27:51Z)
- Designing losses for data-free training of normalizing flows on Boltzmann distributions [0.0]
We analyze the properties of standard losses based on Kullback-Leibler divergences.
We propose strategies to alleviate these issues, most importantly a new loss function well-grounded in theory.
We show on several tasks that, for the first time, imperfect pre-trained models can be further optimized in the absence of training data.
arXiv Detail & Related papers (2023-01-13T10:56:13Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, although their self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- Matching Normalizing Flows and Probability Paths on Manifolds [57.95251557443005]
Continuous Normalizing Flows (CNFs) are generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
We propose to train CNFs by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path.
We show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks.
arXiv Detail & Related papers (2022-07-11T08:50:19Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Neural Network-based Power Flow Model [0.0]
A neural network (NN) model is trained to predict power flow results using historical power system data.
It can be concluded that the proposed NN-based power flow model can find solutions more quickly and more accurately than the DC power flow model.
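The idea summarized above, training a neural network on historical data to approximate power flow results, can be illustrated with a minimal numpy sketch. The synthetic data, network sizes, and variable names are hypothetical stand-ins, not the paper's actual model or power-system data:

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic stand-in for historical data: map 4 bus injections to
# 3 "voltage magnitudes" through a smooth nonlinear ground truth
X = rng.uniform(-1.0, 1.0, size=(512, 4))
A = rng.normal(size=(4, 3))
Y = np.tanh(X @ A)

# one-hidden-layer MLP trained by gradient descent on mean squared error
W1 = rng.normal(scale=0.5, size=(4, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 3)); b2 = np.zeros(3)
lr = 0.1
for step in range(500):
    H = np.tanh(X @ W1 + b1)      # hidden activations
    P = H @ W2 + b2               # predicted power flow results
    err = P - Y
    loss = (err ** 2).mean()
    if step == 0:
        loss0 = loss
    # backpropagation of the MSE gradient
    gP = 2.0 * err / err.size
    gW2 = H.T @ gP; gb2 = gP.sum(axis=0)
    gH = (gP @ W2.T) * (1.0 - H ** 2)
    gW1 = X.T @ gH; gb1 = gH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"MSE: {loss0:.4f} -> {loss:.4f}")
```

Once trained, evaluating the network is a handful of matrix products, which is why such surrogates can return solutions much faster than iterative AC power flow solvers.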
arXiv Detail & Related papers (2021-12-15T19:05:53Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)