An attempt to generate new bridge types from latent space of generative flow
- URL: http://arxiv.org/abs/2401.10299v1
- Date: Thu, 18 Jan 2024 06:26:44 GMT
- Title: An attempt to generate new bridge types from latent space of generative flow
- Authors: Hongjun Zhang
- Abstract summary: The basic principle of normalizing flows is introduced in a simple and concise manner.
Treating the dataset as a sample from a population, obtaining a normalizing flow is essentially a sampling survey that statistically infers the numerical features of the population.
Using a symmetric, structured image dataset of three-span beam bridges, arch bridges, cable-stayed bridges and suspension bridges, a normalizing flow is constructed and trained with the Glow API in the TensorFlow Probability library.
- Score: 2.05750372679553
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Through examples of coordinate and probability transformations between
different distributions, the basic principle of normalizing flows is introduced
in a simple and concise manner. From the perspective of the distribution of a
function of a random variable, the essence of probability transformation is
explained, and the Jacobian determinant is introduced as the scaling factor of
the transformation. Treating the dataset as a sample from a population,
obtaining a normalizing flow is essentially a sampling survey that
statistically infers the numerical features of that population; the loss
function is then established by maximum likelihood estimation. This article
introduces how normalizing flows cleverly solve two major application
challenges: calculating determinants of high-dimensional matrices and making
neural network transformations reversible. Using a symmetric, structured image
dataset of three-span beam bridges, arch bridges, cable-stayed bridges and
suspension bridges, a normalizing flow is constructed and trained with the
Glow API in the TensorFlow Probability library. The model smoothly transforms
the complex distribution of the bridge dataset into a standard normal
distribution, and by sampling from the resulting latent space it can generate
new bridge types that differ from those in the training dataset.
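For context, the "scaling factor Jacobian determinant" and the maximum-likelihood loss that the abstract refers to are the standard normalizing-flow identities below; the notation is ours, not the paper's.

```latex
% Change of variables: z = f_\theta(x) maps data x to a latent z with
% base density p_Z (a standard normal); the Jacobian determinant is the
% scaling factor of the probability transformation:
p_X(x) = p_Z\big(f_\theta(x)\big)\,
         \left|\det\frac{\partial f_\theta(x)}{\partial x}\right|

% Maximum likelihood estimation over dataset samples x_1, \dots, x_N
% gives the negative log-likelihood loss that training minimizes:
\mathcal{L}(\theta) = -\frac{1}{N}\sum_{i=1}^{N}
    \left[\log p_Z\big(f_\theta(x_i)\big)
        + \log\left|\det\frac{\partial f_\theta(x_i)}{\partial x_i}\right|\right]
```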
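Below is a minimal sketch of the pipeline the abstract describes: a Glow bijector from TensorFlow Probability maps images to a standard normal latent space, is trained by maximum likelihood, and new bridge images are generated by sampling that latent space. The paper confirms only that the Glow API of TensorFlow Probability is used; the image size, network depth, and learning rate here are illustrative assumptions.

```python
# Minimal sketch of the abstract's pipeline, assuming 32x32x3 bridge images
# scaled to [-1, 1]; image size, depth, and learning rate are guesses.
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

IMAGE_SHAPE = (32, 32, 3)                    # assumed, not from the paper
latent_dim = int(np.prod(IMAGE_SHAPE))

# Glow bijector: in TFP's convention, forward maps a rank-1 latent vector
# to a rank-3 image, so log_prob uses the inverse direction.
glow = tfb.Glow(output_shape=IMAGE_SHAPE,
                num_glow_blocks=3,           # assumed depth
                num_steps_per_block=8)       # assumed steps per block

# Standard normal base distribution over the latent vector.
base = tfd.Sample(tfd.Normal(0.0, 1.0), sample_shape=(latent_dim,))

# Change of variables: log p_X(x) = log p_Z(g^{-1}(x)) + log|det J_{g^{-1}}(x)|.
flow = tfd.TransformedDistribution(distribution=base, bijector=glow)

optimizer = tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(images):
    """One maximum-likelihood step: minimize the negative log-likelihood."""
    with tf.GradientTape() as tape:
        nll = -tf.reduce_mean(flow.log_prob(images))
    grads = tape.gradient(nll, flow.trainable_variables)
    optimizer.apply_gradients(zip(grads, flow.trainable_variables))
    return nll

# After training, sampling z ~ N(0, I) and pushing it through the learned
# map yields new images, which is how the paper obtains new bridge types.
new_bridges = flow.sample(4)                 # shape: (4, 32, 32, 3)
```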
Related papers
- Unsupervised Representation Learning from Sparse Transformation Analysis [79.94858534887801]
We propose to learn representations from sequence data by factorizing the transformations of the latent variables into sparse components.
Input data are first encoded as distributions of latent activations and subsequently transformed using a probability flow model.
arXiv Detail & Related papers (2024-10-07T23:53:25Z)
- Arbitrary Distributions Mapping via SyMOT-Flow: A Flow-based Approach Integrating Maximum Mean Discrepancy and Optimal Transport [2.7309692684728617]
We introduce a novel model called SyMOT-Flow that trains an invertible transformation by minimizing the symmetric maximum mean discrepancy between samples from two unknown distributions.
The resulting transformation leads to more stable and accurate sample generation.
arXiv Detail & Related papers (2023-08-26T08:39:16Z)
- Normalizing flow sampling with Langevin dynamics in the latent space [12.91637880428221]
Normalizing flows (NF) use a continuous generator to map a simple latent (e.g. Gaussian) distribution towards an empirical target distribution associated with a training data set.
Since standard NFs implement differentiable maps, they may suffer from pathological behaviors when targeting complex distributions.
This paper proposes a new Markov chain Monte Carlo algorithm to sample from the target distribution in the latent domain before transporting it back to the target domain.
arXiv Detail & Related papers (2023-05-20T09:31:35Z)
- Conditioning Normalizing Flows for Rare Event Sampling [61.005334495264194]
We propose a transition path sampling scheme based on neural-network generated configurations.
We show that this approach enables the resolution of both the thermodynamics and kinetics of the transition region.
arXiv Detail & Related papers (2022-07-29T07:56:10Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximizing the log-likelihood and optimizing the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
- Particle Filter Bridge Interpolation [0.0]
We build on a previously introduced method for generating dimension-independent interpolations.
We introduce a discriminator network that accurately identifies areas of high representation density.
The resulting sampling procedure allows for greater variability in paths and stronger drift towards areas of high data density.
arXiv Detail & Related papers (2021-03-27T18:33:00Z)
- Jacobian Determinant of Normalizing Flows [7.124391555099448]
Normalizing flows learn a diffeomorphic mapping between the target and base distribution.
The Jacobian determinant of that mapping forms another real-valued function.
To stabilize the training of normalizing flows, it is required to maintain a balance between the expansion and contraction of volume.
arXiv Detail & Related papers (2021-02-12T14:09:28Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the mapping from the base density to the output space is conditioned on an input x, in order to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.