Flow Plugin Network for conditional generation
- URL: http://arxiv.org/abs/2110.04081v1
- Date: Thu, 7 Oct 2021 17:26:57 GMT
- Title: Flow Plugin Network for conditional generation
- Authors: Patryk Wielopolski, Michał Koperski, Maciej Zięba
- Abstract summary: By default, we cannot control a generative model's sampling process, i.e., we cannot generate a sample with a specific set of attributes.
We propose a novel approach that enables the generation of objects with a given set of attributes without retraining the base model.
- Score: 1.123376893295777
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative models have gained many researchers' attention in recent
years, resulting in models such as StyleGAN for human face generation or
PointFlow for 3D point cloud generation. However, by default, we cannot control
their sampling process, i.e., we cannot generate a sample with a specific set of
attributes. The current approach is model retraining with additional inputs and
a different architecture, which requires time and computational resources. We
propose a novel approach that enables the generation of objects with a given
set of attributes without retraining the base model. For this purpose, we
utilize normalizing flow models, Conditional Masked Autoregressive Flow and
Conditional Real NVP, as a Flow Plugin Network (FPN).
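As a rough illustration of the mechanism described in the abstract, the sketch below trains a small conditional coupling flow (a simplified RealNVP-style stand-in for the Conditional MAF / Conditional Real NVP mentioned above) on the latent space of a frozen, pre-trained base model. All class names, layer sizes, the placeholder batch, and the training snippet are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class ConditionalCoupling(nn.Module):
    """One affine coupling layer whose scale/shift are conditioned on attributes."""

    def __init__(self, latent_dim, attr_dim, hidden=128):
        super().__init__()
        self.half = latent_dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + attr_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (latent_dim - self.half)),
        )

    def forward(self, z, attrs):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        scale, shift = self.net(torch.cat([z1, attrs], dim=1)).chunk(2, dim=1)
        scale = torch.tanh(scale)                       # keep the Jacobian well-behaved
        z2 = z2 * torch.exp(scale) + shift
        return torch.cat([z1, z2], dim=1), scale.sum(dim=1)

    def inverse(self, z, attrs):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        scale, shift = self.net(torch.cat([z1, attrs], dim=1)).chunk(2, dim=1)
        scale = torch.tanh(scale)
        return torch.cat([z1, (z2 - shift) * torch.exp(-scale)], dim=1)


class FlowPluginNetwork(nn.Module):
    """Stack of conditional couplings between base noise and the frozen model's latents."""

    def __init__(self, latent_dim, attr_dim, n_layers=4):
        super().__init__()
        self.latent_dim = latent_dim
        self.layers = nn.ModuleList(
            [ConditionalCoupling(latent_dim, attr_dim) for _ in range(n_layers)]
        )

    def forward(self, z, attrs):
        """Latent code -> base noise, plus the log-determinant needed for the NLL loss."""
        log_det = z.new_zeros(z.size(0))
        for layer in self.layers:
            z, ld = layer(z, attrs)
            z = torch.flip(z, dims=[1])                 # alternate which half is transformed
            log_det = log_det + ld
        return z, log_det

    @torch.no_grad()
    def sample_latent(self, attrs):
        """Base noise -> latent code carrying the requested attributes."""
        u = torch.randn(attrs.size(0), self.latent_dim)
        for layer in reversed(self.layers):
            u = torch.flip(u, dims=[1])
            u = layer.inverse(u, attrs)
        return u


# Hypothetical training step: z would come from the frozen base model's latent
# space (e.g. an encoder output); attrs are the known attribute labels.
latent_dim, attr_dim = 64, 10
fpn = FlowPluginNetwork(latent_dim, attr_dim)
optim = torch.optim.Adam(fpn.parameters(), lr=1e-3)

z, attrs = torch.randn(32, latent_dim), torch.rand(32, attr_dim)   # placeholder batch
u, log_det = fpn(z, attrs)
nll = 0.5 * (u ** 2).sum(dim=1) - log_det          # NLL under a standard normal base (constant dropped)
optim.zero_grad()
nll.mean().backward()
optim.step()

# Generation: sample conditioned latents for the desired attributes and pass them
# to the frozen generator/decoder of the base model (not shown here).
conditioned_latents = fpn.sample_latent(torch.rand(8, attr_dim))
```

The point of the construction is that the base model stays frozen throughout; only the small conditional flow is trained, which is what allows attribute-conditional sampling without retraining the original generator.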
Related papers
- Derivative-Free Guidance in Continuous and Discrete Diffusion Models with Soft Value-Based Decoding [84.3224556294803]
Diffusion models excel at capturing the natural design spaces of images, molecules, DNA, RNA, and protein sequences.
We aim to optimize downstream reward functions while preserving the naturalness of these design spaces.
Our algorithm integrates soft value functions, which look ahead to how intermediate noisy states lead to high rewards in the future.
arXiv Detail & Related papers (2024-08-15T16:47:59Z) - Mixed Continuous and Categorical Flow Matching for 3D De Novo Molecule Generation [0.0]
Flow matching is a recently proposed generative modeling framework that generalizes diffusion models.
We extend the flow matching framework to categorical data by constructing flows that are constrained to exist on a continuous representation of categorical data known as the probability simplex.
We find that, in practice, a simpler approach that makes no accommodations for the categorical nature of the data yields equivalent or superior performance.
arXiv Detail & Related papers (2024-04-30T17:37:21Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, with a substantial speedup compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - StarNet: Style-Aware 3D Point Cloud Generation [82.30389817015877]
StarNet is able to reconstruct and generate high-fidelity and even novel 3D point clouds using a mapping network.
Our framework achieves comparable state-of-the-art performance on various metrics in the point cloud reconstruction and generation tasks.
arXiv Detail & Related papers (2023-03-28T08:21:44Z) - Training and Tuning Generative Neural Radiance Fields for Attribute-Conditional 3D-Aware Face Generation [66.21121745446345]
We propose a conditional GNeRF model that integrates specific attribute labels as input, thus amplifying the controllability and disentanglement capabilities of 3D-aware generative models.
Our approach builds upon a pre-trained 3D-aware face model, and we introduce a Training as Init and Optimizing for Tuning (TRIOT) method to train a conditional normalizing flow module.
Our experiments substantiate the efficacy of our model, showcasing its ability to generate high-quality edits with enhanced view consistency.
arXiv Detail & Related papers (2022-08-26T10:05:39Z) - Score-Based Generative Models for Molecule Generation [0.8808021343665321]
We train a Transformer-based score function on representations of 1.5 million samples from the ZINC dataset.
We use the Moses benchmarking framework to evaluate the generated samples on a suite of metrics.
arXiv Detail & Related papers (2022-03-07T13:46:02Z) - PluGeN: Multi-Label Conditional Generation From Pre-Trained Models [1.4777718769290524]
PluGeN is a simple yet effective generative technique that can be used as a plugin to pre-trained generative models.
We show that PluGeN preserves the quality of backbone models while adding the ability to control the values of labeled attributes.
arXiv Detail & Related papers (2021-09-18T21:02:24Z) - Go with the Flows: Mixtures of Normalizing Flows for Point Cloud Generation and Reconstruction [98.38585659305325]
Normalizing flows (NFs) have demonstrated state-of-the-art performance on modeling 3D point clouds.
This work enhances their representational power by applying mixtures of NFs to point clouds.
arXiv Detail & Related papers (2021-06-06T14:25:45Z) - Discrete Point Flow Networks for Efficient Point Cloud Generation [36.03093265136374]
Generative models have proven effective at modeling 3D shapes and their statistical variations.
We introduce a latent variable model that builds on normalizing flows to generate 3D point clouds of an arbitrary size.
For single-view shape reconstruction we also obtain results on par with state-of-the-art voxel, point cloud, and mesh-based methods.
arXiv Detail & Related papers (2020-07-20T14:48:00Z) - Dynamic Model Pruning with Feedback [64.019079257231]
We propose a novel model compression method that generates a sparse trained model without additional overhead.
We evaluate our method on CIFAR-10 and ImageNet, and show that the obtained sparse models can reach the state-of-the-art performance of dense models.
arXiv Detail & Related papers (2020-06-12T15:07:08Z) - Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow [16.41460104376002]
We introduce subset flows, a class of flows that can transform finite volumes and allow exact computation of likelihoods for discrete data.
We identify ordinal discrete autoregressive models, including WaveNets, PixelCNNs and Transformers, as single-layer flows.
We demonstrate state-of-the-art results on CIFAR-10 for flow models trained with dequantization.
arXiv Detail & Related papers (2020-02-06T22:58:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.