Fractal Generative Models
- URL: http://arxiv.org/abs/2502.17437v2
- Date: Tue, 25 Feb 2025 14:28:34 GMT
- Title: Fractal Generative Models
- Authors: Tianhong Li, Qinyi Sun, Lijie Fan, Kaiming He
- Abstract summary: This paper introduces a new level of modularization by abstracting generative models into atomic generative modules. We instantiate our fractal framework using autoregressive models as the atomic generative modules and examine it on the challenging task of pixel-by-pixel image generation. We hope this work could open a new paradigm in generative modeling and provide a fertile ground for future research.
- Score: 24.4628440964565
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modularization is a cornerstone of computer science, abstracting complex functions into atomic building blocks. In this paper, we introduce a new level of modularization by abstracting generative models into atomic generative modules. Analogous to fractals in mathematics, our method constructs a new type of generative model by recursively invoking atomic generative modules, resulting in self-similar fractal architectures that we call fractal generative models. As a running example, we instantiate our fractal framework using autoregressive models as the atomic generative modules and examine it on the challenging task of pixel-by-pixel image generation, demonstrating strong performance in both likelihood estimation and generation quality. We hope this work could open a new paradigm in generative modeling and provide a fertile ground for future research. Code is available at https://github.com/LTH14/fractalgen.
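To make the recursive idea concrete, below is a minimal, hypothetical Python sketch of the fractal construction described in the abstract: an atomic generative module is invoked recursively, with each level conditioning the next on the tokens it just produced. This is not the authors' implementation (see https://github.com/LTH14/fractalgen for the real code); the class names AtomicAR and FractalGenerator, the uniform sampler, and the branching factor are all illustrative assumptions.

```python
# Conceptual sketch of a fractal generative model; NOT the fractalgen code.
# An "atomic generative module" is stubbed as a toy autoregressive sampler;
# each fractal level expands every coarse token by recursively calling a
# child generator, yielding a self-similar generation process.
import numpy as np


class AtomicAR:
    """Toy autoregressive module: emits values one at a time given a context.
    A real module would run a learned network; here we sample uniformly so
    the sketch stays self-contained and runnable."""

    def __init__(self, vocab_size=256, seed=0):
        self.vocab_size = vocab_size
        self.rng = np.random.default_rng(seed)

    def sample(self, context, length):
        out = []
        for _ in range(length):
            # Conditioning on (context, out) is omitted in this stub.
            out.append(int(self.rng.integers(self.vocab_size)))
        return out


class FractalGenerator:
    """Recursively invokes atomic modules: the top level produces a coarse
    set of tokens, each token conditions a child generator one level down,
    until the base level emits individual pixel values."""

    def __init__(self, depth, branch=4):
        self.depth = depth      # number of fractal levels remaining
        self.branch = branch    # tokens produced per module call
        self.module = AtomicAR()

    def generate(self, context=0):
        if self.depth == 0:
            # Base case: the atomic module emits a raw pixel value.
            return self.module.sample(context, length=1)
        # Recursive case: coarse tokens, each expanded by a child level.
        tokens = self.module.sample(context, length=self.branch)
        child = FractalGenerator(self.depth - 1, self.branch)
        pixels = []
        for t in tokens:
            pixels.extend(child.generate(context=t))
        return pixels


if __name__ == "__main__":
    gen = FractalGenerator(depth=3, branch=4)  # 4**3 = 64 "pixels"
    print(len(gen.generate()))                 # -> 64
```

The sketch only shows the control flow: the self-similarity comes from each level reusing the same kind of module on a smaller sub-problem, which is the modularization the paper abstracts over; how the modules are parameterized and trained is specific to the paper.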
Related papers
- Generative Modeling of Molecular Dynamics Trajectories [12.255021091552441]
We introduce generative modeling of molecular trajectories as a paradigm for learning flexible multi-task surrogate models of MD from data.
We show such generative models can be adapted to diverse tasks such as forward simulation, transition path sampling, and trajectory upsampling.
arXiv Detail & Related papers (2024-09-26T13:02:28Z) - Combinatorial Complex Score-based Diffusion Modelling through Stochastic Differential Equations [0.0]
This thesis explores the potential of score-based generative models in generating graphs.
In this thesis, we propose a unified framework employing stochastic differential equations.
This innovation overcomes limitations in existing frameworks that focus solely on graph generation, opening up new possibilities in generative AI.
arXiv Detail & Related papers (2024-06-07T13:16:10Z) - Heat Death of Generative Models in Closed-Loop Learning [63.83608300361159]
We study the learning dynamics of generative models that are fed back their own produced content in addition to their original training dataset.
We show that, unless a sufficient amount of external data is introduced at each iteration, any non-trivial temperature leads the model to degenerate.
arXiv Detail & Related papers (2024-04-02T21:51:39Z) - A Phase Transition in Diffusion Models Reveals the Hierarchical Nature of Data [51.03144354630136]
Recent advancements show that diffusion models can generate high-quality images. We study this phenomenon in a hierarchical generative model of data. We find that the backward diffusion process acting after a time $t$ is governed by a phase transition.
arXiv Detail & Related papers (2024-02-26T19:52:33Z) - Benchmarking and Analyzing 3D-aware Image Synthesis with a Modularized
Codebase [30.334079854982843]
We build a well-structured codebase, dubbed Carver, by modularizing the generation process.
The reproduction of a range of cutting-edge algorithms demonstrates the availability of our modularized codebase.
We perform a variety of in-depth analyses, such as comparisons across different types of point features.
arXiv Detail & Related papers (2023-06-21T17:59:51Z) - MolHF: A Hierarchical Normalizing Flow for Molecular Graph Generation [4.517805235253331]
MolHF is a new hierarchical flow-based model that generates molecular graphs in a coarse-to-fine manner.
MolHF is the first flow-based model that can be applied to model larger molecules (polymers) with more than 100 heavy atoms.
arXiv Detail & Related papers (2023-05-15T08:59:35Z) - DiffGAR: Model-Agnostic Restoration from Generative Artifacts Using
Image-to-Image Diffusion Models [46.46919194633776]
This work aims to develop a plugin post-processing module for diverse generative models.
Unlike traditional degradation patterns, generative artifacts are non-linear and the transformation function is highly complex.
arXiv Detail & Related papers (2022-10-16T16:08:47Z) - InvGAN: Invertible GANs [88.58338626299837]
InvGAN, short for Invertible GAN, successfully embeds real images into the latent space of a high-quality generative model.
This allows us to perform image inpainting, merging, and online data augmentation.
arXiv Detail & Related papers (2021-12-08T21:39:00Z) - Learning to Extend Molecular Scaffolds with Structural Motifs [15.78749196233448]
MoLeR is a graph-based model that supports scaffolds as the initial seed of the generative procedure.
We show that MoLeR performs comparably to state-of-the-art methods on unconstrained molecular optimization tasks.
We also show the influence of a number of seemingly minor design choices on the overall performance.
arXiv Detail & Related papers (2021-03-05T18:28:49Z) - Learning Neural Generative Dynamics for Molecular Conformation
Generation [89.03173504444415]
We study how to generate molecule conformations (i.e., 3D structures) from a molecular graph.
We propose a novel probabilistic framework to generate valid and diverse conformations given a molecular graph.
arXiv Detail & Related papers (2021-02-20T03:17:58Z) - S2RMs: Spatially Structured Recurrent Modules [105.0377129434636]
We take a step towards dynamic models that are capable of simultaneously exploiting both modular and temporal structures.
We find our models to be robust to the number of available views and better capable of generalization to novel tasks without additional training.
arXiv Detail & Related papers (2020-07-13T17:44:30Z) - Reverse Engineering Configurations of Neural Text Generation Models [86.9479386959155]
The study of artifacts that emerge in machine generated text as a result of modeling choices is a nascent research area.
We conduct an extensive suite of diagnostic tests to observe whether modeling choices leave detectable artifacts in the text they generate.
Our key finding, which is backed by a rigorous set of experiments, is that such artifacts are present and that different modeling choices can be inferred by observing the generated text alone.
arXiv Detail & Related papers (2020-04-13T21:02:44Z)