Fractal Flow: Hierarchical and Interpretable Normalizing Flow via Topic Modeling and Recursive Strategy
- URL: http://arxiv.org/abs/2508.19750v1
- Date: Wed, 27 Aug 2025 10:25:15 GMT
- Title: Fractal Flow: Hierarchical and Interpretable Normalizing Flow via Topic Modeling and Recursive Strategy
- Authors: Binhui Zhang, Jianwei Ma
- Abstract summary: We propose Fractal Flow, a novel normalizing flow architecture that enhances both expressiveness and interpretability. Experiments on MNIST, FashionMNIST, CIFAR-10, and geophysical data demonstrate that Fractal Flow achieves latent clustering, controllable generation, and superior estimation accuracy.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing Flows provide a principled framework for high-dimensional density estimation and generative modeling by constructing invertible transformations with tractable Jacobian determinants. We propose Fractal Flow, a novel normalizing flow architecture that enhances both expressiveness and interpretability through two key innovations. First, we integrate Kolmogorov-Arnold Networks and incorporate Latent Dirichlet Allocation into normalizing flows to construct a structured, interpretable latent space and model hierarchical semantic clusters. Second, inspired by Fractal Generative Models, we introduce a recursive modular design into normalizing flows to improve transformation interpretability and estimation accuracy. Experiments on MNIST, FashionMNIST, CIFAR-10, and geophysical data demonstrate that the Fractal Flow achieves latent clustering, controllable generation, and superior estimation accuracy.
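The change-of-variables principle the abstract builds on can be sketched with a single affine coupling layer. This is an illustrative toy, not the Fractal Flow architecture: the conditioner outputs (`shift`, `log_scale`) are fixed vectors here, whereas a real flow would compute them with a neural network from the untouched half of the input.

```python
import numpy as np

def affine_coupling_forward(x, shift, log_scale):
    """One affine coupling step: transform the second half of x,
    leaving the first half unchanged. The Jacobian is triangular,
    so its log-determinant is just the sum of the log-scales."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    z2 = x2 * np.exp(log_scale) + shift
    z = np.concatenate([x1, z2], axis=-1)
    log_det = np.sum(log_scale)
    return z, log_det

def log_prob(z, log_det):
    """Change of variables: log p(x) = log N(z; 0, I) + log|det J|."""
    d = z.shape[-1]
    log_base = -0.5 * (d * np.log(2 * np.pi) + np.sum(z ** 2, axis=-1))
    return log_base + log_det

x = np.array([0.5, -1.0, 2.0, 0.3])
z, log_det = affine_coupling_forward(x, np.array([0.1, -0.2]), np.array([0.05, -0.1]))
print(log_prob(z, log_det))
```

The inverse is equally cheap, `x2 = (z2 - shift) * exp(-log_scale)`, which is why coupling layers give tractable density estimation and sampling at once.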
Related papers
- Preparation of Fractal-Inspired Computational Architectures for Advanced Large Language Model Analysis [50.11146543029802]
It introduces FractalNet, a fractal-inspired computational architecture for advanced large language model analysis. The set-up involves a template-driven generator, runner, and evaluation framework that, through systematic permutations of convolutional, normalization, activation, and dropout layers, can create more than 1,200 neural-network variants. The paper positions fractal design as a feasible and resource-efficient method of automated architecture exploration.
arXiv Detail & Related papers (2025-11-10T17:31:39Z)
- HFNO: an interpretable data-driven decomposition strategy for turbulent flows [0.0]
We present a novel FNO-based architecture tailored for reduced-order modeling of turbulent fluid flows. The proposed architecture processes wavenumber bins in parallel, enabling approximation of dispersion relations and non-linear interactions. We evaluate the proposed model on a series of increasingly complex dynamical systems.
arXiv Detail & Related papers (2025-11-03T12:57:19Z)
- Kernelised Normalising Flows [10.31916245015817]
Normalising Flows are non-parametric statistical models characterised by their dual capabilities of density estimation and generation.
We present Ferumal flow, a novel kernelised normalising flow paradigm that integrates kernels into the framework.
arXiv Detail & Related papers (2023-07-27T13:18:52Z)
- Machine Learning model for gas-liquid interface reconstruction in CFD numerical simulations [59.84561168501493]
The volume of fluid (VoF) method is widely used in multi-phase flow simulations to track and locate the interface between two immiscible fluids.
A major bottleneck of the VoF method is the interface reconstruction step due to its high computational cost and low accuracy on unstructured grids.
We propose a machine learning enhanced VoF method based on Graph Neural Networks (GNN) to accelerate the interface reconstruction on general unstructured meshes.
arXiv Detail & Related papers (2022-07-12T17:07:46Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- Equivariant Discrete Normalizing Flows [10.867162810786361]
We focus on building equivariant normalizing flows using discrete layers.
We introduce two new equivariant flows: $G$-coupling Flows and $G$-Residual Flows.
Our construction of $G$-Residual Flows is also universal, in the sense that we prove a $G$-equivariant diffeomorphism can be exactly mapped by a $G$-residual flow.
arXiv Detail & Related papers (2021-10-16T20:16:00Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- Generative Flows with Invertible Attentions [135.23766216657745]
We introduce two types of invertible attention mechanisms for generative flow models.
We exploit split-based attention mechanisms to learn the attention weights and input representations on every two splits of flow feature maps.
Our method provides invertible attention modules with tractable Jacobian determinants, enabling seamless integration at any position of a flow-based model.
arXiv Detail & Related papers (2021-06-07T20:43:04Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Graphical Normalizing Flows [11.23030807455021]
Normalizing flows model complex probability distributions by combining a base distribution with a series of neural networks.
State-of-the-art architectures rely on coupling and autoregressive transformations to lift up invertible functions from scalars to vectors.
We propose the graphical normalizing flow, a new invertible transformation with either a prescribed or a learnable graphical structure.
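The autoregressive transformations mentioned above can be sketched with a toy numpy example, where masked linear maps stand in for the usual neural-network conditioners (an illustrative simplification, not the graphical flow of the paper): because dimension `i` is rescaled and shifted using only `x_{<i}`, the Jacobian is lower triangular and its log-determinant is read off the diagonal.

```python
import numpy as np

def autoregressive_affine(x, w_shift, w_scale):
    """Toy autoregressive affine transform z = x * exp(s(x_<i)) + t(x_<i).
    Strictly lower-triangular masks enforce the autoregressive structure,
    so dz/dx is lower triangular with diagonal exp(log_scale)."""
    n = len(x)
    mask = np.tril(np.ones((n, n)), k=-1)       # strictly lower triangular
    shift = (w_shift * mask) @ x                 # shift_i depends on x_{<i}
    log_scale = np.tanh((w_scale * mask) @ x)    # bounded log-scales
    z = x * np.exp(log_scale) + shift
    log_det = np.sum(log_scale)                  # sum of diagonal log-terms
    return z, log_det
```

Note that inverting such a transform must proceed dimension by dimension, which is the usual trade-off against coupling layers, whose inverse is a single parallel pass.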
arXiv Detail & Related papers (2020-06-03T21:50:29Z)
- Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models [11.206144910991481]
We propose a new family of generative flows on an augmented data space, with an aim to improve expressivity without drastically increasing the computational cost of sampling and evaluation of a lower bound on the likelihood.
We demonstrate state-of-the-art performance on standard benchmarks of flow-based generative modeling.
arXiv Detail & Related papers (2020-02-17T17:45:48Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
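The core idea behind a FlowGMM-style model, a Gaussian-mixture base distribution whose components correspond to classes, can be sketched as below. This is an illustrative stand-in with identity covariances, not the paper's implementation: labelled points maximise the joint term `log w_y + log N(z; mu_y, I)` for their class `y`, while unlabelled points maximise the marginal computed here.

```python
import numpy as np

def gmm_base_log_prob(z, means, log_weights):
    """Marginal log-density of latent z under a Gaussian mixture with
    identity covariances: log sum_k w_k N(z; mu_k, I), computed with a
    numerically stable log-sum-exp over the mixture components."""
    d = z.shape[-1]
    sq = np.sum((z[None, :] - means) ** 2, axis=-1)          # ||z - mu_k||^2
    log_comp = -0.5 * (d * np.log(2 * np.pi) + sq)           # log N(z; mu_k, I)
    a = log_weights + log_comp
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))                 # log-sum-exp
```

Classification then follows from Bayes' rule over the component posteriors, which is what makes the latent space of such a model directly interpretable.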
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.