MixFlow: Mixture-Conditioned Flow Matching for Out-of-Distribution Generalization
- URL: http://arxiv.org/abs/2601.11827v1
- Date: Fri, 16 Jan 2026 23:13:21 GMT
- Title: MixFlow: Mixture-Conditioned Flow Matching for Out-of-Distribution Generalization
- Authors: Andrea Rubbi, Amir Akbarnejad, Mohammad Vali Sanian, Aryan Yazdan Parast, Hesam Asadollahzadeh, Arian Amani, Naveed Akhtar, Sarah Cooper, Andrew Bassett, Pietro Liò, Lassi Paavolainen, Sattar Vakili, Mo Lotfollahi
- Abstract summary: We introduce MixFlow, a conditional flow-matching framework for descriptor-controlled generation. MixFlow enables smooth interpolation and extrapolation to unseen conditions, leading to substantially improved out-of-distribution generalization. We empirically demonstrate its effectiveness across multiple domains, including prediction of responses to unseen perturbations in single-cell transcriptomic data and high-content microscopy-based drug screening tasks.
- Score: 31.579433319908485
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Achieving robust generalization under distribution shift remains a central challenge in conditional generative modeling, as existing conditional flow-based methods often struggle to extrapolate beyond the training conditions. We introduce MixFlow, a conditional flow-matching framework for descriptor-controlled generation that directly targets this limitation by jointly learning a descriptor-conditioned base distribution and a descriptor-conditioned flow field via shortest-path flow matching. By modeling the base distribution as a learnable, descriptor-dependent mixture, MixFlow enables smooth interpolation and extrapolation to unseen conditions, leading to substantially improved out-of-distribution generalization. We provide analytical insights into the behavior of the proposed framework and empirically demonstrate its effectiveness across multiple domains, including prediction of responses to unseen perturbations in single-cell transcriptomic data and high-content microscopy-based drug screening tasks. Across these diverse settings, MixFlow consistently outperforms standard conditional flow-matching baselines. Overall, MixFlow offers a simple yet powerful approach for achieving robust, generalizable, and controllable generative modeling across heterogeneous domains.
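The core construction described in the abstract can be sketched in a minimal form. The toy Python snippet below (all parameterizations, names, and the linear descriptor-to-logits map are illustrative assumptions, not the paper's implementation) builds a single shortest-path flow-matching training pair, drawing the source point x0 from a descriptor-dependent Gaussian mixture base rather than a fixed standard normal:

```python
import numpy as np

rng = np.random.default_rng(0)
D, K, C = 2, 3, 4  # data dim, mixture components, descriptor dim

# Hypothetical descriptor-dependent mixture parameters.
# In MixFlow these are learned jointly with the flow field.
W_logit = rng.standard_normal((K, C))  # maps descriptor -> component logits
mu = rng.standard_normal((K, D))       # component means
sigma = 0.5                            # shared component scale (simplification)

def sample_base(c):
    """Draw x0 from the descriptor-conditioned mixture base p(x0 | c)."""
    logits = W_logit @ c
    p = np.exp(logits - logits.max())
    p /= p.sum()                       # softmax over mixture components
    k = rng.choice(K, p=p)
    return mu[k] + sigma * rng.standard_normal(D)

def shortest_path_pair(x1, c):
    """Build one flow-matching training pair along the straight path
    x_t = (1 - t) x0 + t x1; the regression target is v = x1 - x0."""
    x0 = sample_base(c)
    t = rng.uniform()
    xt = (1 - t) * x0 + t * x1
    v_target = x1 - x0
    return t, x0, xt, v_target
```

A velocity network would then be trained to regress v_target from (xt, t, c); at sampling time, integrating the learned field from a base draw conditioned on an unseen descriptor is what enables extrapolation.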
Related papers
- SplineFlow: Flow Matching for Dynamical Systems with B-Spline Interpolants [14.711575625163045]
SplineFlow is a theoretically grounded flow-matching algorithm that jointly models conditional paths across observations via B-splines. We show how SplineFlow exploits the smoothness and stability of B-spline bases to learn the complex underlying dynamics while ensuring the multi-marginal requirements are met.
arXiv Detail & Related papers (2026-01-30T15:19:48Z) - Contrastive Flow Matching [61.60002028726023]
We introduce Contrastive Flow Matching, an extension to the flow matching objective that explicitly enforces uniqueness across all conditional flows. Our approach adds a contrastive objective that maximizes dissimilarity between predicted flows from arbitrary sample pairs. We find that training models with Contrastive Flow Matching (1) improves training speed by a factor of up to 9x, (2) requires up to 5x fewer denoising steps, and (3) lowers FID by up to 8.9 compared to training the same models with flow matching.
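The contrastive objective summarized above can be illustrated with a minimal sketch (the weighting `lam` and the negative-sampling scheme are assumptions; the paper's exact formulation may differ):

```python
import numpy as np

def contrastive_fm_loss(v_pred, v_target, v_negative, lam=0.05):
    """Hedged sketch of a contrastive flow-matching objective: pull the
    predicted flow toward its own target while pushing it away from the
    target flow of an arbitrary other sample pair."""
    attract = np.sum((v_pred - v_target) ** 2)   # standard flow-matching term
    repel = np.sum((v_pred - v_negative) ** 2)   # contrastive repulsion term
    return attract - lam * repel
```

The repulsion term discourages the model from predicting the same flow for different conditions, which is the "uniqueness across conditional flows" property the summary refers to.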
arXiv Detail & Related papers (2025-06-05T17:59:58Z) - Flow Matching Posterior Sampling: A Training-free Conditional Generation for Flow Matching [13.634043135217254]
We propose Flow Matching-based Posterior Sampling (FMPS) to expand the application scope of flow matching. This correction term can be reformulated to incorporate a surrogate score function. We show that FMPS achieves superior generation quality compared to existing state-of-the-art approaches.
arXiv Detail & Related papers (2024-11-12T08:14:39Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, achieving a speedup in computation compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - Piecewise Normalizing Flows [0.0]
A mismatch between the topology of the target and the base distribution can result in poor performance.
A number of different works have attempted to modify the topology of the base distribution to better match the target.
We introduce piecewise normalizing flows which divide the target distribution into clusters, with topologies that better match the standard normal base distribution.
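The piecewise idea can be shown with a toy sketch (affine whitening stands in for a learned per-cluster normalizing flow, and the clustering step itself is assumed given; both are simplifications):

```python
import numpy as np

def piecewise_standardize(x, labels, k):
    """Toy illustration of piecewise normalizing flows: split the target
    samples into clusters and map each cluster separately toward a
    standard-normal base. Here an affine whitening transform stands in
    for a learned flow per cluster."""
    z = np.empty_like(x)
    for j in range(k):
        xj = x[labels == j]
        # Per-cluster affine map to (approximately) zero mean, unit variance.
        z[labels == j] = (xj - xj.mean(0)) / (xj.std(0) + 1e-8)
    return z
```

Each cluster has a simpler, unimodal topology, so its transform to the standard normal base is easier to learn than a single global map for a multimodal target.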
arXiv Detail & Related papers (2023-05-04T15:30:10Z) - MixFlows: principled variational inference via mixed flows [15.723555563232498]
MixFlows are a new variational family that consists of a mixture of repeated applications of a map to an initial reference distribution. We show that MixFlows have MCMC-like convergence guarantees when the flow map is ergodic and measure-preserving. We also develop an implementation of MixFlows based on uncorrected discretized Hamiltonian dynamics combined with deterministic momentum refreshment.
arXiv Detail & Related papers (2022-05-16T06:57:57Z) - Bootstrap Your Flow [4.374837991804085]
We develop a new flow-based training procedure, FAB (Flow AIS Bootstrap), to produce accurate approximations to complex target distributions.
We demonstrate that FAB can be used to produce accurate approximations to complex target distributions, including Boltzmann distributions, in problems where previous flow-based methods fail.
arXiv Detail & Related papers (2021-11-22T20:11:47Z) - GFlowNet Foundations [66.69854262276391]
Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of candidates in an active learning context.
We show a number of additional theoretical properties of GFlowNets.
arXiv Detail & Related papers (2021-11-17T17:59:54Z) - Attentive Contractive Flow with Lipschitz-constrained Self-Attention [25.84621883831624]
We introduce a novel approach called Attentive Contractive Flow (ACF).
ACF utilizes a special category of flow-based generative models - contractive flows.
We demonstrate that ACF can be introduced into a variety of state-of-the-art flow models in a plug-and-play manner.
arXiv Detail & Related papers (2021-09-24T18:02:49Z) - Generative Flows with Invertible Attentions [135.23766216657745]
We introduce two types of invertible attention mechanisms for generative flow models.
We exploit split-based attention mechanisms to learn the attention weights and input representations on each pair of splits of the flow feature maps.
Our method provides invertible attention modules with tractable Jacobian determinants, enabling seamless integration at any position in flow-based models.
arXiv Detail & Related papers (2021-06-07T20:43:04Z) - Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.