Edit Flows: Flow Matching with Edit Operations
- URL: http://arxiv.org/abs/2506.09018v1
- Date: Tue, 10 Jun 2025 17:44:19 GMT
- Title: Edit Flows: Flow Matching with Edit Operations
- Authors: Marton Havasi, Brian Karrer, Itai Gat, Ricky T. Q. Chen
- Abstract summary: Edit Flows is a non-autoregressive model that defines a discrete flow over sequences through edit operations: insertions, deletions, and substitutions. By modeling these operations within a Continuous-time Markov Chain over the sequence space, Edit Flows enables flexible, position-relative generation.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Autoregressive generative models naturally generate variable-length sequences, while non-autoregressive models struggle, often imposing rigid, token-wise structures. We propose Edit Flows, a non-autoregressive model that overcomes these limitations by defining a discrete flow over sequences through edit operations: insertions, deletions, and substitutions. By modeling these operations within a Continuous-time Markov Chain over the sequence space, Edit Flows enable flexible, position-relative generation that aligns more closely with the structure of sequence data. Our training method leverages an expanded state space with auxiliary variables, making the learning process efficient and tractable. Empirical results show that Edit Flows outperforms both autoregressive and mask models on image captioning and significantly outperforms the mask construction in text and code generation.
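To make the core idea concrete, here is a minimal toy sketch of a Continuous-time Markov Chain over token sequences whose transitions are the three edit operations. This is not the paper's trained model: the constant rates, vocabulary, and Gillespie-style simulation loop below are illustrative assumptions (in Edit Flows, the rates would be predicted by a neural network conditioned on the current sequence).

```python
import random

# Hypothetical constants for illustration only.
VOCAB = ["a", "b", "c"]
INS_RATE, DEL_RATE, SUB_RATE = 1.0, 0.5, 0.8

def step(seq, rng):
    """One Gillespie step: sample a holding time, then apply one edit.

    Events: insertion at any of the n+1 slots, deletion of any of the
    n tokens, or substitution of any of the n tokens.
    """
    n = len(seq)
    total = (n + 1) * INS_RATE + n * (DEL_RATE + SUB_RATE)
    dt = rng.expovariate(total)          # exponential holding time
    u = rng.uniform(0, total)            # pick an event proportional to its rate
    if u < (n + 1) * INS_RATE:                      # insertion
        pos = int(u / INS_RATE)
        seq = seq[:pos] + [rng.choice(VOCAB)] + seq[pos:]
    elif u < (n + 1) * INS_RATE + n * DEL_RATE:     # deletion
        pos = int((u - (n + 1) * INS_RATE) / DEL_RATE)
        seq = seq[:pos] + seq[pos + 1:]
    else:                                           # substitution
        pos = int((u - (n + 1) * INS_RATE - n * DEL_RATE) / SUB_RATE)
        seq = seq[:pos] + [rng.choice(VOCAB)] + seq[pos + 1:]
    return seq, dt

# Simulate the chain for a fixed time horizon; the sequence length
# changes freely, which is the point of edit-based generation.
rng = random.Random(0)
seq, t = ["a", "b"], 0.0
while t < 2.0:
    seq, dt = step(seq, rng)
    t += dt
print("".join(seq))
```

Because insertions and deletions change the sequence length, this chain moves over sequences of variable length, unlike mask-based constructions that fix the number of token positions in advance.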
Related papers
- Auto-Regressive Masked Diffusion Models [9.239507801466322]
Masked diffusion models (MDMs) have emerged as a promising approach for language modeling. They face a performance gap compared to autoregressive models (ARMs) and require more training iterations. We present the Auto-Regressive Masked Diffusion model, which unifies the training efficiency of autoregressive models with the parallel generation capabilities of diffusion-based models.
arXiv Detail & Related papers (2026-01-23T18:42:30Z) - Edit-Based Flow Matching for Temporal Point Processes [51.33476564706644]
Temporal point processes (TPPs) are a fundamental tool for modeling event sequences in continuous time. Recent non-autoregressive, diffusion-style models mitigate these issues by jointly interpolating between noise and data. We introduce an Edit Flow process for TPPs that transports noise to data via insert, delete, and substitute edit operations.
arXiv Detail & Related papers (2025-10-07T15:44:12Z) - COrAL: Order-Agnostic Language Modeling for Efficient Iterative Refinement [80.18490952057125]
Iterative refinement has emerged as an effective paradigm for enhancing the capabilities of large language models (LLMs) on complex tasks.
We propose Context-Wise Order-Agnostic Language Modeling (COrAL) to overcome these challenges.
Our approach models multiple token dependencies within manageable context windows, enabling the model to perform iterative refinement internally.
arXiv Detail & Related papers (2024-10-12T23:56:19Z) - CaLMFlow: Volterra Flow Matching using Causal Language Models [14.035963716966787]
CaLMFlow is a framework that casts flow matching as a Volterra integral equation (VIE).
Our method implements tokenization across space and time, thereby solving a VIE over these domains.
We demonstrate CaLMFlow's effectiveness on synthetic and real-world data, including single-cell perturbation response prediction.
arXiv Detail & Related papers (2024-10-03T05:07:41Z) - Discrete Diffusion Language Model for Efficient Text Summarization [19.267738861590487]
We introduce a novel semantic-aware noising process that enables Transformer backbones to handle long sequences effectively. Our approaches achieve state-of-the-art performance on three benchmark summarization datasets: Gigaword, CNN/DailyMail, and arXiv.
arXiv Detail & Related papers (2024-06-25T09:55:22Z) - Non-autoregressive Sequence-to-Sequence Vision-Language Models [59.445765313094434]
We propose a parallel decoding sequence-to-sequence vision-language model that marginalizes over multiple inference paths in the decoder. The model achieves performance on par with its state-of-the-art autoregressive counterpart, but is faster at inference time.
arXiv Detail & Related papers (2024-03-04T17:34:59Z) - PLANNER: Generating Diversified Paragraph via Latent Language Diffusion Model [37.2192243883707]
We propose PLANNER, a model that combines latent semantic diffusion with autoregressive generation to generate fluent text.
Results on semantic generation, text completion and summarization show its effectiveness in generating high-quality long-form text.
arXiv Detail & Related papers (2023-06-05T01:36:39Z) - DiffusER: Discrete Diffusion via Edit-based Reconstruction [88.62707047517914]
DiffusER is an edit-based generative model for text based on denoising diffusion models.
It can rival autoregressive models on several tasks spanning machine translation, summarization, and style transfer.
It can also perform other varieties of generation that standard autoregressive models are not well-suited for.
arXiv Detail & Related papers (2022-10-30T16:55:23Z) - Text Editing as Imitation Game [33.418628166176234]
We reformulate text editing as an imitation game using behavioral cloning.
We introduce a dual decoders structure to parallel the decoding while retaining the dependencies between action tokens.
Our model consistently outperforms the autoregressive baselines in terms of performance, efficiency, and robustness.
arXiv Detail & Related papers (2022-10-21T22:07:04Z) - Diffusion-LM Improves Controllable Text Generation [80.50044830018442]
Controlling the behavior of language models (LMs) without re-training is a major open problem in natural language generation.
We develop a new non-autoregressive language model based on continuous diffusions that we call Diffusion-LM.
We demonstrate successful control of Diffusion-LM for six challenging fine-grained control tasks, significantly outperforming prior work.
arXiv Detail & Related papers (2022-05-27T20:12:09Z) - Learning to Model Editing Processes [98.11448946134894]
We propose modeling editing processes: the whole process of iteratively generating sequences.
We form a conceptual framework to describe the likelihood of multi-step edits, and describe neural models that can learn a generative model of sequences based on these multi-step edits.
arXiv Detail & Related papers (2022-05-24T21:32:52Z) - Improving Sequential Latent Variable Models with Autoregressive Flows [30.053464816814348]
We propose an approach for improving sequence modeling based on autoregressive normalizing flows.
Results are presented on three benchmark video datasets, where autoregressive flow-based dynamics improve log-likelihood performance.
arXiv Detail & Related papers (2020-10-07T05:14:37Z) - Cascaded Text Generation with Markov Transformers [122.76100449018061]
Two dominant approaches to neural text generation are fully autoregressive models, using serial beam search decoding, and non-autoregressive models, using parallel decoding with no output dependencies.
This work proposes an autoregressive model with sub-linear parallel time generation. Noting that conditional random fields with bounded context can be decoded in parallel, we propose an efficient cascaded decoding approach for generating high-quality output.
This approach requires only a small modification from standard autoregressive training, while showing competitive accuracy/speed tradeoff compared to existing methods on five machine translation datasets.
arXiv Detail & Related papers (2020-06-01T17:52:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.