Discrete Flow Matching: Convergence Guarantees Under Minimal Assumptions

arXiv:2605.08882v1 Announce Type: new

Abstract: Flow Matching has recently emerged as a popular class of generative models for simulating a target distribution $\mu_1$ from samples drawn from a source distribution $\mu_0$. This framework relies on a fixed coupling between $\mu_0$ and $\mu_1$, and on a deterministic or stochastic bridge that defines an interpolating process between the two distributions. The time marginals of this process can then be approximately sampled by estimating the transition rates, or more generally the generator, of its Markovian projection. This framework has recently been extended to discrete source and target distributions, under the name Discrete Flow Matching (DFM). However, theoretical guarantees for such models remain scarce. In this paper, we study two DFM models on $\mathbb{Z}_m^d = \{0,\ldots,m-1\}^d$, sampled through time discretization, and derive associated non-asymptotic bounds for both. In contrast to previous work, we establish non-asymptotic bounds in Kullback--Leibler divergence for the early-stopped version of the target distribution. We also derive explicit convergence guarantees in total variation distance with respect to the true target distribution. Importantly, these bounds rely only on an approximation-error assumption, relaxing the standard score assumptions used in earlier works, while also yielding improved dependence on the vocabulary size $m$ and the dimension $d$.
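To make the sampling scheme described in the abstract concrete, the following is a minimal illustrative sketch (not the authors' algorithm) of time-discretized DFM sampling on $\mathbb{Z}_m^d$. It assumes a uniform source $\mu_0$, a factorized mixture path with a linear schedule $\kappa_t = t$, and a hypothetical `denoiser` estimating the target value of each coordinate; the Euler step size, step count, and early-stopping parameter are all assumptions chosen for illustration.

```python
# Minimal, illustrative sketch of time-discretized DFM sampling on Z_m^d.
# Hypothetical pieces (not from the paper): `denoiser`, the uniform source,
# and the linear mixture-path schedule kappa_t = t.  This is only a generic
# Euler sampler for such a path, not the authors' algorithm.

import numpy as np

def euler_dfm_sample(denoiser, d, m, n_steps=100, eps=1e-2, seed=0):
    """Sample x approximately from mu_1 on {0,...,m-1}^d by Euler steps.

    denoiser(x, t) -> array of shape (d, m): estimated probabilities of the
    target coordinate x_1^i given the current state x at time t.
    Sampling is early-stopped at time 1 - eps, matching the KL-type bounds.
    """
    rng = np.random.default_rng(seed)
    x = rng.integers(0, m, size=d)            # x_0 drawn from uniform mu_0
    ts = np.linspace(0.0, 1.0 - eps, n_steps + 1)
    for t, t_next in zip(ts[:-1], ts[1:]):
        h = t_next - t
        probs = denoiser(x, t)                # shape (d, m), rows sum to 1
        # For the mixture path kappa_t = t, each coordinate jumps toward the
        # predicted target at rate kappa'_t / (1 - kappa_t) = 1 / (1 - t).
        jump_prob = min(1.0, h / (1.0 - t))
        for i in range(d):
            if rng.random() < jump_prob:
                x[i] = rng.choice(m, p=probs[i])
    return x

# Toy usage: a "denoiser" that always targets the all-zeros string, so
# samples concentrate near 0 as t approaches 1.
def toy_denoiser(x, t, m=5):
    p = np.zeros((len(x), m))
    p[:, 0] = 1.0
    return p

print(euler_dfm_sample(toy_denoiser, d=8, m=5))
```

In practice the denoiser would be a learned network whose approximation error is exactly the quantity the paper's assumption controls; the sketch only shows where that estimate enters the time-discretized sampling loop.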
