Abstract
Connecting optimal transport and variational inference, we present a principled and systematic framework for sampling and generative modelling centred around divergences on path space. Our work culminates in the development of Controlled Monte Carlo Diffusions (CMCD) for sampling and inference, a score-based annealing technique that crucially adapts both forward and backward dynamics in a diffusion model. On the way, we clarify the relationship between the EM algorithm and iterative proportional fitting (IPF) for Schrödinger bridges, providing a conceptual link between fields. Finally, we show that CMCD has a strong foundation in the Jarzynski and Crooks identities from statistical physics, and that it convincingly outperforms competing approaches across a wide array of experiments.
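The abstract refers to iterative proportional fitting (IPF), whose classical discrete form alternately rescales a coupling to match two prescribed marginals (the Sinkhorn iteration). A minimal sketch of that discrete procedure, with an assumed Gaussian kernel and illustrative toy marginals (not taken from the paper):

```python
import numpy as np

def ipf(K, mu, nu, n_iters=200):
    """Iterative proportional fitting: rescale the positive kernel K so the
    resulting coupling P has row marginal mu and column marginal nu."""
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iters):
        u = mu / (K @ v)        # rescale rows to match mu
        v = nu / (K.T @ u)      # rescale columns to match nu
    return u[:, None] * K * v[None, :]

# Toy example: couple two 4-point marginals through a Gaussian kernel.
x = np.linspace(0.0, 1.0, 4)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.1)
mu = np.array([0.1, 0.2, 0.3, 0.4])
nu = np.array([0.25, 0.25, 0.25, 0.25])
P = ipf(K, mu, nu)
print(np.allclose(P.sum(axis=1), mu))  # rows match mu
print(np.allclose(P.sum(axis=0), nu))  # columns match nu
```

The paper's setting is the continuous path-space analogue of this alternating projection, but the discrete version above conveys the core idea: each half-step corrects one marginal while leaving the kernel's structure intact.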
Original language | English
---|---
Title of host publication | The Twelfth International Conference on Learning Representations
Subtitle of host publication | ICLR 2024
Publication status | Published - 7 May 2024