Talks and Presentations
Upcoming and Past Talks
Summary: Presentation of our NeurIPS 2025 work on the Wasserstein convergence of critically damped Langevin diffusions.
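For reference, a critically damped Langevin diffusion augments the data variable x with a velocity v and runs a kinetic SDE; the display below is the standard form from the literature, with illustrative notation that may differ from the paper's:

```latex
\mathrm{d}x_t = M^{-1} v_t\,\beta\,\mathrm{d}t, \qquad
\mathrm{d}v_t = \big(-x_t - \Gamma M^{-1} v_t\big)\,\beta\,\mathrm{d}t
              + \sqrt{2\Gamma\beta}\,\mathrm{d}W_t,
\qquad \Gamma^2 = 4M \ \ \text{(critical damping)}.
```

Noise enters only through the velocity channel, which makes the generator hypoelliptic rather than elliptic and is what complicates Wasserstein analyses of this class of models.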
Machine Learning at Aussois 2025
Summary: This talk focuses on conditional sampling within the framework of score-based generative models (SGMs). A particular emphasis is placed on sequential Monte Carlo (SMC) techniques, which provide a principled mechanism to perform conditional sampling while preserving the SGM structure. We outline how SMC-based conditional samplers can be embedded in the SGM framework, and conclude with ongoing theoretical work aimed at establishing upper bounds on the resulting generation error.
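As a concrete illustration, here is a minimal sketch of how an SMC conditional sampler can sit inside the reverse diffusion: particles follow the unconditional reverse SDE, are reweighted by a tempered observation likelihood, and are resampled when the effective sample size degenerates. The score and likelihood below are toy stand-ins (hypothetical), not the networks or constructions from the talk.

```python
# Minimal sketch of SMC-based conditional sampling inside a reverse
# diffusion loop. `score` and `log_g` are toy stand-ins, not the
# models used in the talk.
import numpy as np

rng = np.random.default_rng(0)

def score(x, t):
    # Toy stand-in: exact score of N(0, I), constant in t.
    return -x

def log_g(y, x):
    # Toy stand-in for the observation log-likelihood log p(y | x).
    return -0.5 * np.sum((y - x) ** 2, axis=-1)

def smc_conditional_sample(y, n_particles=256, n_steps=200, T=5.0):
    dt = T / n_steps
    x = rng.standard_normal((n_particles, y.shape[-1]))  # start from the prior
    logw = np.zeros(n_particles)
    for k in range(n_steps):
        t = T - k * dt
        # Euler-Maruyama step of the reverse VP-SDE (constant beta = 1).
        x = (x + (0.5 * x + score(x, t)) * dt
             + np.sqrt(dt) * rng.standard_normal(x.shape))
        # Reweight by a tempered likelihood increment (guides particles to y).
        logw += (dt / T) * log_g(y, x)
        # Resample when the effective sample size drops below half.
        w = np.exp(logw - logw.max())
        w /= w.sum()
        if 1.0 / np.sum(w**2) < n_particles / 2:
            idx = rng.choice(n_particles, size=n_particles, p=w)
            x, logw = x[idx], np.zeros(n_particles)
    return x

print(smc_conditional_sample(np.array([1.0, -0.5])).mean(axis=0))
```

The tempering schedule and resampling rule here are one simple choice among many; the point is only that the SMC layer leaves the underlying SGM dynamics untouched.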
Summary: We begin with a general introduction to score-based generative models (SGMs) through their stochastic differential equation (SDE) formulation, highlighting the different sources of error that arise in practical implementations. We then discuss what fundamentally defines an SGM, comparing several modelling frameworks such as VP-SDEs, VE-SDEs, flow matching, and damped Langevin diffusions. After motivating the heuristic foundations of this class of algorithms, we illustrate how they are deployed in practice and examine both the opportunities and challenges of kinetic approaches operating in an extended phase space. Finally, we explain why classical theoretical analyses in the Wasserstein-2 distance face structural obstacles and present two possible remedies: (1) a new analysis relying on the Lipschitz regularity of the modified score function, and (2) restoring ellipticity by injecting noise across the whole state space.
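To make the error decomposition concrete, the toy sketch below samples from a reverse-time VP-SDE with Euler-Maruyama using the *exact* score of a Gaussian data distribution, so that any deviation from the target comes from the finite horizon (mixing) and the discretization alone, with no approximation error. The constant beta(t) = 1 is an illustrative choice, not one of the schedules compared in the talk.

```python
# Reverse-time VP-SDE sampling with an exact Gaussian score:
# isolates mixing and discretization error from score approximation.
import numpy as np

rng = np.random.default_rng(1)
m, s2, T = 2.0, 0.25, 5.0   # data distribution N(m, s2), horizon T

def exact_score(x, t):
    # Exact score of p_t = N(m * a, s2 * a^2 + 1 - a^2), a = exp(-t/2):
    # the VP forward marginal of N(m, s2) with constant beta = 1.
    a = np.exp(-0.5 * t)
    var = s2 * a**2 + (1.0 - a**2)
    return (m * a - x) / var

def sample(n=10_000, n_steps=500):
    dt = T / n_steps
    x = rng.standard_normal(n)               # reverse process starts near N(0, 1)
    for k in range(n_steps):
        t = T - k * dt
        drift = 0.5 * x + exact_score(x, t)  # reverse drift of dx = -x/2 dt + dW
        x = x + drift * dt + np.sqrt(dt) * rng.standard_normal(n)
    return x

x0 = sample()
print(x0.mean(), x0.var())  # close to (2.0, 0.25) up to mixing/discretization bias
```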
Journées de Statistique (JdS) 2025
Summary: We present ongoing work on conditional sampling in score-based generative models (SGMs). The talk discusses two approaches: modifying the training procedure, or keeping a score trained on the joint distribution while adapting the sampling process. In particular, we show how Sequential Monte Carlo (SMC) methods can be integrated within the SGM framework to achieve conditional sampling. We conclude by outlining a promising direction toward deriving theoretical upper bounds for conditional sampling with SMC methods.
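The second route rests on the elementary fact that, at time t = 0, the conditional score is a slice of the joint score, since p(x | y) is proportional to p(x, y) in x:

```latex
\nabla_{x}\log p(x \mid y) \;=\; \nabla_{x}\log p(x, y).
```

Along the diffusion, however, the analogous quantity \(\nabla_x \log p_t(x \mid y)\) is generally intractable, which is precisely the gap the SMC correction compensates for (a schematic reading of the setup; the talk's exact construction may differ).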
Summary: Starting from an introduction to the DDPM algorithm, we derive its equivalent SDE formulation as a continuous-time limit. We then discuss several theoretical results related to this SDE framework. Building on this framework, we analyze the role of noise schedules in score-based generative models. The analysis combines theoretical insights with numerical illustrations on both synthetic data and real image datasets.
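Schematically, the DDPM update and its continuous-time limit read as follows (the standard derivation, with illustrative notation):

```latex
x_k = \sqrt{1-\beta_k}\,x_{k-1} + \sqrt{\beta_k}\,z_k,
\qquad z_k \sim \mathcal{N}(0, I),
\qquad \beta_k = \tfrac{1}{N}\,\beta\!\big(\tfrac{k}{N}\big)
\;\xrightarrow[\;N\to\infty\;]{}\;
\mathrm{d}X_t = -\tfrac{1}{2}\,\beta(t)\,X_t\,\mathrm{d}t
              + \sqrt{\beta(t)}\,\mathrm{d}W_t .
```

The noise schedule \(\beta(\cdot)\) is then the lever whose role the talk analyzes.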
Journées de Statistique (JdS) 2024
Summary: We present score-based generative models (SGMs) and identify their three main sources of error: mixing, approximation, and discretization. We then derive upper bounds on the sampling error under various metrics and illustrate, through numerical experiments, the impact of the noise schedule on generation quality.
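Schematically, such bounds take the form below, with one term per error source; the exact metric, rates, and constants depend on the assumptions and are those of the talk, not reproduced here:

```latex
\mathrm{d}\big(\mathcal{L}(\widehat{X}_0),\, p_{\mathrm{data}}\big)
\;\lesssim\;
\underbrace{e^{-cT}}_{\text{mixing}}
\;+\;
\underbrace{\varepsilon_{\mathrm{score}}}_{\text{approximation}}
\;+\;
\underbrace{C\sqrt{h}}_{\text{discretization}},
```

where \(T\) is the diffusion horizon, \(\varepsilon_{\mathrm{score}}\) the score estimation error, and \(h\) the discretization step size.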
