CRM: Centro De Giorgi
Optimal Transportation and Applications

Variational Inference via Wasserstein gradient flows

speaker: Philippe Rigollet (Massachusetts Institute of Technology)

abstract: Bayesian methodology typically generates a high-dimensional posterior distribution $\pi \propto \exp(-V)$ that is known only up to a normalizing constant, making even the computation of simple summary statistics such as the mean and covariance a major computational hurdle. Along with Markov chain Monte Carlo (MCMC), Variational Inference (VI) has emerged as a central computational approach to large-scale Bayesian inference. Rather than sampling from the true posterior $\pi$, VI aims to produce a simple but good approximation $\hat \pi$ for which summary statistics are easy to compute; in this presentation, for example, we consider the case where $\hat \pi$ is a Gaussian or a mixture of Gaussians. However, unlike MCMC theory, which is well developed and builds on now-classical probabilistic ideas, VI is still poorly understood and dominated by heuristics. In this work, we propose a principled method for VI that builds on the theory of gradient flows. As with MCMC, it comes with theoretical guarantees when $V$ is strongly convex.
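To make the Gaussian case concrete: restricted to $\hat \pi_t = \mathcal N(m_t, \Sigma_t)$, the Wasserstein gradient flow of $\mathrm{KL}(\cdot \,\|\, \pi)$ reduces to ODEs for the mean and covariance, $\dot m_t = -\mathbb{E}[\nabla V]$ and $\dot \Sigma_t = 2I - \mathbb{E}[\nabla^2 V]\,\Sigma_t - \Sigma_t\,\mathbb{E}[\nabla^2 V]$, with expectations taken under $\hat \pi_t$. The sketch below (not the speaker's implementation) discretizes this flow under simplifying assumptions not in the abstract: the target is itself Gaussian (quadratic $V$), so the expectations are available in closed form, and the covariance step uses the positive-definiteness-preserving update $\Sigma \leftarrow M \Sigma M$ with $M = I - h(\mathbb{E}[\nabla^2 V] - \Sigma^{-1})$, which agrees with the ODE to first order in the step size $h$. For a general $V$, the expectations would instead be estimated by sampling from the current Gaussian iterate.

import numpy as np

# Toy sketch: Bures-Wasserstein gradient descent for Gaussian variational
# inference on a target pi ∝ exp(-V) with quadratic V, i.e. pi = N(mu, S).
# Hypothetical setup: with a Gaussian target, the expectations of grad V
# and Hess V under the current iterate N(m, Sigma) are exact.

rng = np.random.default_rng(0)
d = 3
A = rng.standard_normal((d, d))
S = A @ A.T + d * np.eye(d)          # target covariance (positive definite)
mu = rng.standard_normal(d)          # target mean
P = np.linalg.inv(S)                 # Hess V(x) = P;  grad V(x) = P (x - mu)

m = np.zeros(d)                      # initial variational mean
Sigma = np.eye(d)                    # initial variational covariance
h = 0.1                              # step size

for _ in range(2000):
    g = P @ (m - mu)                 # E[grad V(X)] for X ~ N(m, Sigma)
    H = P                            # E[Hess V(X)], constant for quadratic V
    m = m - h * g                    # Euler step for dm/dt = -E[grad V]
    # Covariance step Sigma <- M Sigma M stays positive definite and matches
    # dSigma/dt = 2I - H Sigma - Sigma H up to O(h^2).
    M = np.eye(d) - h * (H - np.linalg.inv(Sigma))
    Sigma = M @ Sigma @ M.T

print(np.abs(m - mu).max(), np.abs(Sigma - S).max())  # both tend to 0

Here $V$ is strongly convex, the setting in which the abstract's guarantees apply, and both iterates contract toward the target's mean and covariance.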


timetable:
Tue 25 Oct, 14:30 - 15:15, Aula Magna Bruno Pontecorvo