Bayes in Grenoble: Guillaume Dehaene

May 28, 2019, at 2:00 pm

Guillaume Dehaene (EPFL) will give a talk entitled "Towards rigorous Variational Bayesian Inference". The event will take place on May 28 at 2:00 pm in the IMAG building.
Bayes in Grenoble is a new reading group on Bayesian statistical methods. Its purpose is to gather the Grenoble Bayesian community on a monthly basis around noteworthy papers. These can focus equally on theory, methods, learning, applications, computation, etc., and can be seminal papers as well as recent preprints, as long as they relate to Bayes.

The sessions last two hours: the presentation is followed by an informal discussion over cocktails and snacks offered by the Grenoble Alpes Data Institute.

The reading group is organised by Julyan Arbel and Florence Forbes. Feel free to contact them if you wish to attend, be added to the mailing list, or give a talk. https://sites.google.com/view/bigseminar/accueil

React on social media: #BIGseminar

28 May 2019, Guillaume Dehaene (EPFL)

Towards rigorous Variational Bayesian Inference
Abstract

Sampling methods are often considered the gold standard for Bayesian inference because of the attractive promise that their error tends to zero in the limit of infinite computational resources. A much cheaper alternative is provided by modern Variational Inference methods (Blei et al., 2017) or even the Laplace approximation. Empirical tests reveal that such approximations of the posterior are often very accurate, but a key limitation is that we have no computable guarantees on the quality of the approximation.

I will present a result aimed at solving this conundrum. I will show that, in the large data limit, the Kullback-Leibler divergence between a probability distribution f(θ) and its Laplace approximation g(θ) can be accurately approximated as:

KL( g(θ), f(θ) ) = E_g[ log g(θ) − log f(θ) ] ≈ 0.5 Var_g[ log g(θ) − log f(θ) ]

Critically, this approximation does not require knowledge of the normalization constant of f(θ) and is straightforward to estimate by sampling from g(θ). This result enables us to accurately measure the error of the Laplace approximation and is critical to ensuring that Variational Inference is not only cheap but rigorous.
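As a concrete illustration (not taken from the talk), here is a minimal sketch of how the right-hand side can be estimated by Monte Carlo. The one-dimensional target log_f and all numerical choices below are assumptions for illustration only: the unknown normalization constant of f shifts log g − log f by a constant, so it cancels inside the variance.

```python
import numpy as np

# Hypothetical unnormalised target: log f(theta) up to an additive constant
# (any log-density known only up to its normalisation constant would do).
def log_f(theta):
    return -0.25 * theta**4 - 0.5 * theta**2

# Laplace approximation g: a Gaussian centred at the mode (theta = 0 here),
# with variance given by the inverse Hessian of -log f at the mode (= 1 here).
mode, var = 0.0, 1.0

def log_g(theta):
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (theta - mode) ** 2 / var

# Estimate KL(g, f) by 0.5 * Var_g[log g - log f], sampling from g.
# The missing constant log Z of f only shifts the differences, so it
# drops out of the variance.
rng = np.random.default_rng(0)
theta = rng.normal(mode, np.sqrt(var), size=100_000)
delta = log_g(theta) - log_f(theta)
print(f"approximate KL(g, f): {0.5 * np.var(delta):.4f}")
```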

Published on May 20, 2019

Practical information

Location

Room 106
IMAG building