Bayes in Grenoble: Alisa Kirichenko

on April 26, 2018

April 26 at 3:00 pm
Alisa Kirichenko (postdoc, Machine Learning department, CWI, Amsterdam) will talk about "Function estimation on large graphs using Bayesian Laplacian regularization". The talk will take place on April 26 at 3:00 pm in the Imag building.
Bayes in Grenoble is a new reading group on Bayesian statistical methods. Its purpose is to gather the Grenoble Bayesian community on a monthly basis around noteworthy papers. These can focus equally on theory, methods, learning, applications, computations, etc., and can be seminal papers as well as recent preprints, as long as they relate to Bayes.

The sessions last two hours: the presentation is followed by an informal discussion over cocktails and snacks offered by the Grenoble Alpes Data Institute.

The reading group is organised by Julyan Arbel and Florence Forbes. Feel free to contact them if you wish to attend, be added to the mailing list, or give a talk.

React on social media: #BIGseminar

26 April 2018, Alisa Kirichenko (Postdoc, Machine Learning dept. of CWI, Amsterdam)

Function estimation on large graphs using Bayesian Laplacian regularization

In recent years there has been substantial interest in high-dimensional estimation and prediction problems on large graphs. These can in many cases be viewed as high-dimensional or nonparametric regression or classification problems in which the goal is to learn a "smooth" function on a given graph. We present a mathematical framework that allows one to study the performance of nonparametric function estimation methods on large graphs, and we derive minimax convergence rates within this framework. We consider simple undirected graphs that satisfy an assumption on their "asymptotic geometry", formulated in terms of the graph Laplacian. We also introduce a Sobolev-type smoothness condition on the target function, using the graph Laplacian to quantify smoothness. We then develop Bayesian procedures for the problems at hand and show how asymptotically optimal Bayesian regularization can be achieved under these conditions. The priors we study are randomly scaled Gaussians with precision operators involving the Laplacian of the graph.
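To fix ideas, here is a minimal illustrative sketch (not the speaker's code) of the kind of prior described in the abstract: a mean-zero Gaussian on the vertices of a graph whose precision operator is a power of the graph Laplacian. The path graph, the smoothness exponent, the scale `tau`, and the ridge `eps` (which regularizes the Laplacian's zero eigenvalue) are all illustrative assumptions; the talk additionally treats the scaling as random rather than fixed.

```python
import numpy as np

def path_laplacian(n):
    """Graph Laplacian L = D - A of the path graph on n vertices."""
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = 1  # edges between consecutive vertices
    A[idx + 1, idx] = 1
    return np.diag(A.sum(axis=1)) - A

def sample_laplacian_prior(L, smoothness=2.0, tau=1.0, eps=1e-3, rng=None):
    """Draw f ~ N(0, tau^2 * (L + eps*I)^(-smoothness)).

    Larger `smoothness` penalizes rough functions more heavily;
    `tau` is a fixed stand-in for the random scaling in the talk.
    """
    rng = np.random.default_rng(rng)
    n = L.shape[0]
    # Eigendecomposition of the symmetric, positive-definite precision base.
    eigvals, eigvecs = np.linalg.eigh(L + eps * np.eye(n))
    # Covariance^(1/2) applied to white noise gives one prior draw.
    scale = tau * eigvals ** (-smoothness / 2)
    return eigvecs @ (scale * rng.standard_normal(n))

L = path_laplacian(200)
f = sample_laplacian_prior(L, smoothness=2.0, rng=0)
print(f.shape)  # one smooth random function on the 200 vertices
```

Because high-frequency Laplacian eigenvectors are damped by the negative power of their eigenvalues, draws from this prior vary slowly along edges of the graph, which is exactly the notion of "smoothness" the Sobolev-type condition quantifies.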

Published on September 18, 2018

Practical information


Imag building, amphitheatre
700 Avenue Centrale, Saint-Martin-d'Hères