Stein Points: Efficient sampling from posterior distributions by minimising Stein Discrepancies.

Presented by: Francois-Xavier Briol (Imperial College London, University of Warwick, University of Oxford)
Wednesday 23rd May 2018 - 11:00 to 13:00
INI Seminar Room 2
An important task in computational statistics and machine learning is to approximate a posterior distribution with an empirical measure supported on a set of representative points. This work focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when the number of points is small. To this end, we present 'Stein Points'. The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and the target measure. Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.
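The greedy variant described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes a one-dimensional standard normal target (score function s(x) = -x), an inverse multiquadric base kernel k(x, y) = (1 + (x - y)^2)^(-1/2), and a fixed candidate grid from which each new point is chosen to minimise the kernel Stein discrepancy of the growing point set. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def stein_kernel(x, y):
    # Langevin-Stein kernel for a standard normal target p(x) ∝ exp(-x^2/2),
    # whose score is s(x) = -x, built from the IMQ base kernel
    # k(x, y) = (1 + (x - y)^2)^(-1/2). Derivatives computed analytically.
    d = x - y
    r = 1.0 + d ** 2
    k = r ** -0.5                         # k(x, y)
    dkdx = -d * r ** -1.5                 # ∂k/∂x
    dkdy = d * r ** -1.5                  # ∂k/∂y
    d2k = r ** -1.5 - 3.0 * d ** 2 * r ** -2.5  # ∂²k/∂x∂y
    sx, sy = -x, -y                       # score at x and y
    return d2k + sx * dkdy + sy * dkdx + sx * sy * k

def greedy_stein_points(n, candidates):
    # Greedy selection: each new point minimises the squared kernel Stein
    # discrepancy of the enlarged point set, which reduces (up to terms
    # independent of the candidate) to the objective below.
    points = []
    for _ in range(n):
        obj = [stein_kernel(c, c) / 2.0
               + sum(stein_kernel(p, c) for p in points)
               for c in candidates]
        points.append(candidates[int(np.argmin(obj))])
    return np.array(points)

# Candidate grid over [-4, 4]; the first selected point is the mode (x = 0).
candidates = np.linspace(-4.0, 4.0, 401)
pts = greedy_stein_points(9, candidates)
print(np.round(np.sort(pts), 2))
```

In the full method the candidate search is richer (e.g. local optimisation or a conditional gradient step) and the score function comes from the unnormalised posterior, but the structure — iteratively adding the point that most reduces the kernel Stein discrepancy — is the same.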