Approximate kernel embeddings of distributions

Presented by: Dino Sejdinovic (University of Oxford)
Date: Tuesday 1st May 2018, 11:00 to 12:00
Venue: INI Seminar Room 2
Abstract: 
Kernel embeddings of distributions, together with the Maximum Mean Discrepancy (MMD), the probability metric they induce, are useful tools for fully nonparametric hypothesis testing and for learning on distributional inputs, i.e. where labels are observed only at an aggregate level. I will give an overview of this framework and describe the use of large-scale approximations to kernel embeddings in two contexts: Bayesian approaches to learning on distributions, and distributional covariate shift, e.g. where measurement noise on the training inputs differs from that on the test inputs.
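
The following Python sketch (not part of the talk, and purely illustrative) shows one standard way to build a large-scale approximation to kernel mean embeddings: random Fourier features approximate a Gaussian-kernel embedding of each sample, and the MMD between two samples is estimated as the Euclidean distance between the approximate embeddings. All function names and parameter choices below are assumptions for the sake of the example.

import numpy as np

def rff_features(X, omega, b):
    """Map samples X of shape (n, d) to random Fourier features of shape (n, D)."""
    D = omega.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ omega + b)

def approximate_mmd(X, Y, num_features=500, lengthscale=1.0, seed=0):
    """Approximate the MMD between samples X and Y under a Gaussian kernel.

    Each sample's mean embedding is approximated by the average of its
    random Fourier feature vectors; the MMD is then the distance between
    the two approximate embeddings.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the spectral density of the Gaussian kernel.
    omega = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    mu_X = rff_features(X, omega, b).mean(axis=0)  # approximate embedding of P
    mu_Y = rff_features(Y, omega, b).mean(axis=0)  # approximate embedding of Q
    return np.linalg.norm(mu_X - mu_Y)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(0.0, 1.0, size=(1000, 2))   # sample from P
    Y = rng.normal(0.5, 1.0, size=(1000, 2))   # sample from Q (shifted mean)
    print("approximate MMD:", approximate_mmd(X, Y))

A small approximate MMD suggests the two samples are hard to distinguish, while a large value indicates a detectable difference between the underlying distributions; the quality of the approximation is controlled by the number of random features.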


