Partial least squares for dependent data

Presented by: Tatyana Krivobokova (Georg-August-Universität Göttingen)
Thursday 10th May 2018 - 11:00 to 12:00
INI Seminar Room 2
We consider the linear and kernel partial least squares algorithms for dependent data and study the consequences of ignoring the dependence, both theoretically and numerically. For the linear partial least squares estimator we derive convergence rates and show that ignoring non-stationary dependence structures can lead to inconsistent estimation. For the kernel partial least squares estimator we establish convergence rates under a source condition and an effective dimensionality condition. It is shown both theoretically and in simulations that long-range dependence results in slower convergence rates. A protein dynamics example illustrates our results and shows the high predictive power of partial least squares.
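As background to the linear estimator discussed above, the following is a minimal sketch of standard linear PLS1 (NIPALS form) for a univariate response; all names are illustrative and this is not the specific estimator or dependence-aware variant analysed in the talk.

```python
import numpy as np

def pls1(X, y, n_components):
    """Linear PLS1 via NIPALS: returns regression coefficients in the
    original (centered) predictor coordinates."""
    X = X - X.mean(axis=0)   # center predictors
    y = y - y.mean()         # center response
    Xk, yk = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                  # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xk @ w                     # score vector
        tt = t @ t
        p = Xk.T @ t / tt              # X loading
        c = (yk @ t) / tt              # y loading
        Xk = Xk - np.outer(t, p)       # deflate predictors
        yk = yk - c * t                # deflate response
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    # map back to coefficients on the original predictors
    return W @ np.linalg.solve(P.T @ W, q)
```

With the number of components equal to the rank of the centered design, PLS1 coincides with ordinary least squares, which gives a quick sanity check on an implementation.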
This is joint work with Marco Singer, Axel Munk and Bert de Groot.
