Statistical scalability

Participation in INI programmes is by invitation only. Anyone wishing to apply to participate in the associated workshop(s) should use the relevant workshop application form.

Big data programme
10th January 2018 to 29th June 2018

 

Organisers: John Aston (Cambridge), Idris Eckley (Lancaster), Paul Fearnhead (Lancaster), Po-Ling Loh (Wisconsin-Madison), Rob Nowak (Wisconsin-Madison), Richard Samworth (Cambridge)

 

Programme Theme

We are living in the information age. Modern technology is transforming our ability to collect and store data on unprecedented scales. From the use of Oyster card data to improve London's transport network, to the Square Kilometre Array astrophysics project that has the potential to transform our understanding of the universe, 'Big Data' can inform and enrich many aspects of our lives. Given the prospect of transformational advances to standard practice across data-rich industries, government agencies, science and technology, it is unsurprising that Big Data currently receives such a high level of media attention.

Of course, the important role of statistics within Big Data has been clear for some time. However, the tendency to date has been to focus purely on algorithmic scalability, for example developing versions of existing statistical algorithms that scale better with the amount of data. Such an approach ignores the fact that fundamentally new issues often arise, and highly innovative solutions are required. In particular, the thesis of this programme is that only by simultaneously considering the methodological, theoretical and computational challenges involved can we hope to provide the robust, scalable methods that are crucial to unlocking the potential of Big Data.
