9.520/6.860: Statistical Learning Theory
and Applications

Fall 2019

Course at a Glance

9.520/6.860, Class 01

Instructor: Tomaso Poggio


Description

We introduce and motivate the main theme of much of the course, setting the problem of supervised learning from examples as the ill-posed problem of approximating a multivariate function from sparse data. We present an overview of the theoretical part of the course and sketch the connection between classical Regularization Theory, with its RKHS-based algorithms, and Learning Theory. We briefly describe several applications, ranging from vision and computer graphics to finance and neuroscience. The last third of the course covers data representations for learning and deep learning. It introduces recent theoretical developments towards a) understanding why deep learning works and b) a new phase in machine learning, beyond classical supervised learning: learning, in an unsupervised way, representations that significantly decrease the sample complexity of supervised learning.
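To make the connection between Regularization Theory and learning concrete, a minimal sketch of Tikhonov regularization in an RKHS (kernel ridge regression with a Gaussian kernel) is given below. All names, the regularization parameter `lam`, and the kernel width `sigma` are illustrative choices, not part of the course material:

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    sq = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return np.exp(-sq / (2 * sigma**2))

def krr_fit(X, y, lam=1e-2, sigma=1.0):
    # Tikhonov regularization in the RKHS: minimize
    #   (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    # By the representer theorem the minimizer has the form
    #   f(x) = sum_i c_i K(x, x_i),  with  c = (K + lam * n * I)^{-1} y.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(X_train, c, X_test, sigma=1.0):
    # Evaluate f at new points via the kernel expansion
    return gaussian_kernel(X_test, X_train, sigma) @ c

# Usage: recover a smooth multivariate-style target from sparse noisy samples
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
c = krr_fit(X, y, lam=1e-3, sigma=0.5)
pred = krr_predict(X, c, X, sigma=0.5)
print(np.mean((pred - y) ** 2))  # small training error
```

The regularization term is what makes the otherwise ill-posed function-approximation problem well-posed: without it, the Gram system can be arbitrarily ill-conditioned and the solution unstable with respect to the data.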

Slides

Slides for this lecture: PDF.

Video

Lecture video recording: Class 01

Relevant Reading