Center for Brains, Minds and Machines (CBMM)


@misc{cbmm9520fall2015,
  title    = {MIT Course 9.520 - Statistical Learning Theory and Applications, Fall 2015},
  author   = {Center for Brains, Minds and Machines (CBMM)},
  year     = {2015},
  url      = {http://www.mit.edu/~9.520/fall15/},
  abstract = {Course description: The class covers the foundations and recent advances of machine learning from the point of view of statistical learning theory. Understanding intelligence and how to replicate it in machines is arguably one of the greatest problems in science. Learning, its principles and computational implementations, is at the very core of intelligence. During the last decade, for the first time, we have been able to develop artificial intelligence systems that can solve complex tasks once considered out of reach: ATMs read checks, cameras recognize faces, smartphones understand your voice, and cars can see and avoid obstacles. The machine learning algorithms at the root of these success stories are trained with labeled examples rather than programmed to solve a task.

Among the approaches in modern machine learning, the course focuses on regularization techniques, which provide a theoretical foundation for high-dimensional supervised learning. Besides classic approaches such as Support Vector Machines, the course covers state-of-the-art techniques exploiting data geometry (a.k.a. manifold learning), sparsity, and a variety of algorithms for supervised learning (batch and online), feature selection, structured prediction, and multitask learning. Concepts from optimization theory useful for machine learning are covered in some detail (first-order methods, proximal/splitting techniques, ...). The final part of the course focuses on deep learning networks. It introduces a theoretical framework connecting the computations within the layers of deep learning networks to kernel machines, and studies an extension of the convolutional layers that deals with more general invariance properties and learns them from implicitly supervised data. This theory of hierarchical architectures may explain how the visual cortex learns, in an implicitly supervised way, data representations that can lower the sample complexity of a final supervised learning stage. The goal of this class is to provide students with the theoretical knowledge and the basic intuitions needed to use and develop effective machine learning solutions to challenging problems.

Prerequisites: We will make extensive use of linear algebra, basic functional analysis (the essentials are covered in class and during the math camp), and basic concepts in probability theory and concentration of measure (also covered in class and during the math camp). Students are expected to be familiar with MATLAB.

Schedule:

| Class    | Date       | Title                                                              | Instructor(s) |
|----------|------------|--------------------------------------------------------------------|---------------|
| Class 01 | Wed Sep 09 | The Course at a Glance                                             | TP            |
| Class 02 | Mon Sep 14 | The Learning Problem and Regularization                            | TP            |
| Class 03 | Wed Sep 16 | Math Camp                                                          | CF/CC         |
| Class 04 | Mon Sep 21 | Reproducing Kernel Hilbert Spaces                                  | LR            |
| Class 05 | Wed Sep 23 | Dictionaries, Feature Maps and Mercer Theorem                      | LR            |
| Class 06 | Mon Sep 28 | Tikhonov Regularization and the Representer Theorem                | LR            |
| Class 07 | Wed Sep 30 | Logistic Regression and Support Vector Machines                    | LR            |
| Class 08 | Mon Oct 05 | Regularized Least Squares                                          | LR            |
| Class 09 | Wed Oct 07 | Iterative Regularization via Early Stopping                        | LR            |
|          | Mon Oct 12 | Columbus Day (no class)                                            |               |
| Class 10 | Tue Oct 13 | Sparsity Based Regularization                                      | LR            |
| Class 11 | Wed Oct 14 | Proximal Methods                                                   | LR            |
| Class 12 | Mon Oct 19 | Structured Sparsity Regularization                                 | LR            |
| Class 13 | Wed Oct 21 | Multiple Kernel Learning                                           | LR            |
| Class 14 | Mon Oct 26 | Generalization Bounds, Intro to Stability                          | CF/TP         |
| Class 15 | Wed Oct 28 | Stability of Tikhonov Regularization                               | CF/TP         |
| Class 16 | Mon Nov 02 | Consistency, Learnability and Regularization                       | LR            |
| Class 17 | Wed Nov 04 | On-line Learning                                                   | LR            |
| Class 18 | Mon Nov 09 | Manifold Regularization                                            | LR            |
|          | Wed Nov 11 | Veterans Day (no class)                                            |               |
| Class 19 | Mon Nov 16 | Regularization for Multi-Output Learning I                         | LR            |
| Class 20 | Wed Nov 18 | Regularization for Multi-Output Learning II                        | CC            |
| Class 21 | Mon Nov 23 | Learning Data Representation: from Fourier to Compressed Sensing   | LR            |
| Class 22 | Wed Nov 25 | Learning Data Representation: Autoencoders and Dictionary Learning | LR            |
| Class 23 | Mon Nov 30 | Learning Data Representation: Deep Neural Networks (DNNs)          | LR            |
| Class 24 | Wed Dec 02 | Learning Data Representation: Deep Theory I                        | TP            |
| Class 25 | Mon Dec 07 | Learning Data Representation: DNN Tips and Tricks                  | Gemma Roig    |
| Class 26 | Wed Dec 09 | Learning Data Representation: Deep Theory II                       | TP            |
}
}
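The course's central technique, Tikhonov regularization with the representer theorem (Classes 06 and 08), can be illustrated with a short sketch. This is not course material, just a minimal numpy example of kernel regularized least squares: by the representer theorem, the minimizer of the regularized empirical risk has the form f(x) = sum_i c_i K(x, x_i), with coefficients solving (K + lambda*n*I) c = y. Function names, the Gaussian kernel choice, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the course's reference code): kernel regularized
# least squares with a Gaussian (RBF) kernel.

def gaussian_kernel(A, B, sigma=1.0):
    """Pairwise Gaussian kernel matrix between rows of A and rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def fit_krls(X, y, lam=1e-2, sigma=1.0):
    """Representer theorem: solve (K + lam*n*I) c = y for the coefficients c."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict_krls(X_train, c, X_new, sigma=1.0):
    """Evaluate f(x) = sum_i c_i K(x, x_i) at the new points."""
    return gaussian_kernel(X_new, X_train, sigma) @ c

# Toy usage: regression of a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
c = fit_krls(X, y)
y_hat = predict_krls(X, c, X)
```

The regularization parameter `lam` trades data fit against the RKHS norm of the solution; taking `lam` too small recovers interpolation (overfitting), while taking it too large flattens the estimate, which is exactly the bias-variance trade-off the course analyzes.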