Regularization Methods for Machine Learning 2016
RegML

regularization_methods_for_machine_learning_2016 (16 files)
Class 1 - Statistical Learning Theory-E1bIqR8Bqr0.mkv 816.82MB
Class 2 - Tikhonov regularization and kernels-xevu1vRdX6w.mp4 576.41MB
Class 3 - Early Stopping and Spectral Regularization-D4C0GfbV4kE.mkv 508.79MB
Class 4 - Regularization for multi-task learning-FX9wIzyhGSA.mp4 392.28MB
Class 5 - Sparsity based regularization-ItfAmoSZRJs.mp4 647.77MB
Class 6 - Structured sparsity-uusYjnAhH98.mkv 659.39MB
Class 7 - Dictionary learning-mqDzeVsiyig.mkv 488.30MB
Class 8 - Deep learning-vRJllrpBao0.mp4 484.25MB
lectures/lec3.pdf 244.75kB
lectures/lec4.pdf 5.91MB
lectures/lec5.pdf 430.89kB
lectures/lec6.pdf 238.18kB
lectures/lec7.pdf 1.36MB
lectures/lec8.pdf 4.60MB
lectures/lect1.pdf 1.80MB
lectures/lect2.pdf 522.21kB
Type: Course
Tags:

Bibtex:
@article{,
title= {Regularization Methods for Machine Learning 2016},
keywords= {},
journal= {},
author= {RegML},
year= {},
url= {http://lcsl.mit.edu/courses/regml/regml2016/},
license= {},
abstract= {Understanding how intelligence works and how it can be emulated in machines is an age-old dream and arguably one of the biggest challenges in modern science. Learning, with its principles and computational implementations, is at the very core of this endeavor. Recently, for the first time, we have been able to develop artificial intelligence systems able to solve complex tasks considered out of reach for decades. Modern cameras recognize faces, smartphones recognize voice commands, cars can see and detect pedestrians, and ATMs automatically read checks. In most cases, at the root of these success stories are machine learning algorithms, that is, software that is trained rather than programmed to solve a task. Among the variety of approaches to modern computational learning, we focus on regularization techniques, which are key to high-dimensional learning. Regularization methods make it possible to treat a huge class of diverse approaches in a unified way, while providing tools to design new ones. Starting from classical notions of smoothness, shrinkage, and margin, the course will cover state-of-the-art techniques based on the concepts of geometry (aka manifold learning) and sparsity, and a variety of algorithms for supervised learning, feature selection, structured prediction, multitask learning, and model selection. Practical applications for high-dimensional problems, in particular in computational vision, will be discussed. The classes will focus on algorithmic and methodological aspects, while trying to give an idea of the underlying theoretical underpinnings. Practical laboratory sessions will give the opportunity to have hands-on experience.


RegML is a 20-hour advanced machine learning course including theory classes and practical laboratory sessions. The course covers foundations as well as recent advances in machine learning, with an emphasis on high-dimensional data and a core set of techniques, namely regularization methods. In many respects the course is a compressed version of the 9.520 course at MIT.

| CLASS | DAY      | TIME          | SUBJECT                                                                                              | FILES  |
|-------|----------|---------------|------------------------------------------------------------------------------------------------------|--------|
| 1     | Mon 6/27 | 9:30 - 11:00  | Introduction to Statistical Machine Learning                                                         | Lect_1 |
| 2     | Mon 6/27 | 11:30 - 13:00 | Tikhonov Regularization and Kernels                                                                  | Lect_2 |
| 3     | Mon 6/27 | 14:00 - 16:00 | Laboratory 1: Binary classification and model selection                                              | Lab 1  |
| 4     | Tue 6/28 | 9:30 - 11:00  | Early Stopping and Spectral Regularization                                                           | Lect_3 |
| 5     | Tue 6/28 | 11:30 - 13:00 | Regularization for Multi-task Learning                                                               | Lect_4 |
| 6     | Tue 6/28 | 14:00 - 16:00 | Laboratory 2: Spectral filters and multi-class classification                                        | Lab 2  |
| -     | Wed 6/29 | 9:30 - 10:00  | Workshop: Federico Girosi - Health Analytics and Machine Learning                                    |        |
| -     | Wed 6/29 | 10:00 - 10:30 | Workshop: Massimiliano Pontil - A Class of Regularizers based on Optimal Interpolation               |        |
| -     | Wed 6/29 | 10:30 - 11:00 | Workshop: Gadi Geiger - Visual and Auditory Aspects of Perception in Developmental Dyslexia          |        |
| -     | Wed 6/29 | 11:00 - 11:30 | Coffee Break                                                                                         |        |
| -     | Wed 6/29 | 11:30 - 12:00 | Workshop: Alessandro Verri - Extracting Biomedical Knowledge through Regularized Learning Techniques |        |
| -     | Wed 6/29 | 12:00 - 12:30 | Workshop: Thomas Vetter - Learning the Appearance of Faces: Probabilistic Morphable Models           |        |
| -     | Wed 6/29 | Afternoon     | Free                                                                                                 |        |
| 7     | Thu 6/30 | 9:30 - 11:00  | Sparsity Based Regularization                                                                        | Lect_5 |
| 8     | Thu 6/30 | 11:30 - 13:00 | Structured Sparsity                                                                                  | Lect_6 |
| 9     | Thu 6/30 | 14:00 - 16:00 | Laboratory 3: Sparsity-based learning                                                                | Lab 3  |
| 10    | Fri 7/1  | 9:30 - 11:00  | Data Representation: Dictionary Learning                                                             | Lect_7 |
| 11    | Fri 7/1  | 11:30 - 13:00 | Data Representation: Deep Learning                                                                   | Lect_8 |},
superseded= {},
terms= {}
}
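
The course's central tool, Tikhonov regularization (Class 2), amounts to penalizing the norm of the solution of a least-squares problem. A minimal sketch of the linear (ridge-regression) case follows; the function name and the choice to scale the penalty by the number of samples are illustrative assumptions, not code taken from the course materials.

```python
import numpy as np

def tikhonov_regression(X, y, lam):
    """Closed-form Tikhonov-regularized least squares (ridge regression).

    Solves min_w (1/n) * ||X w - y||^2 + lam * ||w||^2, whose solution is
    w = (X^T X + lam * n * I)^{-1} X^T y.  (Scaling the penalty by n is
    one common convention; it is an assumption here, not the course's.)
    """
    n, d = X.shape
    return np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ y)

# Example: recover known weights from noiseless data with a tiny penalty.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = tikhonov_regression(X, y, lam=1e-8)
```

As lam grows, the estimate shrinks toward zero (the "shrinkage" notion mentioned in the abstract); choosing lam is exactly the model-selection problem treated in Laboratory 1.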

