Caltech CS156 - Machine Learning - Yaser
Yaser Abu-Mostafa



Yaser-MachineLearning-CS156-Caltech (54 files)
homework/final.pdf  177.64 kB
homework/final_sol.pdf  71.45 kB
homework/hw1.pdf  152.14 kB
homework/hw1_sol.pdf  72.12 kB
homework/hw2.pdf  144.28 kB
homework/hw2_sol.pdf  72.14 kB
homework/hw3.pdf  158.14 kB
homework/hw3_sol.pdf  72.18 kB
homework/hw4.pdf  174.84 kB
homework/hw4_sol.pdf  72.14 kB
homework/hw5.pdf  195.97 kB
homework/hw5_sol.pdf  72.04 kB
homework/hw6.pdf  154.47 kB
homework/hw6_sol.pdf  72.16 kB
homework/hw7.pdf  191.81 kB
homework/hw7_sol.pdf  72.14 kB
homework/hw8.pdf  154.84 kB
homework/hw8_sol.pdf  72.15 kB
lectures/Lecture 01 - The Learning Problem-mbyG85GZ0PI.mp4  204.69 MB
lectures/Lecture 02 - Is Learning Feasible-MEG35RDD7RA.mp4  194.43 MB
lectures/Lecture 03 - The Linear Model I-FIbVs5GbBlQ.mp4  181.36 MB
lectures/Lecture 04 - Error and Noise-L_0efNkdGMc.mp4  150.73 MB
lectures/Lecture 05 - Training Versus Testing-SEYAnnLazMU.mp4  193.68 MB
lectures/Lecture 06 - Theory of Generalization-6FWRijsmLtE.mp4  179.88 MB
lectures/Lecture 07 - The VC Dimension-Dc0sr0kdBVI.mp4  168.73 MB
lectures/Lecture 08 - Bias-Variance Tradeoff-zrEyxfl2-a8.mp4  180.29 MB
lectures/Lecture 09 - The Linear Model II-qSTHZvN8hzs.mp4  212.67 MB
lectures/Lecture 10 - Neural Networks-Ih5Mr93E-2c.mp4  181.29 MB
lectures/Lecture 11 - Overfitting-EQWr3GGCdzw.mp4  197.44 MB
lectures/Lecture 12 - Regularization-I-VfYXzC5ro.mp4  175.37 MB
lectures/Lecture 13 - Validation-o7zzaKd0Lkk.mp4  213.24 MB
lectures/Lecture 14 - Support Vector Machines-eHsErlPJWUU.mp4  177.12 MB
lectures/Lecture 15 - Kernel Methods-XUj5JbQihlU.mp4  192.07 MB
lectures/Lecture 16 - Radial Basis Functions-O8CfrnOPtLc.mp4  191.65 MB
lectures/Lecture 17 - Three Learning Principles-EZBUDG12Nr0.mp4  170.08 MB
lectures/Lecture 18 - Epilogue-ihLwJPHkMRY.mp4  180.87 MB
slides/slides01.pdf  306.77 kB
slides/slides02.pdf  533.86 kB
slides/slides03.pdf  1.12 MB
slides/slides04.pdf  1.02 MB
slides/slides05.pdf  517.01 kB
slides/slides06.pdf  309.56 kB
slides/slides07.pdf  426.17 kB
slides/slides08.pdf  381.68 kB
slides/slides09.pdf  878.12 kB
slides/slides10.pdf  408.37 kB
slides/slides11.pdf  769.88 kB
slides/slides12.pdf  651.92 kB
slides/slides13.pdf  764.88 kB
slides/slides14.pdf  254.77 kB
slides/slides15.pdf  490.63 kB
slides/slides16.pdf  519.61 kB
slides/slides17.pdf  878.50 kB
slides/slides18.pdf  256.88 kB
Type: Course
Tags:

Bibtex:
@article{,
title= {Caltech CS156 - Machine Learning - Yaser},
journal= {},
author= {Yaser Abu-Mostafa},
year= {2012},
url= {http://work.caltech.edu/telecourse.html},
license= {CC BY-NC-ND},
abstract= {## Outline
This is an introductory course in machine learning (ML) that covers the basic theory, algorithms, and applications. ML is a key technology in Big Data, and in many financial, medical, commercial, and scientific applications. It enables computational systems to adaptively improve their performance with experience accumulated from the observed data. ML has become one of the hottest fields of study today, taken up by undergraduate and graduate students from 15 different majors at Caltech. This course balances theory and practice, and covers the mathematical as well as the heuristic aspects. The lectures below follow each other in a story-like fashion:

* What is learning?
* Can a machine learn?
* How to do it?
* How to do it well?
* Take-home lessons.


Lecture 01 - The Learning Problem - Introduction; supervised, unsupervised, and reinforcement learning. Components of the learning problem.

Lecture 02 - Is Learning Feasible? Can we generalize from a limited sample to the entire space? The relationship between in-sample and out-of-sample performance.

Lecture 03 - The Linear Model I - Linear classification and linear regression. Extending linear models through nonlinear transforms.

Lecture 04 - Error and Noise - The principled choice of error measures. What happens when the target we want to learn is noisy.

Lecture 05 - Training versus Testing - The difference between training and testing in mathematical terms. What makes a learning model able to generalize?

Lecture 06 - Theory of Generalization - How an infinite model can learn from a finite sample. The most important theoretical result in machine learning.

Lecture 07 - The VC Dimension - A measure of what it takes for a model to learn. Relationship to the number of parameters and degrees of freedom.

Lecture 08 - Bias-Variance Tradeoff - Breaking down the learning performance into competing quantities. The learning curves.

Lecture 09 - The Linear Model II - More about linear models. Logistic regression, maximum likelihood, and gradient descent.

Lecture 10 - Neural Networks - A biologically inspired model. The efficient backpropagation learning algorithm. Hidden layers.

Lecture 11 - Overfitting - Fitting the data too well; fitting the noise. Deterministic noise versus stochastic noise.

Lecture 12 - Regularization - Putting the brakes on fitting the noise. Hard and soft constraints. Augmented error and weight decay.

Lecture 13 - Validation - Taking a peek out of sample. Model selection and data contamination. Cross validation.

Lecture 14 - Support Vector Machines - One of the most successful learning algorithms; getting a complex model at the price of a simple one.

Lecture 15 - Kernel Methods - Extending SVM to infinite-dimensional spaces using the kernel trick, and to non-separable data using soft margins.

Lecture 16 - Radial Basis Functions - An important learning model that connects several machine learning models and techniques.

Lecture 17 - Three Learning Principles - Major pitfalls for machine learning practitioners; Occam's razor, sampling bias, and data snooping.

Lecture 18 - Epilogue - The map of machine learning. Brief views of Bayesian learning and aggregation methods.},
keywords= {},
terms= {},
superseded= {}
}