## Introduction to Learning: Theory and Algorithms

APM-0EL05-TP

Course 1: Why ML? ML and the broader landscape, ML vs. AI (UML Chapters 1, 2): info, notations, HW1, HW1sol

Course 2: PAC/APAC model of learning, supervised and unsupervised learning as special cases, ERM, No Free Lunch Theorem (UML Chapter 3): HW2, HW2sol

Course 3: Learning through uniform convergence, shattering, VC dimension (UML Chapters 4,5,6): HW3, HW3sol

Course 4: What can/cannot be learned, statistical vs. computational complexity of learning (UML Chapters 6, 7, 8 lightly): HW4

Course 5: Linear classifiers, perceptron, SVM, sufficiency of finite VC dimension (UML Chapters 9, 9.1, 15): HW5
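As a taste of the perceptron algorithm from Course 5, here is a minimal sketch of the classic mistake-driven update rule; the function name, data, and epoch cap are illustrative, not taken from the course materials:

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Classic perceptron for linearly separable data with labels +/-1.
    The bias term is folded in by appending a constant-1 feature."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # add bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi          # perceptron update
                mistakes += 1
        if mistakes == 0:             # a full pass with no mistakes: done
            break
    return w

# Tiny linearly separable example
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -1.0], [-2.0, 1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)  # all four agree with y
```

On separable data the number of updates is bounded by the margin-dependent perceptron convergence theorem, which is the link to the VC-dimension discussion in this course.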

Course 6: Linear regression, logistic regression (UML Chapter 9, see here for sample complexity of linear regression): HW5 Pycode

Course 7: Model selection/validation, K-NN, K-Means (UML Chapters 11, 19, 22): HW6
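For the K-Means topic in Course 7, a compact sketch of Lloyd's algorithm (alternating nearest-center assignment and centroid update); initialization by sampling points and all variable names are illustrative assumptions, not the course's reference implementation:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: assign points to the nearest center, then move
    each center to the mean of its assigned points, until stable."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # init from data
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iters):
        # distance of every point to every center, then nearest-center labels
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centers; keep the old center if a cluster went empty
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):  # converged
            break
        centers = new_centers
    return centers, labels

# Two well-separated groups of points
X = np.array([[0, 0], [0, 1], [1, 0],
              [10, 10], [10, 11], [11, 10]], dtype=float)
centers, labels = kmeans(X, k=2)
```

Note that Lloyd's algorithm only converges to a local optimum of the K-Means objective, which is why initialization matters in practice.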

Course 8 (MICAS only but everybody is welcome!): Boosting, PCA (UML Chapters 10, 23): HW7

Course 9 (MICAS only): Revision and remaining exercises

Exam