Lectures

Tuesday, January 19 Course outline. Probabilistic formulations of prediction problems. Download 01-notes.pdf
Thursday, January 21 Plug-in estimators. Empirical risk minimization. Linear threshold functions. Perceptron algorithm. Download 02-notes.pdf
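As a quick illustration of the perceptron algorithm from this lecture, a minimal sketch (the toy data and epoch limit are illustrative choices, not from the notes):

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Perceptron for labels y in {-1, +1} on an (n, d) feature matrix X.

    Cycles through the sample; on each mistake, adds y[i] * X[i] to the
    weight vector. Stops after a mistake-free pass (guaranteed to happen
    if the data are linearly separable through the origin).
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_epochs):
        mistakes = 0
        for i in range(n):
            if y[i] * (X[i] @ w) <= 0:   # mistake (or on the boundary)
                w += y[i] * X[i]          # update toward the correct side
                mistakes += 1
        if mistakes == 0:                 # converged on the training set
            break
    return w

# Toy separable data: label is the sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.5, -0.5], [-1.0, 0.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
assert (np.sign(X @ w) == y).all()
```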
Tuesday, January 26 Minimax risk bounds. Uniform convergence. Concentration. Download 03-notes.pdf
Thursday, January 28 Hoeffding and Bernstein inequalities. Bounded differences inequality. Download 04-notes.pdf
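For reference, a standard statement of Hoeffding's inequality (conventions and normalization may differ slightly from the notes): for independent random variables $X_1,\dots,X_n$ with $X_i \in [a_i, b_i]$ almost surely, and any $t > 0$,

```latex
\Pr\left(\sum_{i=1}^{n}\bigl(X_i - \mathbb{E}X_i\bigr) \ge t\right)
  \le \exp\left(-\frac{2t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right).
```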
Tuesday, February 2 Uniform laws of large numbers. Glivenko-Cantelli theorem. Rademacher complexity. Download 05-notes.pdf
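For reference, the empirical Rademacher complexity of a class $F$ on a sample $x_1,\dots,x_n$ (a standard definition; the notes' normalization may differ):

```latex
\widehat{\mathcal{R}}_n(F)
  = \mathbb{E}_{\sigma}\left[\sup_{f \in F}
      \frac{1}{n}\sum_{i=1}^{n} \sigma_i f(x_i)\right],
\qquad \sigma_1,\dots,\sigma_n \ \text{i.i.d. uniform on } \{-1,+1\}.
```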
Thursday, February 4 Growth function. Vapnik-Chervonenkis dimension. Sauer's lemma. Download 06-notes.pdf
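The standard form of Sauer's lemma, relating the growth function $\Pi_F$ to the VC dimension $d$ of $F$:

```latex
\Pi_F(n) \;\le\; \sum_{i=0}^{d} \binom{n}{i}
        \;\le\; \left(\frac{en}{d}\right)^{d}
        \qquad (n \ge d \ge 1),
```

so a finite VC dimension forces polynomial, rather than exponential, growth of the number of labelings.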
Tuesday, February 9 VC-dimension of neural networks. Download 07-notes.pdf
Thursday, February 11 Covering numbers, packing numbers. Download 08-notes.pdf
Tuesday, February 16 Chaining. Dudley's entropy integral. Sudakov's theorem. Download 09-notes.pdf
Thursday, February 18 Pollard's pseudodimension. Convex losses for classification. Download 10-notes.pdf
Tuesday, February 23 Kernel methods. Reproducing kernel Hilbert spaces. Mercer's theorem. Download 11-notes.pdf
Thursday, February 25 Support vector machines. Convex optimization. Download 12-notes.pdf
Tuesday, March 1 Soft margin SVMs. Representer theorem. Download 13-notes.pdf
Thursday, March 3 Risk bounds for SVMs: Rademacher averages. Kernel ridge regression. Gaussian process regression. Download 14-notes.pdf
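A minimal sketch of kernel ridge regression as covered here: solve $(K + \lambda I)\alpha = y$ and predict via $f(x) = \sum_i \alpha_i k(x_i, x)$. The RBF kernel, bandwidth, and regularization scaling below are illustrative (the notes may scale the regularizer as $\lambda n$ instead):

```python
import numpy as np

def kernel_ridge(X, y, lam, kernel):
    """Fit kernel ridge regression on training inputs X and targets y.

    Builds the Gram matrix K, solves (K + lam * I) alpha = y, and returns
    the predictor f(x) = sum_i alpha_i k(x_i, x) from the representer theorem.
    """
    K = np.array([[kernel(a, b) for b in X] for a in X])
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda x: sum(a * kernel(xi, x) for a, xi in zip(alpha, X))

# Illustrative RBF kernel with unit bandwidth.
rbf = lambda a, b: np.exp(-np.sum((np.asarray(a) - np.asarray(b)) ** 2))

X = [0.0, 1.0, 2.0, 3.0]
y = np.array([0.0, 1.0, 0.0, -1.0])
f = kernel_ridge(X, y, lam=1e-6, kernel=rbf)
# With tiny lam, the fit nearly interpolates the training targets.
```

The same linear system also gives the posterior mean in Gaussian process regression, with lam playing the role of the noise variance.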
Tuesday, March 8 Online learning. Prediction with expert advice. Exponential weights. Download 15-notes.pdf
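A minimal sketch of the exponential weights forecaster for prediction with expert advice (the loss sequence and learning rate below are illustrative assumptions):

```python
import numpy as np

def exponential_weights(loss_matrix, eta):
    """Run exponential weights over a (T, N) matrix of expert losses in [0, 1].

    Each round the forecaster plays the current distribution over experts
    (incurring the weighted-average loss), then multiplies each expert's
    weight by exp(-eta * loss). Returns the forecaster's total loss.
    """
    T, N = loss_matrix.shape
    w = np.ones(N)
    total_loss = 0.0
    for t in range(T):
        p = w / w.sum()                        # distribution over experts
        total_loss += p @ loss_matrix[t]       # mixture loss this round
        w = w * np.exp(-eta * loss_matrix[t])  # multiplicative update
    return total_loss

# Two experts: expert 0 is always right (loss 0), expert 1 always wrong.
losses = np.tile(np.array([0.0, 1.0]), (50, 1))
alg_loss = exponential_weights(losses, eta=0.5)
best_expert_loss = losses.sum(axis=0).min()
# Regret = alg_loss - best_expert_loss stays bounded: the wrong expert's
# weight decays geometrically, so the forecaster quickly tracks the best one.
```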
Thursday, March 10 Exponential weights as Bayesian prediction. Online to batch. Optimal regret. Download 16-notes.pdf
Tuesday, March 15 The boosting problem. AdaBoost as greedy variational algorithm. Weak learning, strong learning & large margin. Download 17-notes.pdf
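A compact sketch of AdaBoost with axis-aligned decision stumps as the weak learner (the stump search and toy data are illustrative, not the notes' formulation):

```python
import numpy as np

def adaboost(X, y, rounds=10):
    """AdaBoost on an (n, d) dataset with labels y in {-1, +1}.

    Each round fits the decision stump minimizing weighted 0-1 error,
    gives it weight alpha = (1/2) log((1 - err) / err), and reweights the
    sample to upweight mistakes. Returns (feature, threshold, sign, alpha)
    tuples defining the weighted-majority classifier.
    """
    n, d = X.shape
    D = np.ones(n) / n                         # example weights
    ensemble = []
    for _ in range(rounds):
        best = None                            # (error, feature, thr, sign)
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (+1, -1):
                    pred = s * np.where(X[:, j] <= thr, 1, -1)
                    err = D[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, s)
        err, j, thr, s = best
        err = min(max(err, 1e-12), 1 - 1e-12)  # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)  # stump weight
        pred = s * np.where(X[:, j] <= thr, 1, -1)
        D = D * np.exp(-alpha * y * pred)      # upweight the mistakes
        D = D / D.sum()
        ensemble.append((j, thr, s, alpha))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.where(X[:, j] <= thr, 1, -1)
                for j, thr, s, a in ensemble)
    return np.sign(score)

# Toy data: the label is +1 exactly when the first feature is <= 2.
X = np.array([[1.0, 5.0], [2.0, 1.0], [4.0, 2.0], [5.0, 0.0]])
y = np.array([1, 1, -1, -1])
ensemble = adaboost(X, y, rounds=5)
```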
Thursday, March 17 AdaBoost: other losses, multi-class, dual, iterative projection, and convergence. Download 18-notes.pdf
Tuesday, March 22 Spring break (no lecture)
Thursday, March 24 Spring break (no lecture)
Tuesday, March 29 Model selection. Universal consistency of AdaBoost. Optimal regret and the dual game. Download 19-notes.pdf
Thursday, March 31 Sequential Rademacher averages. Download 20-notes.pdf
Tuesday, April 5 Optimal regret for linear games. Online convex optimization. Download 21-notes.pdf
Thursday, April 7 Online convex optimization: gradient method; regularized minimization; Bregman divergence. Download 22-notes.pdf
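A toy illustration of the online gradient method from this lecture, $x_{t+1} = \Pi(x_t - \eta_t \nabla f_t(x_t))$ (the quadratic losses, step sizes, and the projection interval below are illustrative choices):

```python
def online_gradient_descent(loss_grads, x0, eta):
    """Projected online gradient descent on the interval [0, 1].

    loss_grads: one gradient function per round; eta: step-size schedule
    t -> eta_t. Returns the iterate sequence x_1, ..., x_{T+1}.
    """
    xs = [x0]
    x = x0
    for t, grad in enumerate(loss_grads, start=1):
        x = x - eta(t) * grad(x)        # gradient step on the round-t loss
        x = min(max(x, 0.0), 1.0)       # Euclidean projection onto [0, 1]
        xs.append(x)
    return xs

# Round-t loss f_t(x) = (x - z_t)^2, targets alternating 0.2 and 0.8;
# the best fixed action in hindsight is their mean, 0.5.
targets = [0.2, 0.8] * 50
grads = [lambda x, z=z: 2 * (x - z) for z in targets]

# These losses are 2-strongly convex, so the schedule eta_t = 1/(2t)
# applies; here it makes x_{t+1} exactly the running mean of z_1..z_t.
xs = online_gradient_descent(grads, x0=0.0, eta=lambda t: 1.0 / (2 * t))
```

With this schedule the final iterate lands on the hindsight optimum 0.5, matching the logarithmic-regret regime for strongly convex losses covered later in the course.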
Tuesday, April 12 Online convex optimization: mirror descent. Download 23-notes.pdf
Thursday, April 14 Online convex optimization: regret bounds for mirror descent. Download 24-notes.pdf
Tuesday, April 19 Logarithmic regret with strongly convex losses. Download 25-notes.pdf
Thursday, April 21 Adaptive online optimization: adapting to strong convexity. Adagrad. Download 26-notes.pdf
Tuesday, April 26 Final project presentations
Thursday, April 28 Final project presentations