Machine Learning Algorithms

WARNING:

The following session has been moved from Friday, 10 February 2023 to Monday, 6 February 2023. Check the master AI schedule page.

Lecture 1

  • ML refresher

    • Problem types: regression, binary classification, multiclass classification

    • Linear models

    • Regularization

  • Notations and general framework used in the course

  • Optimization problems

  • Binary classification: 0-1 loss and convex surrogates (compared numerically in the sketch after this list)

  • Probabilistic prediction and losses
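
To make the surrogate idea concrete, here is a minimal sketch (in Python with NumPy, not part of the course material; the margin values are toy numbers chosen for illustration) comparing the 0-1 loss with two common convex surrogates, the hinge and logistic losses:

    import numpy as np

    # A few illustrative margins m = y * f(x); positive means correct classification.
    margins = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

    zero_one = (margins <= 0).astype(float)    # 0-1 loss (counting margin 0 as an error)
    hinge    = np.maximum(0.0, 1.0 - margins)  # hinge loss, the SVM surrogate
    logistic = np.log1p(np.exp(-margins))      # logistic loss, the logistic-regression surrogate

    for m, z, h, l in zip(margins, zero_one, hinge, logistic):
        print(f"margin={m:+.1f}  0-1={z:.0f}  hinge={h:.2f}  logistic={l:.3f}")

The hinge loss upper-bounds the 0-1 loss, and both surrogates are convex in the margin, which is what makes the resulting optimization problems tractable.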

Lecture 2

  • Convex sets, convex functions

  • Subgradients

  • Indicator function, extended-value extension

  • Optimality conditions (Fermat's theorem)

  • Fenchel conjugates (definition and a worked example below)

Download slides

Download exercises
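
As a reminder of the central definition of this lecture, the Fenchel conjugate of a function f is given below (in LaTeX notation; the worked example is a standard illustration added here, not taken from the slides):

    f^*(y) \;=\; \sup_{x \in \operatorname{dom} f} \bigl( \langle y, x \rangle - f(x) \bigr)

    % Worked example: for f(x) = \tfrac{1}{2}\lVert x \rVert_2^2 the supremum is
    % attained at x = y, so f^*(y) = \tfrac{1}{2}\lVert y \rVert_2^2, i.e. f is self-conjugate.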

Lecture 3

  • Lagrangian relaxation

  • KKT optimality conditions

  • Lagrangian duality, Fenchel duality

  • Support Vector Machines and duality (primal and dual forms recalled below)

Download slides

If you are interested in SVMs and kernels, check out the book “Learning with Kernels” (Schölkopf and Smola).
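
For reference, the soft-margin linear SVM and its Lagrangian dual can be written as follows (standard formulation, sketched here in LaTeX; C denotes the regularization parameter and (x_i, y_i) the training pairs):

    % Primal problem
    \min_{w,\,b,\,\xi}\; \tfrac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \xi_i
    \quad\text{s.t.}\quad y_i (w^\top x_i + b) \ge 1 - \xi_i,\;\; \xi_i \ge 0

    % Dual problem, obtained from the Lagrangian and the KKT conditions
    \max_{\alpha}\; \sum_{i=1}^{n} \alpha_i
      - \tfrac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j\, x_i^\top x_j
    \quad\text{s.t.}\quad 0 \le \alpha_i \le C,\;\; \sum_{i=1}^{n} \alpha_i y_i = 0

The dual depends on the data only through the inner products x_i^T x_j, which is what opens the door to the kernels discussed in the book above.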

Lecture 4

  • (Sub)gradient descent

  • Proximal method and projected (sub)gradient descent (see the proximal gradient sketch below)

  • Coordinate descent

Download slides
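
The methods of this lecture compose naturally; as one illustration, here is a minimal proximal gradient (ISTA) sketch for the lasso in Python with NumPy (the function names, step-size choice, and toy data are assumptions made for this example, not course code):

    import numpy as np

    def soft_threshold(v, t):
        """Proximal operator of t * ||.||_1 (soft-thresholding)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(X, y, lam, n_iters=500):
        """Proximal gradient descent for min_w 0.5 * ||X w - y||^2 + lam * ||w||_1."""
        w = np.zeros(X.shape[1])
        step = 1.0 / np.linalg.norm(X, 2) ** 2       # 1/L, L = Lipschitz constant of the gradient
        for _ in range(n_iters):
            grad = X.T @ (X @ w - y)                 # gradient of the smooth least-squares part
            w = soft_threshold(w - step * grad, step * lam)  # proximal step on the l1 part
        return w

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 20))
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.0, 0.5]
    y = X @ w_true + 0.1 * rng.normal(size=50)
    print(np.round(ista(X, y, lam=1.0), 2))

Replacing the soft-thresholding step with a Euclidean projection onto a constraint set turns the same loop into projected (sub)gradient descent.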

Lecture 5

  • Another look at the softmax function and the negative log-likelihood loss (see the sketch at the end of this page)

  • Regularized prediction functions

  • Fenchel-Young losses
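
To connect the first two bullets, here is a minimal NumPy sketch of the (numerically stabilized) softmax and the associated negative log-likelihood loss; the toy scores are an assumption made for illustration:

    import numpy as np

    def softmax(scores):
        """Numerically stable softmax over the last axis."""
        shifted = scores - scores.max(axis=-1, keepdims=True)
        exp = np.exp(shifted)
        return exp / exp.sum(axis=-1, keepdims=True)

    def nll_loss(scores, label):
        """Negative log-likelihood of the true class under the softmax distribution:
        log-sum-exp of the scores minus the score of the true class."""
        shifted = scores - scores.max()
        return np.log(np.exp(shifted).sum()) - shifted[label]

    scores = np.array([2.0, 0.5, -1.0])   # hypothetical class scores for 3 classes
    print(softmax(scores))
    print(nll_loss(scores, label=0))

In the Fenchel-Young framework, this multinomial logistic loss is the loss generated by the negative Shannon entropy, and softmax is the corresponding regularized prediction function.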