Previous exams:
2021-2022
Warning: no documents are allowed at the exam!
Convex analysis:
Convex Optimization (Boyd and Vandenberghe)
First-order optimization methods (Beck)
Probabilities:
Tutorial by Wilker Aziz and Philip Schulz
Probability & Statistics with Applications to Computing (Alex Tsun)
Lecture Notes for Statistics (John Duchi) — version “March 7, 2019”
Probabilistic graphical models: principles and techniques (Koller and Friedman)
Globally normalized models / energy networks:
For globally normalized models, see Section 5, "Energy-Based Models and Boltzmann Machines" (p. 26), of Learning Deep Architectures for AI (Bengio)
Tutorial on energy networks (warning: this covers energy networks in general, not generative models specifically): A Tutorial on Energy-Based Learning (LeCun et al.)
If you want to learn more about MCMC, see the technical report Probabilistic Inference Using Markov Chain Monte Carlo Methods (Neal)
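The key point of the MCMC methods surveyed by Neal is that sampling only requires an unnormalized density, which is why they pair naturally with globally normalized (energy-based) models. A minimal random-walk Metropolis sketch (the function names and step size are illustrative choices, not from the report):

```python
import numpy as np

def metropolis(log_density, x0, n_steps, step=1.0, seed=0):
    # Random-walk Metropolis: propose a Gaussian perturbation, accept
    # with probability min(1, p(x') / p(x)), computed in log space.
    rng = np.random.default_rng(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + step * rng.normal()
        log_ratio = log_density(proposal) - log_density(x)
        if np.log(rng.uniform()) < log_ratio:
            x = proposal
        samples.append(x)
    return np.array(samples)

# Unnormalized log-density of a standard normal: the normalizing
# constant cancels in the acceptance ratio, so it is never needed.
log_p = lambda x: -0.5 * x**2

samples = metropolis(log_p, x0=0.0, n_steps=20000)
print(samples[2000:].mean(), samples[2000:].std())  # roughly 0 and 1
```

Discarding the first 2000 draws is a simple burn-in heuristic; Neal's report discusses convergence diagnostics in much more depth.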
Normalizing flows:
Normalizing Flows: An Introduction and Review of Current Methods (Kobyzev et al.)
Normalizing Flows for Probabilistic Modeling and Inference (Papamakarios et al.)
NICE: Non-linear Independent Components Estimation (Dinh et al.)
Density estimation using Real NVP (Dinh et al.)
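The building block shared by NICE and Real NVP is the coupling layer, which is invertible by construction. A minimal sketch of NICE-style additive coupling (numpy only; the half-split partition and the `shift` function are placeholder choices, not the papers' architectures):

```python
import numpy as np

def coupling_forward(x, shift_fn):
    # NICE-style additive coupling: split the input in two halves.
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    # First half passes through unchanged; second half is shifted by
    # an arbitrary function of the first half.
    y1 = x1
    y2 = x2 + shift_fn(x1)
    return np.concatenate([y1, y2], axis=-1)

def coupling_inverse(y, shift_fn):
    # Exact inverse: recompute the same shift and subtract it.
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    return np.concatenate([y1, y2 - shift_fn(y1)], axis=-1)

# Placeholder "network": invertibility of the layer does not require
# shift_fn itself to be invertible, which is the trick.
shift = lambda h: np.tanh(h @ np.full((2, 2), 0.5))

x = np.random.default_rng(0).normal(size=(4, 4))
y = coupling_forward(x, shift)
x_rec = coupling_inverse(y, shift)
assert np.allclose(x, x_rec)  # exact inversion
```

Additive coupling has unit Jacobian determinant, so the log-density change is zero; Real NVP adds a scaling term to make the Jacobian non-trivial while keeping it triangular.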
Generative Adversarial Networks:
Generative Adversarial Networks (Goodfellow et al.)
NeurIPS 2016 Tutorial: Generative Adversarial Networks (Goodfellow)
f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization (Nowozin et al.)
Others:
A very interesting paper on the EM algorithm: Neal and Hinton (1993)
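For concreteness, here is the textbook instance of EM that the Neal and Hinton view generalizes: maximum-likelihood fitting of a two-component 1-D Gaussian mixture (a standard example, not the paper's incremental variant; the deterministic min/max initialization is my own choice):

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    # EM for a two-component 1-D Gaussian mixture (maximum likelihood).
    mu = np.array([x.min(), x.max()])  # deterministic spread-out init
    var = np.ones(2) * x.var()
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] proportional to
        # pi_k * N(x_n | mu_k, var_k), normalized over k.
        diff = x[:, None] - mu[None, :]
        logp = np.log(pi) - 0.5 * (np.log(2 * np.pi * var) + diff**2 / var)
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
pi, mu, var = em_gmm_1d(x)
print(sorted(mu))  # roughly [-2, 3]
```

Neal and Hinton's contribution is to read both steps as coordinate ascent on a single free-energy objective, which licenses partial and incremental E-steps.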