- Rémi Bardenet (CNRS, Univ. Lille), https://rbardenet.github.io
- Gabriel Victorino Cardoso (Mines Paris - PSL Univ.), https://gabrielvc.github.io/
- Julyan Arbel (Inria, Univ. Grenoble-Alpes), https://www.julyanarbel.com
- Incomplete and drafty lecture notes are available in the notes folder. Any comments are welcome, live or as a raised issue.
- Practicals and exercises will be made available in the corresponding folders. They are to be done on a voluntary basis; solutions will be provided on demand.
- Here are the annotated slides for Lecture 1.
- Annotated slides for Lecture 2 are here and here.
- Annotated slides for Lecture 3 are here.
- Slides for Lecture 4 are in the slides folder. We covered almost everything up to Doob's theorem (slide 131).
- Annotated slides for Lecture 5 are here and here.
- Slides for Lecture 6 are here, and slides for Lecture 7 are in the slides folder.
By the end of the course, the students should
- have a high-level view of the main approaches to making decisions under uncertainty.
- be able to detect when being Bayesian helps and why.
- be able to design and run a Bayesian ML pipeline for standard supervised or unsupervised learning.
- have a global view of the current limitations of Bayesian approaches and the research landscape.
- be able to understand the abstract of most Bayesian ML papers.
- Decision theory
- 50 shades of Bayes: Subjective and objective interpretations
- Bayesian supervised and unsupervised learning
- Bayesian computation for ML: Advanced Monte Carlo and variational methods
- Bayesian nonparametrics
- Generative models
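To give a flavour of the "Advanced Monte Carlo" part of the syllabus, here is a minimal, self-contained sketch of a random-walk Metropolis sampler (the simplest MCMC algorithm covered in courses like this one); the target density, step size, and seed below are illustrative choices, not course material.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1D log-density (up to a constant)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Toy target: standard normal posterior, log-density up to a constant.
log_post = lambda x: -0.5 * x * x
chain = metropolis_hastings(log_post, x0=0.0, n_steps=20000)
burned = chain[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((x - mean) ** 2 for x in burned) / len(burned)
```

After burn-in, the empirical mean and variance of the chain should be close to those of the standard normal target (0 and 1).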
| Date | Lecture | Prof |
|---|---|---|
| 15/1 | Bayesics | Rémi Bardenet |
| 22/1 | MCMC | Rémi Bardenet |
| 29/1 | Variational Bayes | Rémi Bardenet |
| 5/2 | Bayesian nonparametrics | Julyan Arbel |
| 12/2 | Foundations | Rémi Bardenet |
| 19/2 | Generative models 1 | Gabriel Victorino Cardoso |
| 26/2 | Generative models 2 | Gabriel Victorino Cardoso |
| 12/3 | Student seminar | Rémi, Gabriel, Julyan |
- An undergraduate course in probability.
- It is recommended to have followed either "Probabilistic graphical models" or "Computational statistics" during the first semester.
- 8x3 hours of lectures, the last session being a student seminar.
- All classes and all material will be in English. Students may write their final report in either French or English.
- The course takes place at École des Mines, parc du Luxembourg, in 2026.
- There will be a light form of continuous evaluation this year.
- For the main evaluation, students form groups. Each group reads and reports on a research paper from a list. We strongly encourage a dash of creativity: students should identify a weak point, shortcoming or limitation of the paper, and try to push in that direction. This can mean extending a proof, implementing another feature, investigating different experiments, etc. Deliverables are a small report and a short oral presentation in front of the class, in the form of a student seminar, which will take place during the last lecture.
- Parmigiani, G. and Inoue, L. 2009. Decision theory: principles and approaches. Wiley.
- Robert, C. 2007. The Bayesian choice. Springer.
- Murphy, K. 2023. Probabilistic Machine Learning: Advanced Topics. MIT Press. PDF available at this link.
- Ghosal, S., & Van der Vaart, A. W. 2017. Fundamentals of nonparametric Bayesian inference. Cambridge University Press.