Public repo for course material on Bayesian machine learning at ENS Paris-Saclay and Univ Lille

rbardenet/bml-course


Bayesian machine learning

Lecturers

Material

  • Incomplete, draft lecture notes are available in the notes folder. Comments are welcome, either in class or as a GitHub issue.
  • Practicals and exercises will be made available in the corresponding folders. They are optional; solutions will be provided on demand.
  • Annotated slides for Lecture 1 are here.
  • Annotated slides for Lecture 2 are here and here.
  • Annotated slides for Lecture 3 are here.
  • Slides for Lecture 4 are in the slides folder. We covered most of the material up to Doob's theorem (slide 131).
  • Annotated slides for Lecture 5 are here and here.
  • Slides for Lecture 6 are here, and slides for Lecture 7 are in the slides folder.

Objective of the course

By the end of the course, the students should

  • have a high-level view of the main approaches to making decisions under uncertainty.
  • be able to detect when being Bayesian helps and why.
  • be able to design and run a Bayesian ML pipeline for standard supervised or unsupervised learning.
  • have a global view of the current limitations of Bayesian approaches and the research landscape.
  • be able to understand the abstract of most Bayesian ML papers.

Topics

  • Decision theory
  • 50 shades of Bayes: Subjective and objective interpretations
  • Bayesian supervised and unsupervised learning
  • Bayesian computation for ML: Advanced Monte Carlo and variational methods
  • Bayesian nonparametrics
  • Generative models
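As a small taste of the "Bayesian computation" topic above, here is a minimal, purely illustrative sketch (not part of the official course material): a random-walk Metropolis sampler for the posterior of a coin's bias, assuming a uniform prior and 6 heads observed in 9 tosses. The model, numbers, and function names are chosen for illustration only.

```python
import math
import random

def log_posterior(theta, k=6, n=9):
    """Log of the Beta(k+1, n-k+1) posterior, up to an additive constant:
    uniform prior on theta, binomial likelihood with k heads in n tosses."""
    if not 0.0 < theta < 1.0:
        return -math.inf
    return k * math.log(theta) + (n - k) * math.log(1.0 - theta)

def metropolis(log_target, x0=0.5, n_samples=20000, step=0.2, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2), accept with
    probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_target(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject in log space
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

samples = metropolis(log_posterior)[2000:]  # discard burn-in
post_mean = sum(samples) / len(samples)
print(post_mean)  # close to the analytic posterior mean 7/11
```

Because this posterior is conjugate (Beta-Binomial), the sampler's estimate can be checked against the closed-form mean, which is the usual sanity check before moving to models where no closed form exists.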

Outline

Date   Lecture                   Prof
15/1   Bayesics                  Rémi Bardenet
22/1   MCMC                      Rémi Bardenet
29/1   Variational Bayes         Rémi Bardenet
5/2    Bayesian nonparametrics   Julyan Arbel
12/2   Foundations               Rémi Bardenet
19/2   Generative models 1       Gabriel Victorino Cardoso
26/2   Generative models 2       Gabriel Victorino Cardoso
12/3   Student seminar           Rémi, Gabriel, Julyan

Prerequisites

  • An undergraduate course in probability.
  • It is recommended to have followed either "Probabilistic graphical models" or "Computational statistics" during the first semester.

Organization of courses

  • 8×3 hours of lectures; the last session is a student seminar.
  • All classes and all material will be in English. Students may write their final report in either French or English.
  • The course takes place at École des Mines, near the Jardin du Luxembourg, in 2026.

Validation

  • There will be a light form of continuous evaluation this year.
  • For the main evaluation, students form groups. Each group reads and reports on a research paper from a list. We strongly encourage a dash of creativity: identify a weak point, shortcoming, or limitation of the paper and try to push in that direction, e.g. by extending a proof, implementing another feature, or investigating different experiments. The deliverables are a short report and a short oral presentation in front of the class, in the form of a student seminar held during the last lecture.

References

  • Parmigiani, G. and Inoue, L. 2009. Decision theory: principles and approaches. Wiley.
  • Robert, C. 2007. The Bayesian choice. Springer.
  • Murphy, K. 2023. Probabilistic Machine Learning: Advanced Topics. MIT Press. PDF available at this link.
  • Ghosal, S., & Van der Vaart, A. W. 2017. Fundamentals of nonparametric Bayesian inference. Cambridge University Press.
