- Tutorial notes are available! [ Part I ] [ Part I (annotated) ] [ Part II ] [ Part II (annotated) ]
- During the tutorial, we will annotate (i.e., write on) these lecture notes, and attendees are encouraged to do the same. Please print the notes beforehand (4 slides per page to save paper), or add notes directly to the electronic copy on your tablet/iPad.
- Around 20-25% of the tutorial consists of programming exercises, so don't forget to bring your laptop! The code is available at this GitHub repo.
- We will use Python 3 for all exercises. The following packages are required: numpy, scipy, torch, matplotlib, jupyter, ipywidgets.
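The environment above can be set up with pip, for example (assuming Python 3 and pip are already installed; a virtual environment is optional but recommended):

```shell
# Install the packages used in the exercises (Python 3)
python3 -m pip install numpy scipy torch matplotlib jupyter ipywidgets
```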

You will learn the following topics:

- What is Bayesian inference? When and where is it useful, and why? Why is it computationally challenging for the following models? [ Download notes here ] [ Annotated notes here ]
- Bayesian linear regression
- Bayesian logistic regression
- Bayesian neural networks
- Gaussian processes
- What are some (old and new) methods to solve these challenges? [ Download notes here ] [ Part II (annotated) ]
- (old) Laplace Approximation
- (old) Variational Inference (mean-field, VMP)
- (new) Stochastic gradient methods (BBVI)
- (new) Natural gradient methods (SVI, CVI)
- (new) Methods for Bayesian deep learning (BBB, Vadam)
- (new) Variational Auto-Encoders
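As a taste of the first topic, here is a minimal sketch of Bayesian linear regression in NumPy. It computes the closed-form Gaussian posterior over the weights under a zero-mean isotropic Gaussian prior (precision `alpha`) and Gaussian observation noise (precision `beta`); the function name, toy data, and hyperparameter values are illustrative choices, not part of the tutorial code.

```python
import numpy as np

def blr_posterior(Phi, t, alpha=1.0, beta=25.0):
    """Posterior N(m_N, S_N) over weights for Bayesian linear regression
    with prior N(0, alpha^{-1} I) and noise precision beta.

    S_N = (alpha I + beta Phi^T Phi)^{-1},  m_N = beta S_N Phi^T t
    """
    D = Phi.shape[1]
    S_N_inv = alpha * np.eye(D) + beta * Phi.T @ Phi
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

# Toy data: t = 2x + noise, with design matrix columns [1, x]
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
t = 2.0 * x + 0.1 * rng.standard_normal(50)
Phi = np.stack([np.ones_like(x), x], axis=1)

m_N, S_N = blr_posterior(Phi, t)  # posterior mean and covariance
```

The posterior mean `m_N` plays the role of a regularized least-squares fit, while `S_N` quantifies the remaining uncertainty in the weights; both appear in the first programming exercise.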

We will have the following four programming exercises (around 4 hours).

- Exercise on Bayesian linear regression; password for solutions: 05524.
- Exercise on Laplace's method; password for solutions: 86610.
- Exercise on variational inference (VI); password for solutions: 00192.
- Exercise on VI for Bayesian neural networks; password for solutions: 6422.
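To illustrate the idea behind the second exercise, here is a small sketch of Laplace's method: approximate a posterior by a Gaussian centred at its mode, with variance given by the negative inverse Hessian of the log-density at that mode. The Beta example and function name are illustrative choices, not the exercise code.

```python
import numpy as np

def laplace_beta(a, b):
    """Gaussian (Laplace) approximation to a Beta(a, b) density (a, b > 1).

    log p(theta) = (a-1) log theta + (b-1) log(1-theta) + const
    Mode: theta* = (a-1)/(a+b-2)
    Variance: inverse of -d^2/dtheta^2 log p evaluated at theta*.
    """
    mode = (a - 1) / (a + b - 2)
    # Negative second derivative of log p at the mode
    neg_hess = (a - 1) / mode**2 + (b - 1) / (1 - mode)**2
    return mode, 1.0 / neg_hess

mu, var = laplace_beta(3, 3)  # symmetric case: mode at 0.5
```

The same recipe scales to models without closed-form posteriors (e.g. Bayesian logistic regression): find the MAP estimate by optimization, then take the Hessian of the negative log-posterior at that point as the precision of the Gaussian approximation.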

You may also want to revise the following concepts: