Approximate Bayesian Inference: Old and New

A 2-day tutorial at the DS3 workshop held at École Polytechnique, Palaiseau
Taught by Emtiyaz Khan, AIP, RIKEN
TAs: Aaron Mishkin (UBC) and Didrik Nielsen (AIP, RIKEN)

Information

  • Tutorial notes are available! [ Part I ] [ Part I (annotated) ] [ Part II ] [ Part II (annotated) ]
  • During the tutorial, we will annotate (i.e., write on) these lecture notes, and attendees are advised to do the same. Please print the notes beforehand (4 slides per page to save paper), or add notes directly to the electronic copy on your tablet/iPad.
  • Around 20-25% of the tutorial consists of programming exercises, so don't forget to bring your laptop! The code is available at this GitHub repo.
  • We will use Python 3 for all exercises. The following packages are required: numpy, scipy, torch, matplotlib, jupyter, ipywidgets.

What will you learn?

You will learn the following topics:
  • What is Bayesian Inference? When and where is it useful? And Why? Why is it computationally challenging for these models? [ Download notes here ] [ Annotated notes here ]
    • Bayesian linear regression
    • Bayesian logistic regression
    • Bayesian neural networks
    • Gaussian processes
  • What are some (old and new) methods to solve these challenges? [ Download notes here ] [ Annotated notes here ]
    • (old) Laplace Approximation
    • (old) Variational Inference (mean-field, VMP)
    • (new) Stochastic gradient methods (BBVI)
    • (new) Natural gradient methods (SVI, CVI)
    • (new) Methods for Bayesian deep learning (BBB, Vadam)
    • (new) Variational Auto-Encoders
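
To give a flavor of the first "old" method above, here is a minimal sketch of the Laplace approximation for Bayesian logistic regression: find the MAP estimate of the weights, then fit a Gaussian whose covariance is the inverse Hessian of the negative log joint at the MAP. All function and variable names here are illustrative, not taken from the tutorial's code.

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neg_log_joint(w, X, y, alpha=1.0):
    # Negative log of p(y | w) p(w), with a N(0, alpha^{-1} I) prior and y in {0, 1}.
    z = X @ w
    log_lik = np.sum(y * z - np.logaddexp(0, z))  # Bernoulli log-likelihood
    log_prior = -0.5 * alpha * w @ w
    return -(log_lik + log_prior)

def laplace_fit(X, y, alpha=1.0):
    d = X.shape[1]
    # Step 1: MAP estimate by minimizing the negative log joint.
    w_map = minimize(neg_log_joint, np.zeros(d), args=(X, y, alpha)).x
    # Step 2: Gaussian approximation N(w_map, H^{-1}), where H is the Hessian
    # of the negative log joint: alpha*I + X^T diag(p(1-p)) X.
    p = sigmoid(X @ w_map)
    H = alpha * np.eye(d) + X.T @ (X * (p * (1 - p))[:, None])
    return w_map, np.linalg.inv(H)

# Illustrative synthetic data: 200 points, true weights [1, -2].
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (rng.uniform(size=200) < sigmoid(X @ np.array([1.0, -2.0]))).astype(float)
w_map, cov = laplace_fit(X, y)
```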

We will have the following four programming exercises (around 4 hours).
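
As a warm-up for the exercises, the simplest of the models above, Bayesian linear regression, has a closed-form Gaussian posterior. The sketch below (with an illustrative prior precision `alpha` and known noise precision `beta`; names are not from the tutorial code) computes it with numpy:

```python
import numpy as np

def posterior(X, y, alpha=1.0, beta=25.0):
    # Prior w ~ N(0, alpha^{-1} I), likelihood y ~ N(Xw, beta^{-1} I).
    # Posterior precision: alpha*I + beta * X^T X; posterior mean: beta * S * X^T y.
    d = X.shape[1]
    S_inv = alpha * np.eye(d) + beta * X.T @ X
    S = np.linalg.inv(S_inv)
    m = beta * S @ X.T @ y
    return m, S

# Illustrative synthetic data: intercept 0.5, slope -1.0, noise std 0.2 (beta = 1/0.2^2 = 25).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, 50)])
w_true = np.array([0.5, -1.0])
y = X @ w_true + rng.normal(0, 0.2, 50)
m, S = posterior(X, y)
```

With enough data the posterior mean concentrates near the true weights, while the covariance `S` quantifies the remaining uncertainty.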

You may also want to revise the following concepts: