This is the web page for the Bayesian Data Analysis course at Aalto (CS-E5710) by Aki Vehtari.
Aalto students should also check the MyCourses announcements. In Autumn 2020 the course will be arranged completely online. This web page will be updated frequently during August.
All the course material is available in a git repo (these pages are just for easier navigation). All the material can be used in other courses. Text (except the BDA3 book) and videos are licensed under CC-BY-NC 4.0. Code is licensed under BSD-3.
The electronic version of the course book Bayesian Data Analysis, 3rd ed, by Andrew Gelman, John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin is available for non-commercial purposes. Hard copies are available from the publisher and many bookstores. See also the home page for the book, the errata for the book, and the chapter notes.
Prerequisites
- Basic terms of probability theory
- Some algebra and calculus
- Basic visualisation techniques (R or Python)
This course has been designed with a strong emphasis on the computational aspects of Bayesian data analysis and on using the latest computational tools.
If you find BDA3 too difficult to start with, I recommend
Course contents following BDA3
Bayesian Data Analysis, 3rd ed, by Andrew Gelman, John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin. Home page for the book. Errata for the book. Electronic edition for non-commercial purposes only.
- Background (Ch 1, Lecture 1)
- Single-parameter models (Ch 2, Lecture 2)
- Multiparameter models (Ch 3, Lecture 3)
- Computational methods (Ch 10, Lecture 4)
- Markov chain Monte Carlo (Chs 11-12, Lectures 5-6)
- Extra material for Stan and probabilistic programming (see below, Lecture 6)
- Hierarchical models (Ch 5, Lecture 7)
- Model checking (Ch 6, Lectures 8-9)
- Evaluating and comparing models (Ch 7)
- Decision analysis (Ch 9, Lecture 10)
- Large sample properties and Laplace approximation (Ch 4, Lectures 11-12)
- In addition, you learn a workflow for Bayesian data analysis
How to study
The recommended way to go through the material is:
- Read the reading instructions for a chapter in the chapter notes.
- Read the chapter in BDA3 and check that you find the terms listed in the reading instructions.
- Watch the corresponding lecture video for explanations of the most important parts.
- Read the corresponding additional information in the chapter notes.
- Run the corresponding demos in R demos or Python demos.
- Read the exercise instructions and complete the corresponding assignments. The demo codes in R demos and Python demos have many useful examples for handling data and plotting figures. If you have problems, visit the TA sessions or ask in the course Slack channel.
- If you want to learn more, also do the self-study exercises listed below.
Slides and chapter notes
- Slides
  - including code for reproducing some of the figures
- Chapter notes
  - including reading instructions highlighting the most important parts and terms
The following video motivates why computational probabilistic methods and probabilistic programming are an important part of modern Bayesian data analysis.
Short video clips on selected introductory topics are available in a Panopto folder and listed below.
2019 fall lecture videos are in a Panopto folder and listed below.
- Lecture 2.1 and Lecture 2.2 on basics of Bayesian inference, observation model, likelihood, posterior and binomial model, predictive distribution and benefit of integration, priors and prior information, and one parameter normal model. BDA3 Ch 1+2.
- Extra 2, recorded in 2020, with extra explanations about likelihood, normalization term, density, and conditioning on model M.
- Lecture 3 on multiparameter models, joint, marginal and conditional distribution, normal model, bioassay example, grid sampling and grid evaluation. BDA3 Ch 3.
- Lecture 4.1 on numerical issues, Monte Carlo, how many simulation draws are needed, how many digits to report, and Lecture 4.2 on direct simulation, curse of dimensionality, rejection sampling, and importance sampling. BDA3 Ch 10.
- Lecture 5.1 on Markov chain Monte Carlo, Gibbs sampling, Metropolis algorithm, and Lecture 5.2 on warm-up, convergence diagnostics, R-hat, and effective sample size. BDA3 Ch 11.
- Lecture 6.1 on HMC, NUTS, dynamic HMC and HMC specific convergence diagnostics, and Lecture 6.2 on probabilistic programming and Stan. BDA3 Ch 12 + extra material.
- Lecture 7.1 on hierarchical models, and Lecture 7.2 on exchangeability. BDA3 Ch 5.
- Project work info
- Lecture 8.1 on model checking, and Lecture 8.2 on cross-validation part 1. BDA3 Ch 6-7 + extra material.
- Lecture 9.1 on PSIS-LOO and K-fold-CV, Lecture 9.2 on model comparison and selection, and Lecture 9.3, an extra lecture on variable selection with projection predictive variable selection. Extra material.
- Lecture 10.1 on decision analysis. BDA3 Ch 9.
- Project presentation info
- Lecture 11.1 on normal approximation (Laplace approximation) and Lecture 11.2 on large sample theory and counter examples. BDA3 Ch 4.
- Lecture 12.1 on frequency evaluation, hypothesis testing, and variable selection, and Lecture 12.2, an overview of modeling data collection (BDA3 Ch 8), linear models (BDA3 Ch 14-18), and lasso, horseshoe, and Gaussian processes (BDA3 Ch 21).
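To give a taste of the grid evaluation and grid sampling covered in Lectures 3-4, here is a minimal sketch in Python (analogous in spirit to the course's R/Python demos; the data values and variable names are made up for illustration):

```python
import numpy as np

# Binomial model: n trials, y successes (made-up illustrative data)
y, n = 7, 10

# Evaluate the unnormalized posterior density (uniform prior) on a grid
theta = np.linspace(0.001, 0.999, 1000)
post = theta**y * (1 - theta)**(n - y)
post /= post.sum()                        # normalize so the grid weights sum to 1

# Grid sampling: draw posterior samples with probability proportional
# to the grid weights
rng = np.random.default_rng(42)
draws = rng.choice(theta, size=4000, p=post)

# Monte Carlo estimates of the posterior mean and a 90% central interval
print(draws.mean())                       # close to the exact mean (y+1)/(n+2)
print(np.quantile(draws, [0.05, 0.95]))
```

With a uniform prior the exact posterior is Beta(y+1, n-y+1), so the grid-based estimates can be checked against closed-form values; the bioassay example in Lecture 3 uses the same idea in two dimensions.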
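The R-hat diagnostic mentioned in Lecture 5.2 compares between-chain and within-chain variance. A simplified sketch in Python (basic potential scale reduction only, without the chain splitting and rank normalization of the currently recommended version; the simulated chains are made up for illustration):

```python
import numpy as np

def rhat(chains):
    """Basic potential scale reduction for an (m, n) array of m chains
    with n draws each. Simplified: no splitting or rank normalization."""
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)   # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # within-chain variance
    var_plus = (n - 1) / n * W + B / n        # pooled variance estimate
    return np.sqrt(var_plus / W)

rng = np.random.default_rng(0)
mixed = rng.normal(size=(4, 1000))        # four well-mixed chains: R-hat near 1
stuck = mixed + np.arange(4)[:, None]     # chains centered at different values
print(rhat(mixed), rhat(stuck))           # the second is well above 1
```

When the chains explore different regions, the between-chain term inflates the pooled variance estimate relative to W, pushing R-hat above 1 and flagging non-convergence.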
R and Python
We strongly recommend using R in the course, as there are more packages for Stan and statistical analysis in R. If you are already fluent in Python but not in R, using Python may be easier, but it can still be useful to also learn R. Unless you are already experienced and have figured out your preferred way to work with R, we recommend installing RStudio Desktop. See the FAQ for frequently asked questions about R problems in this course. The demo codes provide useful starting points for all the assignments.
- For learning R programming basics we recommend
- For learning basic and advanced plotting using R we recommend
Self study exercises
Great self-study BDA3 exercises for this course are listed below. Model solutions are also available for most of these.
- 1.1-1.4, 1.6-1.8 (model solutions for 1.1-1.6)
- 2.1-2.5, 2.8, 2.9, 2.14, 2.17, 2.22 (model solutions for 2.1-2.5, 2.7-2.13, 2.16, 2.17, 2.20, and 2.14 is in slides)
- 3.2, 3.3, 3.9 (model solutions for 3.1-3.3, 3.5, 3.9, 3.10)
- 4.2, 4.4, 4.6 (model solutions for 4.2-4.4, 4.6, 4.7, 4.9, 4.10)
- 5.1, 5.2 (model solutions for 5.3-5.5, 5.7-5.12)
- 6.1 (model solutions for 6.1, 6.5-6.7)
- 10.1, 10.2 (model solution for 10.4)
- 11.1 (model solution for 11.1)
The course material has been greatly improved by the previous and current course assistants (in alphabetical order): Michael Riis Andersen, Paul Bürkner, Alejandro Catalina, Akash Dakar, Kunal Ghosh, Joona Karjalainen, Juho Kokkala, Måns Magnusson, Janne Ojanen, Topi Paananen, Markus Paasiniemi, Juho Piironen, Jaakko Riihimäki, Eero Siivola, Tuomas Sivula, Teemu Säilynoja, Jarno Vanhatalo.
The web page has been made with rmarkdown’s site generator.