Bayesian Computations

Ph.D. in Economics, Statistics, and Data Science - University of Milano-Bicocca

Author: Tommaso Rigon (DEMS)

Bayesian Computations is a module of the PhD course in Bayesian Statistics. The module covers both theoretical and programming aspects.

Prerequisites

The course is modular: depending on the participants' background, different units will be covered. However, knowledge of the following topics is assumed:

  • Fundamentals of Bayesian statistics. Refer to Chapters 1-5 of Hoff (2009).
  • Monte Carlo integration. Refer to Chapter 3 of Robert and Casella (2009), or Chapter 4 of Hoff (2009); a minimal example is sketched after this list.
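
As a quick self-check of the Monte Carlo prerequisite, the sketch below approximates E[X^2] for X ~ N(0, 1) by averaging simulated draws and reports the associated Monte Carlo standard error. It is a minimal illustration, not part of the course material; all names are illustrative.

```r
# Monte Carlo approximation of E[X^2] for X ~ N(0, 1); the true value is 1
set.seed(123)
R <- 10^5                      # number of Monte Carlo draws
x <- rnorm(R)                  # simulate from the target distribution
estimate <- mean(x^2)          # Monte Carlo estimate of the integral
std_err <- sd(x^2) / sqrt(R)   # Monte Carlo standard error of the estimate
c(estimate = estimate, std_err = std_err)
```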

Preliminary references

Further preliminary material is available (in Italian) on the website of the course R for multivariate statistical analysis.

Teaching material

MCMC methods

  • Metropolis-Hastings and Gibbs sampling. Slides: Unit A.1. Code: Code A.1.
  • Rcpp & RcppArmadillo. Slides: Unit A.2. Code: Code A.2.

Advanced MCMC algorithms

  • Optimal scaling & adaptive Metropolis. Slides: Unit B.1. Code: Code B.1.
  • MALA algorithm & Hamiltonian Monte Carlo. Slides: Unit B.2. Code: Code B.2.
  • Homework: Homework 1.

Data augmentation

  • Missing data problems, Gibbs sampling and the EM algorithm. Slides: Unit C.1.
  • Data augmentation for probit and logit models. Slides: Unit C.2. Code: Code C.2, EM logistic tutorial.
  • Homework: Homework 2.

Approximate Bayesian methods

  • Laplace approximation, Variational Bayes, and Expectation Propagation. Slides: Unit D.1.
  • Approximate methods for probit and logit models. Slides: Unit D.2. Code: Code D.2.
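
To give a flavor of the algorithms in Unit A.1, the following minimal R sketch implements a random-walk Metropolis sampler for a standard normal target. The proposal scale and function names are illustrative assumptions, not the code distributed with the course.

```r
# Random-walk Metropolis sampler targeting the standard normal density
set.seed(123)
log_target <- function(x) dnorm(x, log = TRUE)  # log-density of N(0, 1)

rw_metropolis <- function(n_iter, x0, sd_prop) {
  out <- numeric(n_iter)
  x <- x0
  for (r in seq_len(n_iter)) {
    x_star <- rnorm(1, mean = x, sd = sd_prop)       # symmetric proposal
    log_alpha <- log_target(x_star) - log_target(x)  # log acceptance ratio
    if (log(runif(1)) < log_alpha) x <- x_star       # accept, otherwise stay
    out[r] <- x
  }
  out
}

draws <- rw_metropolis(n_iter = 10^4, x0 = 0, sd_prop = 2.4)
c(mean = mean(draws), sd = sd(draws))  # should be close to 0 and 1
```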

Essential references

  1. Albert, J. H. and Chib, S. (1993). Bayesian analysis of binary and polychotomous response data. Journal of the American Statistical Association, 88(422), 669–679.

  2. Blei, D. M., Kucukelbir, A. and McAuliffe, J. D. (2017). Variational inference: a review for statisticians. Journal of the American Statistical Association, 112(518), 859–877.

  3. Chopin, N. and Ridgway, J. (2017). Leave Pima Indians alone: binary regression as a benchmark for Bayesian computation. Statistical Science, 32(1), 64–87.

  4. Dempster, A. P., Laird, N. M. and Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society: Series B (Methodological), 39(1), 1–38.

  5. Dunson, D. B. and Johndrow, J. E. (2020). The Hastings algorithm at fifty. Biometrika, 107(1), 1–23.

  6. Durante, D. (2019). Conjugate Bayes for probit regression via unified skew-normal distributions. Biometrika, 106(4), 765–779.

  7. Durante, D. and Rigon, T. (2019). Conditionally conjugate mean-field variational Bayes for logistic models. Statistical Science, 34(3), 472–485.

  8. Eddelbuettel, D. and Balamuta, J. J. (2018). Extending R with C++: a brief introduction to Rcpp. The American Statistician, 72(1), 28–36.

  9. Hunter, D. R. and Lange, K. (2004). A tutorial on MM algorithms. The American Statistician, 58(1), 30–37.

  10. Neal, R. M. (2011). MCMC using Hamiltonian dynamics. In Handbook of Markov Chain Monte Carlo, CRC Press.

  11. Polson, N. G., Scott, J. G. and Windle, J. (2013). Bayesian inference for logistic models using Pólya–Gamma latent variables. Journal of the American Statistical Association, 108(504), 1339–1349.

  12. Roberts, G. O. and Rosenthal, J. S. (2001). Optimal scaling for various Metropolis-Hastings algorithms. Statistical Science, 16(4), 351–367.

  13. Roberts, G. O. and Rosenthal, J. S. (2009). Examples of adaptive MCMC. Journal of Computational and Graphical Statistics, 18(2), 349–367.