Videos

The following video motivates why computational probabilistic methods and probabilistic programming are an important part of modern Bayesian data analysis.

- Computational probabilistic modeling in 15mins

Short video clips on selected introductory topics are available in a Panopto folder and listed below.

- 1.1 Introduction to uncertainty and modelling
- 1.2 Introduction to the course contents
- 2.1 Observation model, likelihood, posterior and binomial model
- 2.2 Predictive distribution and benefit of integration
- 2.3 Priors and prior information

Fall 2019 lecture videos are in a Panopto folder and listed below.

- Lecture 2.1 and Lecture 2.2 on the basics of Bayesian inference: observation model, likelihood, posterior and binomial model, predictive distribution and benefit of integration, priors and prior information, and the one-parameter normal model (BDA3 Ch 1-2).
- Lecture 3 on multiparameter models: joint, marginal and conditional distributions, the normal model, the bioassay example, grid sampling and grid evaluation (BDA3 Ch 3).
- Lecture 4.1 on numerical issues, Monte Carlo, how many simulation draws are needed, and how many digits to report, and Lecture 4.2 on direct simulation, the curse of dimensionality, rejection sampling, and importance sampling (BDA3 Ch 10).
- Lecture 5.1 on Markov chain Monte Carlo, Gibbs sampling, and the Metropolis algorithm, and Lecture 5.2 on warm-up, convergence diagnostics, R-hat, and effective sample size (BDA3 Ch 11).
- Lecture 6.1 on HMC, NUTS, dynamic HMC, and HMC-specific convergence diagnostics, and Lecture 6.2 on probabilistic programming and Stan (BDA3 Ch 12 + extra material).
- Lecture 7.1 on hierarchical models, and Lecture 7.2 on exchangeability (BDA3 Ch 5).
- Project work info
- Lecture 8.1 on model checking, and Lecture 8.2 on cross-validation, part 1 (BDA3 Ch 6 + extra material).
- Lecture 9.1 on PSIS-LOO and K-fold-CV, Lecture 9.2 on model comparison and selection, and Lecture 9.3, an extra lecture on variable selection with projection predictive variable selection (extra material).
- Lecture 10.1 on decision analysis (BDA3 Ch 9).
- Project presentation info
- Lecture 11.1 on normal approximation (Laplace approximation), and Lecture 11.2 on large-sample theory and counterexamples (BDA3 Ch 4).
- Lecture 12.1 on frequency evaluation, hypothesis testing and variable selection, and Lecture 12.2, an overview of modeling data collection (BDA3 Ch 8), linear models (Ch 14-18), the lasso, the horseshoe, and Gaussian processes (Ch 21).
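The binomial model and grid evaluation covered in the clips and in Lectures 2-4 can be sketched in a few lines. This is not course material: the data (7 successes in 10 trials), the grid size, and the draw count below are made-up illustration values.

```python
import numpy as np

# Sketch: grid evaluation of the binomial model's posterior under a
# uniform prior, then Monte Carlo draws resampled from the grid.
# Data and settings are hypothetical, chosen only for illustration.
rng = np.random.default_rng(42)
y, n = 7, 10                                 # hypothetical observations
theta = np.linspace(0.001, 0.999, 1000)      # grid over the probability parameter

# Uniform prior times binomial likelihood, normalized over the grid
posterior = theta**y * (1 - theta)**(n - y)
posterior /= posterior.sum()

# Resample the grid with posterior weights to get simulation draws
draws = rng.choice(theta, size=4000, p=posterior)
print(draws.mean())                          # close to the Beta(8, 4) mean 2/3
print(np.percentile(draws, [5, 95]))         # 90% central posterior interval
```

With a conjugate Beta prior the exact posterior is available, so the grid result can be checked directly; the grid approach matters once the model has no closed form.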
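The Metropolis algorithm and the R-hat diagnostic from Lectures 5.1-5.2 admit a similarly small sketch. The standard-normal target, chain count, step size, and warm-up length are assumptions for illustration, and this is the basic (non-split) R-hat rather than the modern rank-normalized version.

```python
import numpy as np

# Sketch: random-walk Metropolis targeting a standard normal density,
# followed by the basic between/within-chain R-hat (BDA3 Ch 11 formula).
# All settings here are illustrative assumptions.
rng = np.random.default_rng(0)

def metropolis(n_draws, start, step=1.0):
    """Random-walk Metropolis for a standard normal target."""
    log_p = lambda x: -0.5 * x**2            # log density up to a constant
    x, out = start, np.empty(n_draws)
    for s in range(n_draws):
        prop = x + rng.normal(0.0, step)     # symmetric proposal
        if np.log(rng.uniform()) < log_p(prop) - log_p(x):
            x = prop                          # accept; otherwise keep x
        out[s] = x
    return out

# Four chains from dispersed starting points; discard warm-up halves
chains = np.stack([metropolis(2000, start)[1000:] for start in (-4, -2, 2, 4)])

# R-hat compares between-chain and within-chain variance
m, n = chains.shape
W = chains.var(axis=1, ddof=1).mean()        # within-chain variance
B = n * chains.mean(axis=1).var(ddof=1)      # between-chain variance
rhat = np.sqrt(((n - 1) / n * W + B / n) / W)
print(round(rhat, 2))                        # near 1 when the chains have mixed
```

Values of R-hat well above 1 signal that the chains have not yet mixed and more warm-up or draws are needed.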