Week 1
Week Learning Objectives
By the end of this module, you will be able to
- Navigate the course website and Blackboard site
- Identify the Slack channels relevant for the course
- Describe the historical origin of Bayesian statistics
- Identify components in research papers involving Bayesian analyses
- Knit a simple R Markdown file (see the sketch below)
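A minimal sketch of knitting from the R console (the file name hello.Rmd is hypothetical, and the rmarkdown package is assumed to be installed):

    # write a tiny R Markdown file (hypothetical file name)
    writeLines(c(
      "---",
      "title: 'Hello, Bayes'",
      "output: html_document",
      "---",
      "",
      "Today is `r Sys.Date()`."
    ), "hello.Rmd")
    # knit it to HTML (the same as pressing the Knit button in RStudio)
    rmarkdown::render("hello.Rmd")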
Task List
- Review the syllabus
- Review the resources (slides and note)
- Install/Update R and RStudio on your computer
- Attend the Tuesday and Thursday class meetings
- Complete the assigned readings
- Introduce yourself on the #introduction Slack channel (as part of HW 1)
- Complete Homework 1 (see instructions on Blackboard)
Slides
PDF version
Week 2
Week Learning Objectives
By the end of this module, you will be able to
- Explain the three axioms/properties of probability
- Describe the subjectivist interpretation of probability, and contrast it with the frequentist interpretation
- Explain the difference between probability mass and probability density
- Compute probability density using simulations (see the sketch below)
- Compute joint, marginal, and conditional probabilities with two variables
- Write an R function and use loops for repeating computations
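A minimal sketch of approximating a probability density by simulation, wrapped in a small R function with a loop (the standard normal here is just an illustrative choice):

    # approximate the density of a standard normal near x = 1:
    # P(1 - h/2 < X < 1 + h/2) / h for a small bin width h
    set.seed(123)
    sims <- rnorm(1e5)                               # simulated draws from the distribution
    h <- 0.1
    mean(sims > 1 - h / 2 & sims < 1 + h / 2) / h    # compare to dnorm(1)

    # a small function that repeats the computation over several points with a loop
    density_at <- function(x, sims, h = 0.1) {
      out <- numeric(length(x))
      for (i in seq_along(x)) {
        out[i] <- mean(abs(sims - x[i]) < h / 2) / h
      }
      out
    }
    density_at(c(-1, 0, 1), sims)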
Task List
- Complete the assigned readings
- Kruschke ch. 4
- Wickham & Grolemund ch. 7, 19.1-19.5
- Review the resources (slides and notes on probability and R basics)
- Attend the Tuesday and Thursday class meetings
- Complete Homework 2 (see instructions on Blackboard)
Slides
PDF version
Week 3
Week Learning Objectives
By the end of this module, you will be able to
- Derive Bayes’ rule from the definition of conditional probability
- Apply Bayes’ rule to obtain the posterior from the prior and data
- Explain what data-order invariance and exchangeability are
- Use grid approximation to obtain the posterior for a Bernoulli model (see the sketch below)
- Describe the influence of sample size and the prior on the posterior
- Use R to perform prior predictive checks
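A minimal sketch of grid approximation for a Bernoulli model, assuming hypothetical data of 7 successes in 10 trials and a uniform prior:

    theta <- seq(0, 1, length.out = 1001)     # grid for the Bernoulli parameter
    prior <- rep(1, length(theta))            # uniform prior (hypothetical choice)
    lik   <- theta^7 * (1 - theta)^3          # likelihood: 7 successes, 3 failures
    post  <- prior * lik
    post  <- post / sum(post)                 # normalize over the grid
    plot(theta, post, type = "l",
         xlab = "theta", ylab = "posterior (grid approximation)")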
Task List
- Complete the assigned readings
- Review the resources (slides and note)
- Attend the Tuesday and Thursday class meetings
- No homework this week, but you may work on Q1 and Q2 of Homework 3 (see instructions on Blackboard)
Slides
PDF version
Week 4
Week Learning Objectives
By the end of this module, you will be able to
- Apply the Bayesian workflow to analyze real data with a Bernoulli model
- Explain the idea of a conjugate prior (see the sketch below)
- Summarize the posterior distribution using simulations
- Apply Bayesian terminology in summarizing the posterior
- Use R to perform posterior predictive checks
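A minimal sketch of the conjugate Beta update for a Bernoulli model, with posterior summaries and a posterior predictive check by simulation (the data and the Beta(2, 2) prior are hypothetical):

    y <- 7; n <- 10                              # hypothetical data: 7 successes in 10 trials
    a <- 2; b <- 2                               # Beta(a, b) prior
    post_draws <- rbeta(4000, a + y, b + n - y)  # conjugacy: the posterior is also a Beta
    mean(post_draws)                             # posterior mean
    quantile(post_draws, c(.05, .95))            # 90% credible interval
    # posterior predictive check: replicate the number of successes
    y_rep <- rbinom(4000, size = n, prob = post_draws)
    mean(y_rep >= y)                             # how often replicated data reach the observed count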
Task List
- Complete the assigned readings
- Review the resources (slides and note)
- Attend the Tuesday and Thursday class meetings
- Complete Homework 3 (see instructions on Blackboard)
Slides
PDF version
Week 5
Week Learning Objectives
By the end of this module, you will be able to
- Explain what is unique about samples obtained using Markov chain Monte Carlo (MCMC)
- Explain why we need MCMC to approximate the posterior
- Describe when MCMC samples are representative and accurate for approximating the posterior
- Use R to perform convergence diagnostics for MCMC samples (see the sketch below)
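A minimal sketch of convergence diagnostics, using a toy random-walk Metropolis sampler and the R-hat/ESS functions from the posterior package (assumed to be installed):

    # toy random-walk Metropolis sampler targeting a standard normal
    metropolis <- function(n_iter, start) {
      draws <- numeric(n_iter)
      cur <- start
      for (i in seq_len(n_iter)) {
        prop <- cur + rnorm(1, sd = 0.5)       # random-walk proposal
        if (log(runif(1)) < dnorm(prop, log = TRUE) - dnorm(cur, log = TRUE)) {
          cur <- prop                          # accept
        }
        draws[i] <- cur
      }
      draws
    }
    set.seed(1)
    chains <- cbind(metropolis(2000, -5), metropolis(2000, 5))  # two chains, dispersed starts
    chains <- chains[-(1:500), ]               # discard warm-up iterations
    # diagnostics expect iterations in rows and chains in columns
    posterior::rhat(chains)                    # representative if close to 1
    posterior::ess_bulk(chains)                # effective sample size (accuracy)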
Task List
- Complete the assigned readings
- Review the resources (slides and note)
- Attend the Tuesday and Thursday class meetings
- Complete Homework 4 (see instructions on Blackboard)
Slides
PDF version
Week 6
Week Learning Objectives
By the end of this module, you will be able to
- Apply Gibbs sampling to summarize parameters of a normal model (see the sketch below)
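A minimal sketch of a Gibbs sampler for the mean and variance of a normal model with conjugate priors (the simulated data and the prior values are hypothetical):

    set.seed(2)
    y <- rnorm(30, mean = 5, sd = 2)           # hypothetical data
    n <- length(y)
    # hypothetical priors: mu ~ N(0, 10^2), sigma^2 ~ Inv-Gamma(1, 1)
    mu0 <- 0; tau0 <- 10; a0 <- 1; b0 <- 1
    n_iter <- 2000
    mu <- sigma2 <- numeric(n_iter)
    mu_cur <- mean(y); sig2_cur <- var(y)      # starting values
    for (i in seq_len(n_iter)) {
      # sample mu given sigma^2 and y
      prec <- 1 / tau0^2 + n / sig2_cur
      m    <- (mu0 / tau0^2 + sum(y) / sig2_cur) / prec
      mu_cur <- rnorm(1, m, sqrt(1 / prec))
      # sample sigma^2 given mu and y
      sig2_cur <- 1 / rgamma(1, a0 + n / 2, b0 + sum((y - mu_cur)^2) / 2)
      mu[i] <- mu_cur; sigma2[i] <- sig2_cur
    }
    c(mean(mu), mean(sqrt(sigma2)))            # posterior means of mu and sigma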
Task List
- Complete the assigned readings
- Review the resources (slides and note)
- Attend the Tuesday and Thursday class meetings
- Complete Homework 5 (see instructions on Blackboard)
Slides
PDF version
Week 7
Week Learning Objectives
By the end of this module, you will be able to
- Describe, conceptually, how the Hamiltonian Monte Carlo (HMC) algorithm achieves better efficiency by using gradients
- Explain how tuning the step size and the tree depth affects HMC
- Program a simple Bayesian model in Stan (see the sketch below)
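A minimal sketch of a simple Bernoulli model written in Stan and fit with HMC/NUTS from R (assumes the rstan package and Stan's newer array syntax; the data, prior, and tuning values are hypothetical; adapt_delta and max_treedepth are the step-size and tree-depth tuning knobs mentioned above):

    scode <- "
    data {
      int<lower=0> N;
      array[N] int<lower=0, upper=1> y;
    }
    parameters {
      real<lower=0, upper=1> theta;
    }
    model {
      theta ~ beta(2, 2);        // prior
      y ~ bernoulli(theta);      // likelihood
    }
    "
    fit <- rstan::stan(
      model_code = scode,
      data = list(N = 10, y = c(1, 1, 0, 1, 1, 1, 0, 1, 0, 1)),
      chains = 4, iter = 2000,
      control = list(adapt_delta = 0.9, max_treedepth = 12)  # HMC tuning
    )
    print(fit)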
Task List
- Complete the assigned readings
- Review the resources (slides and note)
- Attend the Tuesday and Thursday class meetings
Slides
PDF version
Week 8
Week Learning Objectives
By the end of this module, you will be able to
- Explain the logic of a hierarchical model
- Apply the binomial distribution to describe the sum of multiple Bernoulli trials
- Program a hierarchical binomial model in Stan (see the sketch below)
- Analyze secondary data using a hierarchical normal model (i.e., random-effect meta-analysis)
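A minimal sketch of a hierarchical binomial model in Stan, where group-specific probabilities share a common Beta distribution (the variable names and the gamma prior on the concentration are hypothetical; it can be fit the same way as in the Week 7 sketch):

    hier_binom <- "
    data {
      int<lower=1> J;                      // number of groups
      array[J] int<lower=0> n;             // trials in each group
      array[J] int<lower=0> y;             // successes in each group
    }
    parameters {
      real<lower=0, upper=1> mu;           // population mean probability
      real<lower=0> kappa;                 // concentration: how similar the groups are
      vector<lower=0, upper=1>[J] theta;   // group-specific probabilities
    }
    model {
      kappa ~ gamma(2, 0.1);
      theta ~ beta(mu * kappa, (1 - mu) * kappa);
      y ~ binomial(n, theta);
    }
    "
    # fit, e.g., with rstan::stan(model_code = hier_binom, data = list(J = ..., n = ..., y = ...))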
Task List
- Complete the assigned readings
- Review the resources (slides and note)
- Attend the Tuesday and Thursday class meetings
- Start thinking about the class project
Slides
PDF version
Lecture Videos
Hierarchical binomial
Hierarchical normal
Week 9
Week Learning Objectives
By the end of this module, you will be able to
- Conduct a Bayesian comparison of two groups (see the sketch below)
- Apply a t model for robust modeling
- Select an appropriate distribution for different kinds of data
- Conduct comparisons with hierarchical data (e.g., within-subject designs)
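A minimal sketch of comparing two groups with a robust Student's t model in brms (assumes brms is installed; the data frame dat with columns score and group, and the coefficient name groupB, are hypothetical):

    library(brms)
    fit_robust <- brm(
      score ~ group,             # group difference in means
      family = student(),        # t likelihood for robustness to outliers
      data = dat,
      chains = 4, iter = 2000
    )
    summary(fit_robust)
    hypothesis(fit_robust, "groupB > 0")   # P(group difference > 0); hypothetical coefficient name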
Task List
- Complete the assigned readings
- Review the resources (note)
- Attend class meetings
- Start thinking about the class project
Week 11
Week Learning Objectives
By the end of this module, you will be able to
- Describe the three components of the generalized linear model (GLM)
- Name examples of the GLM (e.g., linear regression, Poisson regression)
- Interpret the coefficients in a linear regression model
- Obtain posterior predictive distributions and checks
- Perform Bayesian regression with the R package brms (see the sketch below)
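A minimal sketch of a Bayesian linear regression with brms and a posterior predictive check (the data frame dat and the variables y and x are hypothetical):

    library(brms)
    fit_lm <- brm(y ~ x, data = dat, family = gaussian(),
                  chains = 4, iter = 2000)
    summary(fit_lm)            # posterior summaries of the regression coefficients
    pp_check(fit_lm)           # graphical posterior predictive check
    # another GLM instance, a Poisson regression, only swaps the family:
    # brm(count ~ x, data = dat, family = poisson())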
Task List
- Complete the assigned readings
- Review the resources (note)
- Attend class meetings
- Complete Homework 7 (see instructions on Blackboard)
- Schedule a meeting with the instructor for your prospectus (see sign-up link on Blackboard)
Slides
PDF version
Week 12–13
Week Learning Objectives
By the end of this module, you will be able to
- Draw a directed acyclic graph (DAG) to represent causal assumptions
- Use a DAG to guide analyses for obtaining causal effects
- Describe how randomization can remove potential confounders
- Explain how the back-door criterion can be used to identify a set of adjustment variables with nonexperimental data (see the sketch below)
- Perform a mediation analysis and interpret the results
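A minimal sketch of encoding causal assumptions in a DAG and applying the back-door criterion with the dagitty package (assumed installed; the variables X, Y, Z, and the mediator M are hypothetical):

    library(dagitty)
    dag <- dagitty("dag {
      X -> M -> Y
      X -> Y
      Z -> X
      Z -> Y
    }")
    # back-door criterion: what must be adjusted for the total effect of X on Y?
    adjustmentSets(dag, exposure = "X", outcome = "Y")                      # { Z }
    # for the direct effect (relevant to mediation), the mediator is also adjusted
    adjustmentSets(dag, exposure = "X", outcome = "Y", effect = "direct")   # { M, Z }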
Task List
- Complete the assigned readings
- Review the resources (note)
- Attend class meetings
- Complete Homework 8 (see instructions on Blackboard)
Slides
PDF version
Week 14
Week Learning Objectives
By the end of this module, you will be able to
- Describe the difference between subgroup analyses and an interaction model
- Interpret results from an interaction model using plots and posterior predictions
- Explain how information criteria approximate out-of-sample divergence from the “true” model
- Use WAIC and LOO-IC to compare models (see the sketch below)
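A minimal sketch of comparing a main-effects model and an interaction model with WAIC and LOO-IC in brms (the data frame dat and the predictors x1 and x2 are hypothetical):

    library(brms)
    fit_main <- brm(y ~ x1 + x2, data = dat)
    fit_int  <- brm(y ~ x1 * x2, data = dat)      # adds the x1:x2 interaction
    # criteria that approximate out-of-sample predictive accuracy
    fit_main <- add_criterion(fit_main, c("waic", "loo"))
    fit_int  <- add_criterion(fit_int,  c("waic", "loo"))
    loo_compare(fit_main, fit_int, criterion = "loo")   # differences in expected log predictive density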
Task List
- Complete the assigned readings
- Review the resources (notes on interaction and model comparison)
- Attend class meetings
- Complete Homework 9 (see instructions on Blackboard)
Slides
PDF version
Week 15
Week Learning Objectives
By the end of this module, you will be able to
- Use DAGs to describe the different missing data mechanisms
- Use the mi() syntax in brms to account for missing data based on a DAG (see the sketch below)
- Explain the Bayesian ideas underlying the technique of multiple imputation
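A minimal sketch of the mi() syntax in brms, assuming (hypothetically) that the predictor z has missing values and the DAG says x predicts both y and z:

    library(brms)
    # jointly model the outcome y and the partially missing predictor z
    bform <- bf(y ~ mi(z) + x) +
      bf(z | mi() ~ x) +
      set_rescor(FALSE)
    fit_mi <- brm(bform, data = dat, chains = 4, iter = 2000)
    summary(fit_mi)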
Task List
- Complete the assigned readings
- Review the resources (note)
- Attend class meetings
- Complete Homework 10 (see instructions on Blackboard)
- Prepare your final project/paper