The Bayesian approach to statistical inference reverses the traditional (frequentist) viewpoint: the observed data are treated as fixed, and the model parameters are treated as unknown random quantities, rather than the other way around. Interest in and application of Bayesian methods have exploded in recent decades, facilitated by advances in computational power. We begin with an introduction to Bayes’ Theorem, the theoretical underpinning of Bayesian statistics which dates back to the 1700s, and the concepts of prior and posterior distributions, conjugacy, and closed-form Bayesian inference. Building on this, we introduce modern computational approaches to Bayesian inference, including Markov chain Monte Carlo (MCMC), Metropolis-Hastings sampling, and the theory underlying these simple and powerful methods. Students will become comfortable with modern software tools for MCMC through a variety of applied hierarchical modeling examples, and will use R for all statistical computing.
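To give a taste of the Metropolis-Hastings sampling mentioned above, here is a minimal sketch of a random-walk Metropolis sampler (a special case of Metropolis-Hastings with a symmetric proposal). The course itself uses R; this illustration is in Python purely for concreteness. The toy model, data (7 successes in 10 trials), and all function names are illustrative assumptions, not course material; the target is chosen so the posterior is known in closed form (Beta(8, 4) under a uniform prior), which lets us check the sampler against the exact answer.

```python
import math
import random

def metropolis_hastings(log_post, init, n_iter=20000, step=0.5, seed=0):
    """Random-walk Metropolis sampler: a special case of Metropolis-Hastings
    with a symmetric Gaussian proposal. log_post is the unnormalized
    log posterior, so normalizing constants cancel in the acceptance ratio."""
    rng = random.Random(seed)
    x = init
    lp = log_post(x)
    samples = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)      # symmetric random-walk proposal
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio), on the log scale
        # for numerical stability.
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy posterior: binomial success probability theta with a uniform prior,
# after observing 7 successes in 10 trials. In closed form this is
# Beta(8, 4), so MCMC is overkill here -- which is exactly why it makes a
# good check: the sampler's mean should approach 8 / 12 = 2/3.
def log_post(theta):
    if not 0.0 < theta < 1.0:
        return -math.inf                     # zero prior mass outside (0, 1)
    return 7 * math.log(theta) + 3 * math.log(1.0 - theta)

draws = metropolis_hastings(log_post, init=0.5)
burned = draws[5000:]                        # discard burn-in draws
mean = sum(burned) / len(burned)
```

In practice one would also monitor the acceptance rate, run multiple chains, and check convergence diagnostics; those topics belong to the MCMC theory the course develops.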
Class Type: lecture
Requirements/Evaluation: Evaluation will be based on homework and exams.
Prerequisites: STAT 201 and MATH 150 and 250, or permission of instructor
Enrollment Preference: juniors and seniors, Statistics majors
Distributions: Division III; Quantitative/Formal Reasoning