1.4 Summary

We introduce Bayes’ rule for updating probabilistic statements, illustrated with humorous examples. We then study the three key probabilistic objects in Bayesian inference: the posterior distribution, the marginal likelihood, and the predictive density. The posterior distribution allows for inference about the parameters, the marginal likelihood is required for hypothesis testing and model selection via the Bayes factor, and the predictive density enables probabilistic predictions. We also review some sampling properties of Bayesian estimators and the process of Bayesian updating. All of these concepts are illustrated with a simple example in the R software. Finally, we introduce concepts from decision theory that can be used to report summary statistics that minimize the posterior expected loss.
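To make the three objects concrete, the following minimal R sketch computes them for a conjugate beta-binomial model. The prior hyperparameters and data below are hypothetical and chosen only for illustration; they are not necessarily those of the chapter's worked example.

```r
# Hypothetical data: y successes in n Bernoulli trials, theta ~ Beta(a0, b0) prior
a0 <- 1; b0 <- 1          # uniform prior on the success probability theta
n  <- 10; y <- 7          # observed data (illustrative numbers)

# Posterior distribution: Beta(a0 + y, b0 + n - y) -- used for inference about theta
a1 <- a0 + y; b1 <- b0 + n - y
post_mean <- a1 / (a1 + b1)
post_ci   <- qbeta(c(0.025, 0.975), a1, b1)   # 95% credible interval

# Marginal likelihood: p(y) = choose(n, y) * B(a1, b1) / B(a0, b0)
# (the quantity needed to form Bayes factors for hypothesis testing/model selection)
log_marg_lik <- lchoose(n, y) + lbeta(a1, b1) - lbeta(a0, b0)

# Predictive density for one future trial: P(y_new = 1 | y) equals the posterior mean
pred_success <- a1 / (a1 + b1)

post_mean; post_ci; exp(log_marg_lik); pred_success
```

Under squared-error loss, the posterior mean reported above is also the point estimate that minimizes the posterior expected loss, linking the computation back to the decision-theoretic summary statistics mentioned in this chapter.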