Chapter 4 Simulation methods

In the previous chapters, we focused on conjugate families, where the posterior and predictive distributions have standard analytical forms (e.g., normal, Student’s t, gamma, binomial, Poisson, etc.) and where the marginal likelihood has a closed-form analytical solution. However, realistic models are often more complex and lack such closed-form solutions.

To address this complexity, we rely on simulation (stochastic) methods to draw samples from posterior and predictive distributions. This chapter introduces posterior simulation, a cornerstone of Bayesian inference. We discuss Markov Chain Monte Carlo (MCMC) methods, including Gibbs sampling, Metropolis-Hastings, and Hamiltonian Monte Carlo, as well as other techniques like importance sampling and particle filtering (sequential Monte Carlo).
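As a first taste of these methods, the following is a minimal random-walk Metropolis-Hastings sketch; the standard-normal target, step size, and chain length are illustrative assumptions, not a model from this book:

```python
import math
import random

random.seed(0)

def log_post(theta):
    # Illustrative target: an unnormalized standard-normal log density.
    return -0.5 * theta * theta

def metropolis_hastings(n_draws, step=1.0, theta0=0.0):
    """Random-walk Metropolis: propose theta' ~ N(theta, step^2) and
    accept with probability min(1, p(theta') / p(theta))."""
    draws, theta = [], theta0
    lp = log_post(theta)
    for _ in range(n_draws):
        prop = theta + random.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop   # accept the proposal
        draws.append(theta)             # on rejection, repeat current state
    return draws

draws = metropolis_hastings(50_000)
```

After discarding an initial burn-in, the empirical mean and variance of the draws approximate those of the target distribution; the later sections of this chapter develop this idea formally.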

The simulation methods discussed in this chapter are the ones applied throughout this book. We do not cover deterministic methods, such as numerical integration (quadrature), or other simulation techniques, including discrete approximation, the probability integral transform, the method of composition, accept-reject sampling, and slice sampling. Although these methods are widely used, they play a smaller role in the approaches employed in this book.

For readers interested in these alternative methods, we recommend Robert and Casella (2010), Robert and Casella (2011), Greenberg (2012), and Gelman et al. (2021).

References

Gelman, Andrew, John B. Carlin, Hal S. Stern, David B. Dunson, Aki Vehtari, and Donald B. Rubin. 2021. Bayesian Data Analysis. Chapman & Hall/CRC.
Greenberg, Edward. 2012. Introduction to Bayesian Econometrics. Cambridge University Press.
Robert, Christian P., and George Casella. 2011. Monte Carlo Statistical Methods. 2nd ed. New York: Springer.
Robert, Christian P., and George Casella. 2010. Introducing Monte Carlo Methods with R. Vol. 18. New York: Springer.