4.6 Exercises
Example: The normal model with independent priors
Let’s recap the math test exercise in Chapter 3, this time assuming independent priors. Specifically, let \(Y_i \sim N(\mu, \sigma^2)\), where \(\mu \sim N(\mu_0, \sigma_0^2)\) and \(\sigma^2 \sim IG(\alpha_0 / 2, \delta_0 / 2)\). The sample size is 50, and the mean and standard deviation of the math scores are 102 and 10, respectively. We set \(\mu_0 = 100\), \(\sigma_0^2 = 100\), and \(\alpha_0 = \delta_0 = 0.001\).
- Find the posterior distribution of \(\mu\) and \(\sigma^2\).
- Program a Gibbs sampler algorithm and plot the histogram of the posterior draws of \(\mu\).
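As a starting point, here is a minimal Gibbs sampler sketch in Python (NumPy) for this model. It uses the standard conditional posteriors of the independent normal–inverse-gamma setting, and it assumes the reported standard deviation of 10 was computed with the \(n-1\) divisor, so \(\sum_i (y_i-\bar{y})^2 = (n-1)s^2\); the seed and number of iterations are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data summaries from the exercise: n = 50, mean 102, standard deviation 10
n, ybar, s = 50, 102.0, 10.0
sum_sq = (n - 1) * s**2            # assumes s uses the n-1 divisor

# Hyperparameters: mu ~ N(mu0, tau0sq), sigma^2 ~ IG(a0/2, d0/2)
mu0, tau0sq = 100.0, 100.0
a0, d0 = 0.001, 0.001

S, burn = 12000, 2000
mu_draws = np.empty(S)
sig2 = s**2                        # initial value for sigma^2
for i in range(S):
    # mu | sigma^2, y ~ N(mu_n, tau_n^2), combining prior and likelihood precisions
    tau_n2 = 1.0 / (1.0 / tau0sq + n / sig2)
    mu_n = tau_n2 * (mu0 / tau0sq + n * ybar / sig2)
    mu = rng.normal(mu_n, np.sqrt(tau_n2))
    # sigma^2 | mu, y ~ IG((a0 + n)/2, (d0 + sum_i (y_i - mu)^2)/2)
    shape = (a0 + n) / 2.0
    rate = (d0 + sum_sq + n * (ybar - mu) ** 2) / 2.0
    sig2 = 1.0 / rng.gamma(shape, 1.0 / rate)   # draw IG via reciprocal gamma
    mu_draws[i] = mu

post = mu_draws[burn:]
print(round(post.mean(), 1))       # posterior mean of mu, close to 102
```

The requested histogram can then be drawn with, e.g., `matplotlib.pyplot.hist(post, bins=40)`. Since the prior precision of \(\mu\) (0.01) is small relative to the likelihood precision (\(n/\sigma^2 \approx 0.5\)), the posterior mean sits close to the sample mean.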
Show that the Gibbs sampler is a particular case of the Metropolis-Hastings algorithm in which the acceptance probability is equal to 1.
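As a hint, write one Gibbs update of block \(\theta_j\) as a Metropolis-Hastings move with proposal \(q(\boldsymbol{\theta}^{*}\mid\boldsymbol{\theta}) = \pi(\theta_j^{*}\mid\boldsymbol{\theta}_{-j})\) and \(\boldsymbol{\theta}^{*}_{-j}=\boldsymbol{\theta}_{-j}\). The acceptance probability is then
\[
\alpha = \min\left\{1, \frac{\pi(\boldsymbol{\theta}^{*})\, q(\boldsymbol{\theta}\mid\boldsymbol{\theta}^{*})}{\pi(\boldsymbol{\theta})\, q(\boldsymbol{\theta}^{*}\mid\boldsymbol{\theta})}\right\}
= \min\left\{1, \frac{\pi(\theta_j^{*}\mid\boldsymbol{\theta}_{-j})\,\pi(\boldsymbol{\theta}_{-j})\,\pi(\theta_j\mid\boldsymbol{\theta}_{-j})}{\pi(\theta_j\mid\boldsymbol{\theta}_{-j})\,\pi(\boldsymbol{\theta}_{-j})\,\pi(\theta_j^{*}\mid\boldsymbol{\theta}_{-j})}\right\} = 1,
\]
since \(\pi(\boldsymbol{\theta}) = \pi(\theta_j\mid\boldsymbol{\theta}_{-j})\,\pi(\boldsymbol{\theta}_{-j})\) and all factors cancel.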
Implement a Metropolis-Hastings algorithm to sample from the Cauchy distribution, \(C(0,1)\), using as proposals a standard normal distribution and a Student’s t distribution with 5 degrees of freedom.
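One possible reading of this exercise is an independence Metropolis-Hastings sampler, where each proposal is drawn from the fixed proposal density rather than centered at the current state. The sketch below (NumPy/SciPy; seed and number of draws are arbitrary choices) implements that reading for both proposals.

```python
import numpy as np
from scipy import stats

def indep_mh(logtarget, prop_rvs, prop_logpdf, S, x0=0.0, seed=0):
    """Independence Metropolis-Hastings: propose from a fixed density."""
    rng = np.random.default_rng(seed)
    x = x0
    draws = np.empty(S)
    for i in range(S):
        xp = prop_rvs(rng)
        # Acceptance ratio includes the proposal density at both points
        log_alpha = (logtarget(xp) - logtarget(x)
                     + prop_logpdf(x) - prop_logpdf(xp))
        if np.log(rng.uniform()) < log_alpha:
            x = xp
        draws[i] = x
    return draws

cauchy_logpdf = stats.cauchy(0, 1).logpdf

# Proposal 1: standard normal (tails much lighter than the Cauchy target)
d1 = indep_mh(cauchy_logpdf,
              lambda rng: rng.normal(),
              stats.norm(0, 1).logpdf, 20000)

# Proposal 2: Student's t with 5 degrees of freedom (heavier tails)
d2 = indep_mh(cauchy_logpdf,
              lambda rng: rng.standard_t(5),
              stats.t(5).logpdf, 20000)

print(round(np.median(d1), 2), round(np.median(d2), 2))  # medians near 0
```

A point worth checking empirically: both proposals have lighter tails than the Cauchy target, so extreme target values are visited only via long stretches of repeated draws; the \(t_5\) proposal mitigates this relative to the normal one.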
This exercise was proposed by Professor Hedibert Freitas Lopes, who cites Thomas and Tu (2021) as a useful reference for an introduction to Hamiltonian Monte Carlo in R and the hmclearn package. The task is to obtain posterior draws using the Metropolis-Hastings and Hamiltonian Monte Carlo algorithms for the posterior distribution given by \[ \pi(\theta_1,\theta_2\mid \mathbf{y}) \propto \exp\left\{-\frac{1}{2}(\theta_1^2\theta_2^2 + \theta_1^2 + \theta_2^2 - 8\theta_1 - 8\theta_2)\right\}. \]
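A minimal NumPy sketch of both samplers for this target is given below. The random-walk scale, the leapfrog step size \(\epsilon\), and the number of leapfrog steps \(L\) are arbitrary tuning choices, not part of the exercise; the gradient follows directly from differentiating the log kernel.

```python
import numpy as np

def logpi(th):
    t1, t2 = th
    return -0.5 * (t1**2 * t2**2 + t1**2 + t2**2 - 8 * t1 - 8 * t2)

def grad_logpi(th):
    t1, t2 = th
    return np.array([-(t1 * t2**2 + t1 - 4.0),
                     -(t1**2 * t2 + t2 - 4.0)])

rng = np.random.default_rng(1)

# --- Random-walk Metropolis-Hastings ---
S = 20000
th = np.array([0.0, 0.0])
mh = np.empty((S, 2))
for i in range(S):
    prop = th + rng.normal(scale=0.8, size=2)   # tuning scale: a choice
    if np.log(rng.uniform()) < logpi(prop) - logpi(th):
        th = prop
    mh[i] = th

# --- Hamiltonian Monte Carlo with leapfrog integration ---
eps, L = 0.1, 20                                # step size / path length: choices
th = np.array([0.0, 0.0])
hmc = np.empty((S // 4, 2))
for i in range(S // 4):
    p = rng.normal(size=2)                      # fresh momentum
    th_new, p_new = th.copy(), p.copy()
    p_new = p_new + 0.5 * eps * grad_logpi(th_new)   # half momentum step
    for _ in range(L - 1):
        th_new = th_new + eps * p_new
        p_new = p_new + eps * grad_logpi(th_new)
    th_new = th_new + eps * p_new
    p_new = p_new + 0.5 * eps * grad_logpi(th_new)   # final half step
    # Accept with probability min(1, exp(H_old - H_new))
    log_acc = (logpi(th_new) - logpi(th)
               - 0.5 * (p_new @ p_new - p @ p))
    if np.log(rng.uniform()) < log_acc:
        th = th_new
    hmc[i] = th
```

The kernel is symmetric in \(\theta_1\) and \(\theta_2\), with mass concentrated along two "arms" (stationary points near \((1.39, 1.39)\) and the asymmetric pair around \((3.85, 0.26)\)), so comparing the two marginal means is a quick sanity check on mixing for both samplers.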
Ph.D. students’ sleeping hours continues
- Use importance sampling based on a \(U(0,1)\) proposal to obtain draws of \(\boldsymbol{\theta}\mid \mathbf{y} \sim B(16.55,39.57)\) in the Ph.D. students’ sleeping hours example in Chapter 3. Note that, based on Exercise 15 in Chapter 3, \(\alpha_0 = 1.44\) and \(\beta_0 = 2.57\).
- Compute the marginal likelihood in this context (Bernoulli-Beta model) and compare it to the result obtained using the Gelfand-Dey method.
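The first part can be sketched as follows (NumPy/SciPy): since the \(U(0,1)\) proposal density is 1 on the support, the importance weights are just the \(B(16.55, 39.57)\) density, and resampling the uniform draws with normalized weights yields approximate posterior draws. The closed-form Bernoulli-Beta marginal likelihood \(p(\mathbf{y}) = B(\alpha_n,\beta_n)/B(\alpha_0,\beta_0)\) gives the benchmark for the Gelfand-Dey comparison; the seed and sample sizes below are arbitrary choices.

```python
import numpy as np
from scipy import stats
from scipy.special import betaln

rng = np.random.default_rng(0)
a, b = 16.55, 39.57                 # posterior Beta parameters from the exercise

# Importance sampling: proposal U(0,1), weights w = target / proposal = target
theta = rng.uniform(size=50000)
w = stats.beta(a, b).pdf(theta)     # uniform proposal density equals 1 on (0,1)
w /= w.sum()                        # self-normalized weights

is_mean = np.sum(w * theta)                       # IS estimate of E[theta | y]
draws = theta[rng.choice(theta.size, size=5000, p=w)]   # resampled posterior draws

# Closed-form log marginal likelihood of the Bernoulli-Beta model,
# using the prior values alpha0 = 1.44, beta0 = 2.57 stated in the exercise
log_ml = betaln(a, b) - betaln(1.44, 2.57)

print(round(is_mean, 3), round(a / (a + b), 3))   # IS mean vs analytic mean
```

The Gelfand-Dey estimate computed from posterior draws should agree with `log_ml` up to Monte Carlo error, which is the comparison the exercise asks for.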
Example 4.1 in Gordon, Salmond, and Smith (1993) is \[\begin{align*} \theta_t &= 0.5\theta_{t-1} + 25\frac{\theta_{t-1}}{1+\theta_{t-1}^2} + 8 \cos(1.2t) + w_t, \\ y_t &= \frac{\theta_{t}^2}{20} + \nu_t, \end{align*}\] where \(\theta_0 \sim N(0, \sqrt{10})\), \(w_t \sim N(0, \sqrt{10})\), and \(\nu_t \sim N(0, 1)\).
- Perform sequential importance sampling in this example.
- Perform particle (Bootstrap) filtering in this example.
- Estimate the marginal likelihood in this example.
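The bootstrap filter for this model can be sketched as below (NumPy). Two assumptions are flagged in the code: the second argument of \(N(0,\sqrt{10})\) is read as the standard deviation, i.e., noise variances 10 and 1 as in Gordon, Salmond, and Smith (1993); and the data \(y_{1:T}\) are simulated from the model itself, with \(T\), the particle count, and the seed as arbitrary choices. The running sum of log-averaged weights is the standard particle estimate of the log marginal likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 50, 2000                        # horizon and particle count: choices
sw, sv = np.sqrt(10.0), 1.0            # sds: reads N(0, sqrt(10)) as sd sqrt(10)

# Simulate a data set from the state-space model
theta = rng.normal(0.0, np.sqrt(10.0))
y = np.empty(T)
for t in range(1, T + 1):
    theta = (0.5 * theta + 25 * theta / (1 + theta**2)
             + 8 * np.cos(1.2 * t) + rng.normal(0.0, sw))
    y[t - 1] = theta**2 / 20 + rng.normal(0.0, sv)

# Bootstrap particle filter with log marginal likelihood estimate
part = rng.normal(0.0, np.sqrt(10.0), size=N)
log_ml = 0.0
for t in range(1, T + 1):
    # Propagate particles through the state equation (bootstrap proposal)
    part = (0.5 * part + 25 * part / (1 + part**2)
            + 8 * np.cos(1.2 * t) + rng.normal(0.0, sw, size=N))
    # Weight by the observation density y_t | theta_t ~ N(theta_t^2 / 20, sv^2)
    logw = (-0.5 * ((y[t - 1] - part**2 / 20) / sv) ** 2
            - np.log(sv * np.sqrt(2 * np.pi)))
    m = logw.max()
    log_ml += m + np.log(np.mean(np.exp(logw - m)))   # log mean weight, stably
    w = np.exp(logw - m)
    part = part[rng.choice(N, size=N, p=w / w.sum())] # multinomial resampling

print(round(log_ml, 1))
```

Dropping the resampling step (and accumulating weights instead) turns this into the sequential importance sampler of the first item, whose weight degeneracy this model is famous for exposing.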
Ph.D. students’ sleeping hours continues
- Perform the diagnostics of Section 4.4 in this example.
- Check if there are errors in the posterior simulator of the Metropolis-Hastings algorithm in this example using the Geweke approach, with the first moments of \(p\) and \(p^2\) as test functions. Remember from Exercise 15 in Chapter 3 that the sample size is 52, and \(\alpha_0 = 1.22\) and \(\beta_0 = 2.57\).
- Run the Geweke test using \(\alpha_0 = 2.57\) and \(\beta_0 = 1.22\), and check the results.
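A compact Geweke-style check for this model can be sketched as follows (NumPy). The marginal-conditional simulator draws i.i.d. from the prior; the successive-conditional simulator alternates \(y\mid p\) and \(p\mid y\). As a stand-in for the exercise's Metropolis-Hastings step, the sketch draws \(p\mid y\) exactly from the conjugate Beta conditional; with a correct simulator, both sides share the prior as the marginal of \(p\), so the test-function means should agree. Seeds and \(M\) are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, a0, b0 = 52, 1.22, 2.57          # from Exercise 15, Chapter 3
M = 20000

# Marginal-conditional simulator: independent draws from the prior
p_mc = rng.beta(a0, b0, size=M)
g_mc = np.column_stack([p_mc, p_mc**2])   # test functions p and p^2

# Successive-conditional simulator: alternate y | p and p | y
# (exact conjugate draw used here; the exercise's MH step plays this role)
p = rng.beta(a0, b0)
g_sc = np.empty((M, 2))
for m in range(M):
    s = rng.binomial(n, p)                       # y | p, summed successes
    p = rng.beta(a0 + s, b0 + n - s)             # p | y
    g_sc[m] = p, p**2

# Geweke z-statistics; note the i.i.d. variance understates the standard
# error on the autocorrelated SC side, so a spectral (e.g., Newey-West)
# variance estimate should be used in a careful implementation
zs = []
for j in range(2):
    se = np.sqrt(g_mc[:, j].var() / M + g_sc[:, j].var() / M)
    zs.append((g_mc[:, j].mean() - g_sc[:, j].mean()) / se)
print([round(z, 2) for z in zs])
```

For the last item, rerunning with \(\alpha_0 = 2.57\) and \(\beta_0 = 1.22\) in only one of the two simulators mimics a coding error, and the z-statistics should then reject clearly.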