Supplementary Exercises 4.3#

Warning

This page contains solutions! We recommend attempting each problem before peeking.

1. Expectation and Variance#

Find \(E(X)\) and \(\text{Var}(X)\) if \(X\) has a density:

  1. \(f(x) = 3x^2\), \(0 \le x \le 1\);

  2. \(f(x) = \sin(x)\), \(0 \le x \le \pi/2\).
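Both answers can be sanity-checked numerically. A sketch using `scipy.integrate.quad` (the helper `moments` is just for this snippet):

```python
import numpy as np
from scipy.integrate import quad


def moments(f, a, b):
    """Return (E(X), Var(X)) for a density f supported on [a, b]."""
    ex = quad(lambda x: x * f(x), a, b)[0]
    ex2 = quad(lambda x: x**2 * f(x), a, b)[0]
    return ex, ex2 - ex**2


print(moments(lambda x: 3 * x**2, 0, 1))  # E(X) = 3/4, Var(X) = 3/80
print(moments(np.sin, 0, np.pi / 2))      # E(X) = 1,  Var(X) = pi - 3
```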

2. Uniform on Unit Circle#

Let the bivariate random variable \((X, Y)\) have density

\[f(x, y) = \frac{1}{\pi}, \quad x^2 + y^2 \le 1.\]
  1. Find \(f_{Y|X}(y | x)\).

  2. What is the conditional density \(f(y | x)\) for \(x = 0\)?
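A Monte Carlo sketch for intuition: if \(f_{Y|X}(y|x)\) is uniform on \([-\sqrt{1-x^2}, \sqrt{1-x^2}]\), the conditional variance at \(x_0\) should be \((1-x_0^2)/3\). The window width `0.01` and the seed are arbitrary choices for this sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
pts = rng.uniform(-1, 1, size=(n, 2))
pts = pts[np.sum(pts**2, axis=1) <= 1]          # rejection-sample onto the unit disk

x0 = 0.5
y_cond = pts[np.abs(pts[:, 0] - x0) < 0.01, 1]  # Y for points with X close to x0
print(np.var(y_cond), (1 - x0**2) / 3)          # should be close
```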

3. Fungi Spores#

Fungal spores are elliptical in shape; their size is described by the minor and major axes. Let \(X\) and \(Y\) denote the lengths of the minor and major axes (in micrometers):

\[X \sim Exp(1)\]
\[Y \mid X = x \sim f(y \mid x) = e^{-(y-x)} \quad \text{for } y \geq x \]

Find:

  1. joint distribution \(f_{(X,Y)}(x, y)\);

  2. \(P(Y \leq 2 | X = 1)\);

  3. \(E(Y | X = 1)\);

  4. constant \(c\) such that \(P(Y \leq c) = 0.5\);

  5. Are \(X\) and \(Y\) independent?
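The model simulates directly, which gives a quick check on parts 2-4 (a sketch: given \(X = x\), \(Y - x \sim Exp(1)\), so marginally \(Y\) is a sum of two independent \(Exp(1)\) variables, i.e. \(Ga(2, 1)\)):

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(2)
n = 1_000_000

# joint model: X ~ Exp(1), and Y = X + Exp(1) given X
x = rng.exponential(1, n)
y = x + rng.exponential(1, n)

# P(Y <= 2 | X = 1): conditionally Y - 1 ~ Exp(1), so the answer is 1 - e^{-1}
y_given_1 = 1 + rng.exponential(1, n)
print(np.mean(y_given_1 <= 2), 1 - np.exp(-1))

# E(Y | X = 1) = 1 + 1 = 2
print(np.mean(y_given_1))

# marginally Y ~ Ga(2, 1), so c with P(Y <= c) = 0.5 is the Ga(2, 1) median
print(np.median(y), gamma.ppf(0.5, a=2))
```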

4. Joint, Marginals and Conditionals #1#

Let random variables \(X\) and \(Y\) have joint density:

\[ f_{(X,Y)}(x, y) = \lambda^2 e^{-\lambda y}, \quad 0 \le x \le y \]

Find:

  1. marginal densities \(f_X(x)\) and \(f_Y(y)\);

  2. conditional densities of \(X\) given \(Y = y\), \(f_{X|Y}(x | y)\), and of \(Y\) given \(X = x\), \(f_{Y|X}(y|x)\);

  3. the CDF of \((X, Y)\), \(F_{(X,Y)}(x, y) = P(X \le x, Y \le y)\).

```python
import matplotlib.pyplot as plt
import numpy as np

x = 2
y = 3

fig, ax = plt.subplots()

# region of integration for F(x, y) when x < y: 0 <= s <= x, s <= t <= y
ax.fill_between([0, x], [0, x], y, color="lightcoral", alpha=0.3)

# x = y
x_vals = np.linspace(0, max(x, y), 400)
y_vals = x_vals
ax.plot(x_vals, y_vals, label="$x = y$", color="black")

# domain boundaries
ax.plot([x, x], [0, y], "g--")
ax.plot([0, x], [y, y], "g--")

ax.set_xlim(0, max(x, y) * 1.1)
ax.set_ylim(0, max(x, y) * 1.1)
ax.set_xlabel("$x$")
ax.set_ylabel("$y$")
ax.set_title("Integration Domain")
ax.legend()
plt.grid(True)

plt.savefig("../images/q4_plot1.png")
```
```python
# same plot for the case x > y (here x = 4, y = 2): the density is zero
# above the line y = x, so the region to fill is 0 <= s <= t <= y
x = 4
y = 2

fig, ax = plt.subplots()

# region to fill
ax.fill_between([0, y], [0, y], y, color="lightcoral", alpha=0.3)

# x = y
x_vals = np.linspace(0, max(x, y), 400)
y_vals = x_vals
ax.plot(x_vals, y_vals, label="$x = y$", color="black")

# domain boundaries
ax.plot([x, x], [0, y], "g--")
ax.plot([0, x], [y, y], "g--")

ax.set_xlim(0, max(x, y) * 1.1)
ax.set_ylim(0, max(x, y) * 1.1)
ax.set_xlabel("$x$")
ax.set_ylabel("$y$")
ax.set_title("Integration Domain")
ax.legend()
plt.grid(True)

plt.savefig("../images/q4_plot2.png")
```
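A Monte Carlo check of the marginals (a sketch with \(\lambda = 1\)): integrating the joint density gives \(f_Y(y) = \lambda^2 y e^{-\lambda y}\), a \(Ga(2, \lambda)\), with \(X \mid Y = y \sim U(0, y)\); sampling that way, \(X\) should behave like \(Exp(\lambda)\):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n = 1.0, 1_000_000

y = rng.gamma(shape=2, scale=1 / lam, size=n)  # f_Y: Ga(2, lam)
x = rng.uniform(0, y)                          # f_{X|Y}: uniform on (0, y)

print(np.mean(x), 1 / lam)    # Exp(lam) mean
print(np.var(x), 1 / lam**2)  # Exp(lam) variance
```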

5. Joint, Marginals and Conditionals #2#

Let random variables \(X\) and \(Y\) have joint density

\[ f_{(X,Y)}(x, y) = Cxy^2, \quad x \ge 0, \quad y \ge 0, \quad x + y \le 1 \]
  1. Show that

\[ C = 60, \quad f_X(x) = 20x(1 - x)^3, \quad 0 \le x \le 1 \]

and

\[ f_{Y|X}(y | x) = \frac{3y^2}{(1-x)^3}, \quad 0 \le y \le 1 - x. \]
  2. Find \(f_Y(y)\) and \(f_{X|Y}(x | y)\).

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 1, 1000)
y = 1 - x

fig, ax = plt.subplots()

ax.fill_between(x, 0, y, color="lightblue", label="Domain of $(X, Y)$")
ax.plot(x, y, color="blue", label="$y = 1 - x$")

ax.set_xlabel("x")
ax.set_ylabel("y")
ax.legend()
plt.savefig("../images/q5_plot.png")
```
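The constant and the marginal can be confirmed numerically with `scipy.integrate.dblquad`/`quad` (a sketch; the spot check at \(x = 0.3\) is arbitrary):

```python
from scipy.integrate import dblquad, quad

# total mass of 60xy^2 over the triangle x >= 0, y >= 0, x + y <= 1
total = dblquad(lambda y, x: 60 * x * y**2, 0, 1, 0, lambda x: 1 - x)[0]
print(total)  # ~1, consistent with C = 60

# marginal at x = 0.3: integrate y out and compare to 20x(1 - x)^3
fx = quad(lambda y: 60 * 0.3 * y**2, 0, 0.7)[0]
print(fx, 20 * 0.3 * 0.7**3)
```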

6. Joint, Marginals and Conditionals #3#

Let random variables \(X\) and \(Y\) have joint density

\[\begin{split} f_{X,Y}(x, y) = \begin{cases} \frac{1+xy}{4}, & -1 \le x \le 1, -1 \le y \le 1 \\ 0, & \text{else} \end{cases} \end{split}\]

Find:

  1. marginal densities \(f_X(x)\) and \(f_Y(y)\);

  2. conditional densities of \(X\) given \(Y = y\), \(f_{X|Y}(x | y)\), and of \(Y\) given \(X = x\), \(f_{Y|X}(y | x)\).

  3. Are \(X\) and \(Y\) independent?
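A quick numeric check (sketch): the marginals integrate to constants on \([-1, 1]\), while the joint density generally differs from their product; that already hints at part 3:

```python
from scipy.integrate import quad

f = lambda x, y: (1 + x * y) / 4

fx = quad(lambda y: f(0.5, y), -1, 1)[0]  # marginal of X evaluated at x = 0.5
fy = quad(lambda x: f(x, 0.5), -1, 1)[0]  # marginal of Y evaluated at y = 0.5
print(fx, fy)                             # both 1/2
print(f(0.5, 0.5), fx * fy)               # joint != product of marginals
```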

7. Joint, Marginals and Conditionals #4#

Let random variables \(X\) and \(Y\) have joint density \(f_{X,Y}(x, y)\) given by:

\[\begin{split} f_{X,Y}(x, y) = \begin{cases} C(x^2 - y^2)e^{-x} & \text{for } 0 \leq x < \infty, -x \leq y \leq x \\ 0 & \text{else} \end{cases} \end{split}\]

Find

  1. constant \(C\).

\[ 1 = \int_{0}^{\infty} \int_{-x}^{x} C (x^2 - y^2) e^{-x} \, dy \, dx = \frac{4C}{3} \int_{0}^{\infty} x^3 e^{-x} \, dx = \frac{4C}{3} \times \Gamma(4) = 8C. \]

Thus, \(C = \frac{1}{8}\). (Recall, \(\Gamma(n) = (n − 1)!\))

  2. marginal densities \(f_X(x)\) and \(f_Y(y)\).

\[f_X(x) = \frac{1}{6} x^3 e^{-x} = \frac{x^{4-1}}{\Gamma(4)} e^{-1 \cdot x}, x \geq 0\]

This is the Gamma \(Ga(4, 1)\) distribution. To find \(f_Y(y)\), integrate out \(x\): for \(y \geq 0\) the integration is \(\int_{y}^{\infty}\), and for \(y < 0\) it is \(\int_{-y}^{\infty}\). You will need integration by parts. When the smoke clears,

\[f_Y(y) = \frac{1}{4} (1 + |y|) e^{-|y|}, \quad -\infty < y < \infty.\]

  3. conditional distributions \(f_{X|Y}(x|y)\) and \(f_{Y|X}(y|x)\).

```python
import matplotlib.pyplot as plt
import numpy as np

# x values (the domain extends to infinity, but this is enough to get an idea)
x_vals = np.linspace(0, 5, 100)

# upper and lower bounds for y based on x
y_upper = x_vals
y_lower = -x_vals

plt.fill_between(x_vals, y_lower, y_upper, color="blue", alpha=0.5)
plt.xlabel("X")
plt.ylabel("Y")
plt.title("Domain of $f_{X,Y}(x, y)$")
plt.grid(True)

plt.savefig("../images/q7_plot.png")
```
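A `quad`-based spot check of the marginal of \(Y\) against \(\frac{1}{4}(1 + |y|)e^{-|y|}\), the result the integration by parts should produce (a numeric sketch; the test points are arbitrary):

```python
import numpy as np
from scipy.integrate import quad


def f_Y_numeric(y):
    # integrate the joint density over x in [|y|, infinity)
    return quad(lambda x: (x**2 - y**2) * np.exp(-x) / 8, abs(y), np.inf)[0]


for y in (-2.0, -0.5, 0.0, 1.0, 3.0):
    print(y, f_Y_numeric(y), (1 + abs(y)) * np.exp(-abs(y)) / 4)
```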

8. A 2D PDF#

Let

\[\begin{split} f(x, y) = \begin{cases} \frac{3}{8} (x^2 + 2xy), & 0 \le x \le 1, 0 \le y \le 2 \\ 0, & \text{else} \end{cases} \end{split}\]

be a bivariate PDF of a random vector \((X, Y)\).

  1. Show that \(f(x, y)\) is a density.

  2. Show that marginal distributions are

\[ f_X(x) = \frac{3}{2}x + \frac{3}{4}x^2 , 0 \le x \le 1 \]

and

\[ f_Y(y) = \frac{1}{8} + \frac{3}{8}y , 0 \le y \le 2 \]
  3. Show \(E(X) = \frac{11}{16}\) and \(E(Y) = \frac{5}{4}\).

  4. Show that conditional distributions are

\[ f(x | y) = \frac{3x(x + 2y)}{1 + 3y}, \quad 0 \le x \le 1, \text{ for any fixed } y \in [0, 2], \]
\[ f(y | x) = \frac{2y + x}{4 + 2x}, \quad 0 \le y \le 2, \text{ for any fixed } x \in [0, 1]. \]
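The claims in parts 1-3 can be verified numerically with `dblquad` (a sketch):

```python
from scipy.integrate import dblquad

f = lambda y, x: 3 / 8 * (x**2 + 2 * x * y)

mass = dblquad(f, 0, 1, 0, 2)[0]                       # should be 1
ex = dblquad(lambda y, x: x * f(y, x), 0, 1, 0, 2)[0]  # 11/16
ey = dblquad(lambda y, x: y * f(y, x), 0, 1, 0, 2)[0]  # 5/4
print(mass, ex, ey)
```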

9. In a Circle#

Let random variables \(X\) and \(Y\) have joint density

\[\begin{split} f_{X,Y}(x, y) = \begin{cases} C \sqrt{1 - x^2 - y^2}, & x^2 + y^2 \le 1 \\ 0, & \text{else} \end{cases} \end{split}\]

Find:

  1. constant \(C\), and

  2. marginal densities \(f_X(x)\) and \(f_Y(y)\).
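Numeric hint (a sketch): the double integral of \(\sqrt{1 - x^2 - y^2}\) over the unit disk is the volume of a unit hemisphere, \(2\pi/3\), which pins down \(C\):

```python
import numpy as np
from scipy.integrate import dblquad

vol = dblquad(
    lambda y, x: np.sqrt(np.clip(1 - x**2 - y**2, 0, None)),
    -1, 1,
    lambda x: -np.sqrt(1 - x**2),
    lambda x: np.sqrt(1 - x**2),
)[0]
print(vol, 2 * np.pi / 3)  # equal, so C = 3 / (2 pi)
```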

10. Weibull - Gamma#

Let \(X\) be distributed as Weibull \(Wei(r, \lambda)\), \(r > 0\) known, with a density

\[ f(x | \lambda) = \lambda r x^{r-1} \exp \{-\lambda x^r \}, \quad \lambda > 0, \, x \ge 0. \]

Assume a \(Ga(\alpha, \beta)\) prior on \(\lambda\),

\[ \pi(\lambda | \alpha, \beta) = \frac{\beta^\alpha \lambda^{\alpha-1}}{\Gamma(\alpha)} \exp\{-\beta \lambda\}, \quad \lambda \ge 0, \, \alpha, \beta > 0. \]
  1. Is the problem conjugate?

  2. Find the posterior distribution.

  3. If \(X_1 = 5\), \(X_2 = 3\), and \(X_3 = 1\) are observed from \(Wei(2, \lambda)\) and \(\pi(\lambda)\) is gamma \(Ga(3, 5)\), what is the posterior in this case? What are the mean and variance of \(\lambda\) according to this posterior distribution?
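A sketch for part 3, assuming conjugacy with posterior \(Ga(\alpha + n, \beta + \sum_i x_i^r)\) (which parts 1-2 should confirm):

```python
import numpy as np

x = np.array([5.0, 3.0, 1.0])
r, alpha, beta = 2, 3, 5      # Wei(2, lambda) data, Ga(3, 5) prior

a_post = alpha + len(x)       # 6
b_post = beta + np.sum(x**r)  # 5 + 35 = 40
print(a_post, b_post)         # posterior Ga(6, 40)
print(a_post / b_post)        # posterior mean 0.15
print(a_post / b_post**2)     # posterior variance 0.00375
```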

11. Normal - Inverse Gamma#

Let \(X \sim N(\mu, \sigma^2)\), with \(\mu\) known, and \(\sigma^2\) is inverse gamma \(IG(\alpha, \beta)\), with a density

\[ \pi(\sigma^2 | \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)} (\sigma^2)^{-\alpha-1} \exp \left( -\frac{\beta}{\sigma^2} \right), \quad \sigma^2 \ge 0, \, \alpha, \beta > 0. \]
  1. Is the problem conjugate?

  2. If the sample \((X_1, \ldots, X_n)\):

[4.3430, 0.5850, 4.4345, 6.2605, 3.9778, 4.4538, 2.3931, 3.5877, 1.4254, 5.0694]

comes from normal \(N(3, \sigma^2)\) distribution, find the posterior distribution of \(\sigma^2\) when the prior is \(IG(0.1, 0.1)\).
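A sketch for part 2, assuming conjugacy with posterior \(IG(\alpha + n/2, \beta + \frac{1}{2}\sum_i (x_i - \mu)^2)\) (which part 1 should confirm):

```python
import numpy as np

x = np.array([4.3430, 0.5850, 4.4345, 6.2605, 3.9778,
              4.4538, 2.3931, 3.5877, 1.4254, 5.0694])
mu, alpha, beta = 3, 0.1, 0.1

a_post = alpha + len(x) / 2
b_post = beta + np.sum((x - mu) ** 2) / 2
print(a_post, b_post)        # posterior IG(5.1, ~15.53)
print(b_post / (a_post - 1)) # posterior mean of sigma^2
```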

12. Uniform - Pareto#

Suppose \(X = (X_1, \ldots, X_n)\) is a sample from \(U(0, \theta)\). Let \(\theta\) have Pareto \(Pa (\theta_0, \alpha)\) distribution. Show that the posterior distribution is \(Pa (\max \{\theta_0, x_1, \ldots, x_n\}, \alpha + n)\).

13. Gamma - Inverse Gamma#

Let \(X \sim Ga \left(\frac{n}{2}, \frac{1}{2\theta}\right)\), so that \(X/\theta\) is \(\chi^2_n\). Let \(\theta \sim IG(\alpha, \beta)\). Show that the posterior is \(IG\left(\frac{n}{2} + \alpha, \frac{x}{2} + \beta\right)\).

14. Negative Binomial - Beta#

If \(X = (X_1, \ldots, X_n)\) is a sample from \(NB(m, \theta)\) and \(\theta \sim Be(\alpha, \beta)\), show that the posterior for \(\theta\) is beta \(Be \left(\alpha + mn, \beta + \sum_{i=1}^n x_i\right)\).

15. Horse-Kick Fatalities and Gamma Prior#

During the latter part of the nineteenth century, Prussian officials gathered information on the hazards that horses posed to cavalry soldiers. Fatal accidents for 10 cavalry corps were collected over a period of 20 years (Preussischen Statistik). The number of fatalities due to kicks, \(x\), was recorded for each year and each corps. The table below shows the distribution of \(x\) for these 200 “corps-years.”

| \(x\) (number of deaths) | Observed number of corps-years |
|---|---|
| 0 | 109 |
| 1 | 65 |
| 2 | 22 |
| 3 | 3 |
| 4 | 1 |
| Total | 200 |

Altogether there were 122 fatalities (\(109(0) + 65(1) + 22(2) + 3(3) + 1(4)\)), meaning that the observed fatality rate was \(\frac{122}{200}\), or 0.61 fatalities per corps-year. Von Bortkiewicz (1898) proposed a Poisson model for \(x\) with a rate \(\lambda\).

If the prior on \(\lambda\) is gamma \(Ga(5, 9)\), find the posterior for \(\lambda\). What are the posterior mean and variance?
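A sketch of the update, assuming the standard Poisson-gamma conjugate result (posterior \(Ga(\alpha + \sum_i x_i, \beta + n)\)):

```python
# Ga(5, 9) prior (rate parameterization), 122 fatalities over n = 200 corps-years
a_post = 5 + 122
b_post = 9 + 200
print(a_post, b_post)      # posterior Ga(127, 209)
print(a_post / b_post)     # posterior mean, ~0.6077
print(a_post / b_post**2)  # posterior variance, ~0.0029
```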

16. Counts of Alpha#

Rutherford and Geiger (Rutherford, Geiger and Bateman, Phil. Mag. 20, 698, 1910) provided counts of α-particles in their experiment. The counts, given in the table below, are well modeled by the Poisson distribution:

| Particle ct. | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | \(\ge\)12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Interval ct. | 57 | 203 | 383 | 525 | 532 | 408 | 273 | 139 | 45 | 27 | 10 | 4 | 2 |
| Theoretical | 54 | 210 | 407 | 525 | 508 | 394 | 254 | 140 | 68 | 29 | 11 | 4 | 6 |

  1. Find the sample size \(n\) and the sample mean \(\bar{X}\). In calculations for \(\bar{X}\), take \(\ge 12\) as 12.

  2. Elicit a gamma prior for \(\lambda\) with rate parameter \(\beta = 10\) and shape parameter \(\alpha\) selected so that the prior mean is 7.

  3. Find the Bayes estimator of \(\lambda\) using the prior from (2). Is the problem conjugate? Use the fact that \(\sum_{i=1}^n X_i \sim Poi(n\lambda)\).

  4. Write a script that simulates the Bayes estimator for \(\lambda\) and compare its output with the analytic solution from (3).
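One possible script for part 4 (a sketch): compute the conjugate posterior \(Ga(\alpha + \sum_i x_i, \beta + n)\) analytically, then approximate the Bayes estimator by averaging posterior draws:

```python
import numpy as np

counts = np.arange(13)  # 0, 1, ..., 12 (>= 12 treated as 12)
freqs = np.array([57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 2])

n = freqs.sum()                 # number of intervals
total = np.sum(counts * freqs)  # total particle count
print(n, total / n)             # sample size and sample mean

alpha, beta = 70, 10            # prior mean alpha / beta = 7
a_post, b_post = alpha + total, beta + n

rng = np.random.default_rng(16)
draws = rng.gamma(a_post, 1 / b_post, size=100_000)  # numpy scale = 1 / rate
print(draws.mean(), a_post / b_post)  # simulated vs analytic Bayes estimator
```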

17. Estimating Chemotherapy Response Rates#

An oncologist believes that \(90\%\) of cancer patients will respond to a new chemotherapy treatment and that it is unlikely that this proportion will be below \(80\%\). Elicit a beta prior that models the oncologist’s beliefs. During the trial, with 30 patients treated, 22 responded.

Hint: For elicitation of the prior use \(\mu = 0.9\), \(\mu − 2\sigma = 0.8\) and expressions for \(\mu\) and \(\sigma\) for beta.

  1. What are the likelihood and posterior distributions?

  2. Plot the prior, likelihood, and posterior in a single figure.

  3. Using a PPL, find the Bayes estimator of the response rate and compare it to the posterior mean.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import beta, binom

xx = np.linspace(0, 1, 1000)

prior_alpha, prior_beta = 31.5, 3.5
post_alpha, post_beta = 53.5, 11.5

prior_densities = beta.pdf(xx, prior_alpha, prior_beta)
posterior_densities = beta.pdf(xx, post_alpha, post_beta)

n = 30  # trials
k = 22  # successes
likelihood = binom.pmf(k, n, xx)

# normalize the likelihood for plotting on a comparable scale
likelihood /= np.max(likelihood)

# plot the prior, likelihood, and posterior
fig, axs = plt.subplots(3, 1, figsize=(12, 8))
fig.suptitle("Chemotherapy Response Rate")

# prior
prior_mean = prior_alpha / (prior_alpha + prior_beta)
axs[0].plot(xx, prior_densities, label="Prior")
axs[0].axvline(x=prior_mean, color="g", linestyle="--")
axs[0].text(
    x=prior_mean + 0.01,
    y=np.max(prior_densities) * 0.5,
    s=f"{prior_mean:.3f}",
    color="g",
    ha="left",
)
axs[0].text(
    x=0.02,
    y=np.max(prior_densities) * 0.85,
    s=f"Be({prior_alpha}, {prior_beta}) prior",
    color="black",
)
axs[0].set_ylabel("Density")
axs[0].set_xlim(0, 1)

# likelihood
mle = k / n
axs[1].plot(xx, likelihood, label="Likelihood")
axs[1].axvline(x=mle, color="g", linestyle="--")
axs[1].text(
    x=mle + 0.01,
    y=np.max(likelihood) * 0.5,
    s=f"{mle:.3f}",
    color="g",
    ha="left",
)
axs[1].text(
    x=0.02,
    y=np.max(likelihood) * 0.85,
    s=f"Bin(n={n}, x={k}, p)",
    color="black",
)
axs[1].set_ylabel("Density")
axs[1].set_xlim(0, 1)

# posterior
post_mean = post_alpha / (post_alpha + post_beta)
axs[2].plot(xx, posterior_densities, label="Posterior")
axs[2].axvline(x=post_mean, color="g", linestyle="--")
axs[2].text(
    x=post_mean + 0.01,
    y=np.max(posterior_densities) * 0.5,
    s=f"{post_mean:.3f}",
    color="g",
    ha="left",
)
axs[2].text(
    x=0.02,
    y=np.max(posterior_densities) * 0.85,
    s=f"Be({post_alpha}, {post_beta}) posterior",
    color="black",
)
axs[2].set_ylabel("Density")
axs[2].set_xlabel("Probability of Success")
axs[2].set_xlim(0, 1)

plt.savefig("../images/q17_plot.png")
```
```python
import pymc as pm
import arviz as az

n, k = 30, 22  # trials and successes, as in the plotting cell above

with pm.Model():
    prior_p = pm.Beta("p", 31.5, 3.5)
    pm.Binomial("likelihood", n=n, p=prior_p, observed=k)

    trace = pm.sample(3000)

az.summary(trace)
```
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [p]
100.00% [16000/16000 00:00<00:00 Sampling 4 chains, 0 divergences]
Sampling 4 chains for 1_000 tune and 3_000 draw iterations (4_000 + 12_000 draws total) took 1 seconds.
|   | mean | sd | hdi_3% | hdi_97% | mcse_mean | mcse_sd | ess_bulk | ess_tail | r_hat |
|---|------|----|--------|---------|-----------|---------|----------|----------|-------|
| p | 0.823 | 0.047 | 0.735 | 0.907 | 0.001 | 0.0 | 4950.0 | 7351.0 | 1.0 |

18. Jeremy and Variance from Single Observation#

Jeremy believes that his IQ test scores have a normal distribution with mean 110 and unknown variance \(\sigma^2\). He takes a test and scores \(X = 98\).

  1. Show that inverse gamma prior \(IG(\alpha, \beta)\) is the conjugate for \(\sigma^2\) if the observation \(X\) is normal \(N(\mu, \sigma^2)\) with \(\mu\) known. What is the posterior?

  2. Find a Bayes estimator of \(\sigma^2\) and its standard deviation in Jeremy’s model if the prior on \(\sigma^2\) is an inverse gamma \(IG(3, 100)\).

    • Hint: Random variable \(Y\) is said to have an inverse gamma \(IG(\alpha, \beta)\) distribution if its density is given by:

      \[ f(y | \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)} y^{-\alpha-1} \exp \left( -\frac{\beta}{y} \right), \quad \alpha, \beta > 0. \]

      The mean of \(Y\) is \(E(Y) = \frac{\beta}{\alpha-1}\), \(\alpha > 1\) and the variance is \(Var(Y) = \frac{\beta^2}{(\alpha-1)^2(\alpha-2)}\), \(\alpha > 2\).

  3. Use WinBUGS to solve this problem and compare the MCMC approximations with exact values from (2).

    • Hint: Express the likelihood in terms of precision \(\tau\) with gamma \(Ga(\alpha, \beta)\) prior, but then calculate and monitor \(\sigma^2 = \frac{1}{\tau}\).

```python
import pymc as pm
import arviz as az

x = 98
mu = 110

with pm.Model() as m:
    sigma2 = pm.InverseGamma("sigma2", 3, 100)

    pm.Normal("likelihood", mu, sigma=pm.math.sqrt(sigma2), observed=x)

    trace = pm.sample(10000)

az.summary(trace)
```
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [sigma2]
100.00% [44000/44000 00:01<00:00 Sampling 4 chains, 0 divergences]
Sampling 4 chains for 1_000 tune and 10_000 draw iterations (4_000 + 40_000 draws total) took 2 seconds.
|   | mean | sd | hdi_3% | hdi_97% | mcse_mean | mcse_sd | ess_bulk | ess_tail | r_hat |
|---|------|----|--------|---------|-----------|---------|----------|----------|-------|
| sigma2 | 69.07 | 58.186 | 15.007 | 149.345 | 0.577 | 0.408 | 13099.0 | 14720.0 | 1.0 |

19. Easy 2D#

Joint distribution for \(X\) and \(\Theta\) is

\[ f(x, \theta) = 2x, \quad 0 \le x \le 1, \, 0 \le \theta \le 1. \]

Find:

  1. Marginal distributions \(m_X(x)\) and \(m_\Theta(\theta)\).

  2. Conditional distributions \(f(x | \theta)\) and \(\pi(\theta | x)\).

  3. \(P(\Theta^2 \le X \le \Theta)\).

  4. \(P(X^2 \le \Theta \le X)\).

Hint: Because of the factorization theorem, \(\Theta\) and \(X\) are independent and you should obtain \(f(x | \theta) = m_X(x)\) and \(\pi(\theta | x) = m_\Theta(\theta)\).

For (3) and (4) conditioning is needed. For example, for random variables \(X\), \(Y\) and functions \(g_1\), \(g_2\):

\[\begin{align*} P(g_1(Y) \le X \le g_2(Y)) &= E \left[P(g_1(Y) \le X \le g_2(Y)) \mid Y\right] \\ &= \int P(g_1(y) \le X \le g_2(y)) f_Y(y) \, dy \\ &= \int \left[F_X(g_2(y)) - F_X(g_1(y))\right] f_Y(y) \, dy. \end{align*}\]
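The two probabilities can be checked by simulation (a sketch; since \(F_X(x) = x^2\), one can sample \(X = \sqrt{U}\) with \(U \sim U(0, 1)\)):

```python
import numpy as np

rng = np.random.default_rng(19)
n = 1_000_000

x = np.sqrt(rng.uniform(size=n))  # density 2x on [0, 1] via inverse CDF
theta = rng.uniform(size=n)       # independent of x

print(np.mean((theta**2 <= x) & (x <= theta)))  # analytic: 1/3 - 1/5 = 2/15
print(np.mean((x**2 <= theta) & (theta <= x)))  # analytic: 2(1/3 - 1/4) = 1/6
```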

20. Bayes and Bats#

By careful examination of sound and film records it is possible to measure the distance at which a bat first detects an insect. The measurements are modeled by a normal distribution \(N(\theta, 10^2)\), where \(\theta\) is the unknown mean distance (in cm).

Experts believe that the prior suitably expressing uncertainty about \(\theta\) is \(\theta \sim N(50, 10^2)\). Three measurements are obtained: 62, 52, and 68.

  1. Find the posterior distribution of \(\theta\) given these observations.

  2. Test the hypothesis, \(H_0\), that \(\theta \gt 50\) in a Bayesian fashion.

  3. What is the 95% credible set for \(\theta\)?

For 2 and 3, use the posterior obtained in 1.

```python
from scipy.optimize import fsolve
from scipy.stats import norm

posterior = norm(58, 5)
print(f"Probability of H_0: {1 - posterior.cdf(50):.4f}")  # 0.9452

alpha = 0.05
print(
    f"Equi-tailed credible set: ({posterior.ppf(alpha/2):.4f}, {posterior.ppf(1 - alpha/2):.4f})"
)  # 'Equi-tailed credible set: (48.2002, 67.7998)'

guess_lwr, guess_upr = 50, 70


def conditions(x, posterior, alpha):
    lwr, upr = x

    cond_1 = posterior.pdf(upr) - posterior.pdf(lwr)
    cond_2 = posterior.cdf(upr) - posterior.cdf(lwr) - (1 - alpha)

    return cond_1, cond_2


# since the posterior is symmetric, the HPD set should match the equi-tailed set
hpd = fsolve(conditions, (guess_lwr, guess_upr), args=(posterior, alpha))
print(f"HPD credible set: ({hpd[0]:.4f}, {hpd[1]:.4f})")  # (48.2002, 67.7998)
```