import arviz as az
import numpy as np
import pymc as pm
%load_ext lab_black
Counts of Alpha Particles*
From the Unit 4.3 supplementary exercises.
Warning
This page has the answer to a supplementary exercise! If you want to try the problem without seeing the answer, don’t read any further!
Rutherford and Geiger (Rutherford, Geiger and Bateman, Phil. Mag. 20, 698, 1910) provided counts of α-particles in their experiment. The counts, given in the table below, are well modeled by the Poisson distribution:
| Particle ct. | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | \(\ge 12\) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Interval ct. | 57 | 203 | 383 | 525 | 532 | 408 | 273 | 139 | 45 | 27 | 10 | 4 | 2 |
| Theoretical | 54 | 210 | 407 | 525 | 508 | 394 | 254 | 140 | 68 | 29 | 11 | 4 | 6 |
Find sample size n and sample mean \(\bar{X}\). In calculations for \(\bar{X}\) take ≥ 12 as 12.
The sample size is the sum of the interval counts, and the sample mean is then the total particle count divided by the sample size. I tracked down the original source of the data, which you can find on page 701 here. The X data here is the particle count, while y is the number of 7.5-second intervals in which that count was observed. The professor’s example cuts off at \(\ge 12\) particles, but the original data went to 14 particles.
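A quick sketch of that calculation, treating \(\ge 12\) as exactly 12 per the problem statement:

```python
import numpy as np

# Observed interval counts for particle counts 0..11 and >=12 (taken as 12)
freq = np.array([57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 2])
counts = np.arange(13)

n = freq.sum()                  # total number of 7.5-second intervals
total = (counts * freq).sum()   # total particles observed
x_bar = total / n               # sample mean

print(n, total, round(x_bar, 4))  # 2608 10094 3.8704
```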
Elicit a gamma prior for λ with rate parameter β = 10 and shape parameter α selected in such a way that the prior mean is 7.
The Gamma distribution mean is \(\frac{\alpha}{\beta}\), so we need \(\alpha = 70\) for the mean to equal 7.
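As a sanity check, we can verify the prior mean with SciPy. Note that SciPy parameterizes the gamma distribution by shape and scale, where scale is the reciprocal of the rate \(\beta\):

```python
from scipy.stats import gamma

alpha, beta = 70, 10
# SciPy uses shape a and scale = 1/rate, so Ga(70, 10) is gamma(a=70, scale=0.1)
prior = gamma(a=alpha, scale=1 / beta)
print(prior.mean())  # ≈ 7
```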
Find the Bayes estimator of λ using the prior from (1). Is the problem conjugate? Use the fact that \(\sum_{i=1}^n X_i \sim Poi(n\lambda)\).
The Bayes estimator we’re looking for is the mean of the posterior distribution. If our prior is \(Ga(70, 10)\) and our likelihood is \(Poi(n\lambda)\) with \(y = \sum_{i=1}^n X_i\), the posterior is proportional to:

\[\pi(\lambda \mid y) \propto (n\lambda)^{y} e^{-n\lambda} \cdot \lambda^{\alpha - 1} e^{-\beta\lambda} \propto \lambda^{y + \alpha - 1} e^{-(n + \beta)\lambda},\]

which is \(Gamma(y + \alpha, n + \beta)\). So the Bayes estimator is the posterior mean:

\[\hat{\lambda} = \frac{y + \alpha}{n + \beta}.\]
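Plugging in the data, the analytic estimator can be computed directly:

```python
import numpy as np

freq = np.array([57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 2])
counts = np.arange(13)

alpha, beta = 70, 10
y = (counts * freq).sum()  # sum of all X_i = 10094
n = freq.sum()             # number of intervals = 2608

bayes_est = (y + alpha) / (n + beta)
print(round(bayes_est, 4))  # 3.8824
```

This agrees closely with the posterior mean from the PyMC sampler below.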
Write a WinBUGS script that simulates the Bayes estimator for λ and compare its output with the analytic solution from (3).
PyMC version below.
freq = np.array([57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 2])
X = np.arange(13)
n = np.sum(freq)
sum_x = np.sum(X * freq)
α = 70
β = 10
with pm.Model() as m:
    λ = pm.Gamma("λ", alpha=α, beta=β)
    nlambda = n * λ
    pm.Poisson("y", nlambda, observed=sum_x)

    trace = pm.sample(2000)
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [λ]
Sampling 4 chains for 1_000 tune and 2_000 draw iterations (4_000 + 8_000 draws total) took 1 seconds.
az.summary(trace)
| | mean | sd | hdi_3% | hdi_97% | mcse_mean | mcse_sd | ess_bulk | ess_tail | r_hat |
|---|---|---|---|---|---|---|---|---|---|
| λ | 3.881 | 0.038 | 3.812 | 3.957 | 0.001 | 0.0 | 3413.0 | 4795.0 | 1.0 |
%load_ext watermark
%watermark -n -u -v -iv -p pytensor
Last updated: Wed Mar 22 2023
Python implementation: CPython
Python version : 3.11.0
IPython version : 8.9.0
pytensor: 2.10.1
arviz: 0.15.1
numpy: 1.24.2
pymc : 5.1.2