import arviz as az
import numpy as np
import pymc as pm

%load_ext lab_black

Counts of Alpha Particles*

From the Unit 4.3 supplementary exercises.

Warning

This page has the answer to a supplementary exercise! If you want to try the problem without seeing the answer, don’t read any further!

Rutherford and Geiger (Rutherford, Geiger and Bateman, Phil. Mag. 20, 698, 1910) provided counts of α-particles in their experiment. The counts, given in the table below, can be well-modeled by the Poisson distribution:

Particle ct. |  0 |   1 |   2 |   3 |   4 |   5 |   6 |   7 |  8 |  9 | 10 | 11 | ≥ 12
Interval ct. | 57 | 203 | 383 | 525 | 532 | 408 | 273 | 139 | 45 | 27 | 10 |  4 |    2
Theoretical  | 54 | 210 | 407 | 525 | 508 | 394 | 254 | 140 | 68 | 29 | 11 |  4 |    6
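As an aside, the Theoretical row appears to be the total number of intervals multiplied by Poisson probabilities evaluated at the sample mean. A minimal sketch (assuming scipy is available; PyMC already depends on it) that roughly reproduces the first twelve Theoretical entries:

from scipy.stats import poisson
import numpy as np

observed = np.array([57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 2])  # interval counts
n = observed.sum()  # 2608 intervals
lam = (np.arange(13) * observed).sum() / n  # sample mean, treating >= 12 as 12

expected = n * poisson.pmf(np.arange(12), lam)  # expected counts for 0 through 11 particles
print(np.round(expected))  # lands very close to the Theoretical row above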

  1. Find sample size n and sample mean \(\bar{X}\). In calculations for \(\bar{X}\) take ≥ 12 as 12.

The sample size is the total number of intervals, i.e., the sum of the interval counts. The sample mean is then the total particle count divided by the sample size. I tracked down the original source of the data, which you can find on page 701 here. The X data here is the particle count, while y is the number of 7.5-second intervals in which that count was observed. The professor’s example cuts off at \(\ge\) 12 particles, but the original data went to 14 particles.

\[\bar{X} = 10094/2608 = 3.8704\]
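As a quick check (a minimal sketch in plain Python, with the interval counts transcribed from the table and ≥ 12 treated as 12):

counts = [57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 2]
n = sum(counts)  # sample size: 2608 intervals
total_particles = sum(k * c for k, c in zip(range(13), counts))  # 10094
print(n, total_particles / n)  # 2608 3.8704...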
  2. Elicit a gamma prior for λ with rate parameter β = 10 and shape parameter α selected in such a way that the prior mean is 7.

The mean of the Gamma distribution is \(\frac{\alpha}{\beta}\), so we need \(\alpha = 70\) for the prior mean to equal 7.
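A one-line sanity check (a sketch assuming scipy is available; note that scipy parameterizes the gamma by scale = 1/rate):

from scipy.stats import gamma

print(gamma(a=70, scale=1 / 10).mean())  # 7.0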

  3. Find the Bayes estimator of λ using the prior from (2). Is the problem conjugate? Use the fact that \(\sum_{i=1}^n X_i \sim Poi(n\lambda)\).

The Bayes estimator we’re looking for is the mean of the posterior distribution. If our prior is \(Ga(70, 10)\) and our likelihood is \(y = \sum_{i=1}^n X_i \sim Poi(n\lambda)\), the posterior is proportional to:

\[\lambda^y e^{-n\lambda} \lambda^{\alpha -1}e^{-\beta \lambda}\]
\[\lambda^{y + \alpha -1} e^{-(n + \beta)\lambda}\]

which is \(Gamma(y + \alpha, n + \beta)\), so yes, the problem is conjugate. The Bayes estimator is:

\[\frac{y + \alpha}{n + \beta} = \frac{10094 + 70}{2608 + 10} = 3.8824\]
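A quick numeric check of this estimator (plain Python, using the totals from (1)):

sum_x, n = 10094, 2608  # total particle count and number of intervals
a, b = 70, 10  # prior shape and rate
print((sum_x + a) / (n + b))  # ≈ 3.8824, matching the value above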
  4. Write a WinBUGS script that simulates the Bayes estimator for λ and compare its output with the analytic solution from (3).

PyMC version below.

freq = np.array([57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 2])  # interval counts from the table
X = np.arange(13)  # particle counts 0 through 12 (>= 12 treated as 12)
n = np.sum(freq)  # sample size: 2608 intervals
sum_x = np.sum(X * freq)  # total particle count: 10094

α = 70
β = 10

with pm.Model() as m:
    λ = pm.Gamma("λ", alpha=α, beta=β)  # prior from (2), with mean α/β = 7
    nlambda = n * λ
    # the sum of the counts is Poisson-distributed with mean n * λ
    pm.Poisson("y", nlambda, observed=sum_x)

    trace = pm.sample(2000)
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [λ]
100.00% [12000/12000 00:00<00:00 Sampling 4 chains, 0 divergences]
Sampling 4 chains for 1_000 tune and 2_000 draw iterations (4_000 + 8_000 draws total) took 1 seconds.
az.summary(trace)

       mean     sd  hdi_3%  hdi_97%  mcse_mean  mcse_sd  ess_bulk  ess_tail  r_hat
λ     3.881  0.038   3.812    3.957      0.001      0.0    3413.0    4795.0    1.0

The sampled posterior mean of 3.881 agrees closely with the analytic Bayes estimator of 3.8824 from (3).
%load_ext watermark
%watermark -n -u -v -iv -p pytensor
Last updated: Wed Mar 22 2023

Python implementation: CPython
Python version       : 3.11.0
IPython version      : 8.9.0

pytensor: 2.10.1

arviz: 0.15.1
numpy: 1.24.2
pymc : 5.1.2