import pymc as pm
import numpy as np
import arviz as az
from pymc.math import invlogit
1. Lister#
Adapted from Unit 10: lister.odc.
Problem Statement#
Sir Joseph Lister (1827–1912), Professor of Surgery at Glasgow University, was influenced by Pasteur’s ideas. He discovered that wounds wrapped in bandages treated with carbolic acid (phenol) often did not become infected.
Here are Lister’s data on treating open fractures and amputations:
Period | Carbolic Acid Used | Results
---|---|---
1864–1866 | No | Treated 34 patients: 15 died, 19 recovered
1867–1870 | Yes | Treated 40 patients: 6 died, 34 recovered
Estimate and interpret the risk difference, risk ratio, and odds ratio.
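Before fitting anything, the raw 2×2 counts already give frequentist point estimates of the three measures. A minimal sketch, treating death as the "risk" event:

```python
# Sample failure (death) proportions from Lister's table
p_untreated = 15 / 34  # 1864-1866, no carbolic acid
p_treated = 6 / 40     # 1867-1870, carbolic acid

rd = p_untreated - p_treated       # risk difference
rr = p_untreated / p_treated       # risk ratio
odds_ratio = (15 / 19) / (6 / 34)  # odds of death, untreated vs. treated

print(f"RD = {rd:.3f}, RR = {rr:.3f}, OR = {odds_ratio:.3f}")
# RD = 0.291, RR = 2.941, OR = 4.474
```

These plug-in estimates are a useful sanity check against the posterior means reported further down.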
Notes#
The results are a bit different from the BUGS version. I’m not sure whether that comes from the slightly different prior (WishartBartlett vs. Wishart) or from something else I’ve missed.
n_untreated = 34
n_untreated_fail = 15
n_treated = 40
n_treated_fail = 6
mu0 = np.array([-0.5, -1.5])
S = np.array([[0.1, 0], [0, 0.1]])
S_init = np.array([[1, 0], [0, 1]])
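The prior mean `mu0 = (-0.5, -1.5)` lives on the logit scale; pushing it through the inverse logit shows which failure probabilities the prior is centered near. A quick check, independent of PyMC:

```python
import numpy as np

def invlogit(x):
    # inverse of the logit link: maps the real line to (0, 1)
    return 1 / (1 + np.exp(-x))

mu0 = np.array([-0.5, -1.5])
prior_p = invlogit(mu0)
print(prior_p.round(3))  # [0.378 0.182], untreated vs. treated failure probability
```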
with pm.Model() as m:
    # Wishart prior on the precision matrix of the bivariate logit means
    tau = pm.WishartBartlett("tau", S=S, nu=4, initval=S_init)
    sigma = pm.Deterministic("sigma", pm.math.matrix_inverse(tau))
    mu = pm.MvNormal("mu", mu=mu0, tau=tau, shape=2)

    # failure probabilities on the probability scale
    p1 = invlogit(mu[0])  # untreated
    p2 = invlogit(mu[1])  # treated

    pm.Binomial("y_untreated", p=p1, n=n_untreated, observed=n_untreated_fail)
    pm.Binomial("y_treated", p=p2, n=n_treated, observed=n_treated_fail)

    # risk difference, risk ratio, and odds ratio
    pm.Deterministic("rd", p1 - p2)
    pm.Deterministic("rr", p1 / p2)
    pm.Deterministic("or", (p1 / (1 - p1)) / (p2 / (1 - p2)))

    trace = pm.sample(3000, target_accept=0.95)
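As a rough cross-check on the sampler, independent flat Beta(1, 1) priors on each failure probability give conjugate Beta posteriors that can be drawn directly with NumPy. This is a simplified stand-in, not the bivariate-logit model above:

```python
import numpy as np

rng = np.random.default_rng(2023)
n_draws = 100_000

# Beta(1 + failures, 1 + recoveries) posteriors under flat Beta(1, 1) priors
p1 = rng.beta(1 + 15, 1 + 19, size=n_draws)  # untreated: 15 of 34 died
p2 = rng.beta(1 + 6, 1 + 34, size=n_draws)   # treated: 6 of 40 died

rd = p1 - p2
rr = p1 / p2
odds_ratio = (p1 / (1 - p1)) / (p2 / (1 - p2))

print(rd.mean(), rr.mean(), odds_ratio.mean())
```

The means land near the posterior summaries below, which suggests the Wishart prior is not driving the estimates.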
Added new variable tau_c to model diagonal of Wishart.
Added new variable tau_z to model off-diagonals of Wishart.
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [tau_c, tau_z, mu]
100.00% [16000/16000 00:07<00:00 Sampling 4 chains, 0 divergences]
Sampling 4 chains for 1_000 tune and 3_000 draw iterations (4_000 + 12_000 draws total) took 8 seconds.
az.summary(trace, var_names=["~tau"], filter_vars="like")
 | mean | sd | hdi_3% | hdi_97% | mcse_mean | mcse_sd | ess_bulk | ess_tail | r_hat
---|---|---|---|---|---|---|---|---|---
mu[0] | -0.253 | 0.349 | -0.896 | 0.408 | 0.003 | 0.003 | 14065.0 | 9108.0 | 1.0
mu[1] | -1.788 | 0.443 | -2.639 | -0.983 | 0.004 | 0.003 | 11314.0 | 7297.0 | 1.0
sigma[0, 0] | 43.124 | 837.373 | 0.576 | 86.710 | 8.108 | 5.733 | 8565.0 | 6194.0 | 1.0
sigma[0, 1] | 6.524 | 489.723 | -47.465 | 51.318 | 5.054 | 3.574 | 11237.0 | 7516.0 | 1.0
sigma[1, 0] | 6.524 | 489.723 | -47.465 | 51.318 | 5.054 | 3.574 | 11237.0 | 7516.0 | 1.0
sigma[1, 1] | 40.100 | 491.767 | 0.521 | 81.160 | 6.329 | 4.475 | 9202.0 | 4896.0 | 1.0
rd | 0.287 | 0.100 | 0.101 | 0.470 | 0.001 | 0.001 | 13492.0 | 8897.0 | 1.0
rr | 3.348 | 1.643 | 1.106 | 6.191 | 0.018 | 0.014 | 11531.0 | 7110.0 | 1.0
or | 5.469 | 3.534 | 1.105 | 11.696 | 0.037 | 0.028 | 12107.0 | 8051.0 | 1.0
%load_ext watermark
%watermark -n -u -v -iv -p pytensor
Last updated: Mon Nov 13 2023
Python implementation: CPython
Python version : 3.11.5
IPython version : 8.15.0
pytensor: 2.17.1
numpy: 1.25.2
pymc : 5.9.0
arviz: 0.16.1