{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"id": "f5199f38",
"metadata": {
"tags": [
"hide-output"
]
},
"outputs": [],
"source": [
"import arviz as az\n",
"import pymc as pm\n",
"from pymc.math import log, sqr\n",
"\n",
"%load_ext lab_black"
]
},
{
"cell_type": "markdown",
"id": "a3dab390",
"metadata": {},
"source": [
"# The Zero Trick and Custom Likelihoods*\n",
"\n",
"## Zero-trick Jeremy\n",
"\n",
"This introduces the \"zero trick\", which is a method for specifying custom likelihoods in BUGS. For a more detailed treatment of these methods, see {cite:t}```ntzoufras2009bayesian```, page 276, which is where I got this explanation. These tricks are unnecessary in PyMC since we can just define custom distributions directly, but they do seem to work.\n",
"\n",
"Adapted from [Unit 6: zerotrickjeremy.odc](https://raw.githubusercontent.com/areding/6420-pymc/main/original_examples/Codes4Unit6/zerotrickjeremy.odc)."
]
},
{
"cell_type": "markdown",
"id": "0fbf22cd-65e7-400e-9f60-0bc3ed9c3799",
"metadata": {},
"source": [
"Here's the model we're using:\n",
"\n",
"$$\n",
"\\begin{align*}\n",
"X_i \\mid \\theta &\\sim N(\\theta, 80) \\\\\n",
"\\theta &\\sim N(110, 120)\n",
"\\end{align*}\n",
"$$\n"
]
},
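{
"cell_type": "markdown",
"id": "direct-normal-sketch",
"metadata": {},
"source": [
"A minimal sketch of the direct version, for comparison with the zero-trick version developed below; it uses the single observation $y = 98$ and the variances from the model statement above:\n",
"\n",
"```python\n",
"import pymc as pm\n",
"\n",
"# direct version of the model, using the built-in normal distribution\n",
"with pm.Model():\n",
"    θ = pm.Normal(\"θ\", mu=110, sigma=120**0.5)  # θ ~ N(110, 120), variance parameterization\n",
"    pm.Normal(\"X\", mu=θ, sigma=80**0.5, observed=98)  # X | θ ~ N(θ, 80)\n",
"    trace = pm.sample(3000)\n",
"```"
]
},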
{
"cell_type": "markdown",
"id": "2268b05c-3242-4925-bf4c-0e9fcb95a756",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"Of course, BUGS can handle this model just fine. But, let's pretend for a moment that there is no built-in normal distribution. The zero trick takes advantages of some properties of the Poisson distribution to recreate an arbitrary likelihood.\n",
"\n",
"Given a log-likelihood of the form $l_i = \\log f(y; \\theta)$,\n",
"\n",
"$$\n",
"\\begin{align*}\n",
"f(y | \\theta) &= \\prod_{i=1}^{n} e^{l_i} && \\text{Log-likelihood to likelihood.}\\\\\n",
"&= \\prod_{i=1}^{n} \\frac{e^{-(-l_i)}(-l_i)^0}{0!} && \\text{Matching the form of the Poisson PMF.} \\\\\n",
"&= \\prod_{i=1}^{n} f_P(0; -l_i) && \\text{Poisson evaluated at zero with mean $-l_i$.}\n",
"\\end{align*}\n",
"$$"
]
},
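{
"cell_type": "markdown",
"id": "zero-trick-pmf-check",
"metadata": {},
"source": [
"A quick numerical check of that identity (a sketch; the log-likelihood value here is arbitrary):\n",
"\n",
"```python\n",
"import numpy as np\n",
"from scipy import stats\n",
"\n",
"l = -4.7  # an arbitrary log-likelihood value, so -l is a valid Poisson rate\n",
"direct = np.exp(l)  # e^{l_i}\n",
"poisson_at_zero = stats.poisson.pmf(0, -l)  # f_P(0; -l_i) = e^{-(-l_i)}\n",
"print(np.isclose(direct, poisson_at_zero))  # True\n",
"```"
]
},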
{
"cell_type": "markdown",
"id": "74d36843-3623-4f7b-bde4-5aec9975ed4f",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"–{cite:t}```ntzoufras2009bayesian``` page 276.\n",
"\n",
"But the rate, $\\lambda$, can't be negative. So we need to add a constant, C, to keep that from happening.\n",
" \n",
"The normal log-likelihood is:\n",
"\n",
"$$l_i = -0.5 \\log(2\\pi) - \\log(\\sigma^2) - \\frac{(y_i - \\mu_i)^2}{2\\sigma^2}$$\n",
"\n",
"But the constant terms won't affect the posterior.\n",
"\n",
"Here's the model in PyMC:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "b99d7cd4",
"metadata": {},
"outputs": [],
"source": [
"y = 98\n",
"μ = 110\n",
"σ = 80**0.5\n",
"τ = 120**0.5\n",
"C = 10000\n",
"\n",
"inits = {\"θ\": 100}"
]
},
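{
"cell_type": "markdown",
"id": "zero-trick-rate-sketch",
"metadata": {},
"source": [
"Under the zero trick, the Poisson rate for the single observation is $\\lambda = C - l$. A quick sketch of that calculation, using the constants just defined and the initial value $\\theta = 100$, to confirm that the shifted rate is positive:\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"θ = 100  # the initial value above\n",
"l = -np.log(σ) - (y - θ) ** 2 / (2 * σ**2)  # log-likelihood, dropping the -0.5*log(2*pi) constant\n",
"λ = C - l  # Poisson rate for the zero trick; adding C keeps it positive\n",
"print(λ)  # ≈ 10002.2\n",
"```"
]
},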
{
"cell_type": "code",
"execution_count": 3,
"id": "258031fa",
"metadata": {
"tags": [
"hide-output"
]
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Auto-assigning NUTS sampler...\n",
"Initializing NUTS using jitter+adapt_diag...\n",
"Multiprocess sampling (4 chains in 4 jobs)\n",
"NUTS: [θ]\n"
]
},
{
"data": {
"text/html": [
"\n",
"\n"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/html": [
"\n",
"