Question

If pymc implements the Metropolis-Hastings algorithm to draw samples from the posterior density over the parameters of interest, then in order to decide whether to move to the next state in the Markov chain it must be able to evaluate something proportional to the posterior density for any given parameter values.
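
For intuition, here is a minimal, generic sketch of a symmetric-proposal Metropolis step. It is not PyMC's actual internals, and metropolis_step, log_post, and proposal_sd are illustrative names. Working in log space makes it clear why something merely proportional to the posterior suffices: the normalizing constant cancels in the acceptance ratio.

import numpy as np

rng = np.random.default_rng()

def metropolis_step(theta, log_post, proposal_sd=1.0):
    # log_post may be unnormalized: the (log of the) normalizing
    # constant cancels in the difference below.
    proposal = theta + rng.normal(scale=proposal_sd, size=np.shape(theta))
    log_accept_ratio = log_post(proposal) - log_post(theta)
    if np.log(rng.uniform()) < log_accept_ratio:
        return proposal  # accept: move to the proposed state
    return theta         # reject: remain at the current state

With a symmetric proposal the Hastings correction term is zero, which is why only the difference of unnormalized log posteriors appears.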

The posterior density is proportional to the likelihood of the observed data times the prior density: p(θ | y) ∝ p(y | θ) p(θ).

How is each of these represented within pymc? How does it calculate these quantities from the model object?

I wonder if anyone can give me a high-level description of the approach or point me to where I can find it.


Solution

To represent the prior, you need an instance of the Stochastic class (this is PyMC2 terminology; the PyMC3 example below expresses the same idea inside a model context), which has two primary attributes:

value : the variable's current value
logp : the log probability of the variable's current value given the values of its parents

You can initialize a prior with the constructor of the distribution you are using (pm.Normal, pm.Uniform, and so on).
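
A minimal sketch, assuming the classic PyMC2 API that the Stochastic terminology comes from:

import pymc as pm  # classic PyMC2 API

# A Stochastic representing the prior b0 ~ Normal(mu=0, tau=1e-6)
b0 = pm.Normal("b0", mu=0, tau=1e-6)

print(b0.value)  # the variable's current value
print(b0.logp)   # log prior density evaluated at b0.value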

To represent the likelihood, you need a so-called data Stochastic: an instance of the Stochastic class whose observed flag is set to True. The value of this variable is fixed and it will not be sampled. Again, you can initialize the likelihood with the constructor of the distribution you are using (but don't forget to set the observed flag to True).
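
Again assuming the PyMC2 API, a sketch of a data Stochastic (p and y_obs are illustrative names):

import numpy as np
import pymc as pm  # classic PyMC2 API

y = np.array([0, 1, 0, 1, 1, 1])

p = pm.Uniform("p", lower=0, upper=1)  # prior on the success probability
# observed=True fixes the value, so y_obs is never sampled
y_obs = pm.Bernoulli("y_obs", p=p, value=y, observed=True)

print(y_obs.logp)  # log likelihood of the observed data given p's current value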

Say we have the following setup:

import pymc as pm
import numpy as np
import theano.tensor as t

x = np.array([1,2,3,4,5,6])
y = np.array([0,1,0,1,1,1])

We can run a simple logistic regression with the following:

with pm.Model() as model:
    # Priors
    b0 = pm.Normal("b0", mu=0, tau=1e-6)
    b1 = pm.Normal("b1", mu=0, tau=1e-6)
    # Likelihood: observed Bernoulli with a logistic link
    z = b0 + b1 * x
    yhat = pm.Bernoulli("yhat", p=1 / (1 + t.exp(-z)), observed=y)
    # Sample from the posterior
    trace = pm.sample(10000, step=pm.Metropolis())
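
To see the quantity a step method like Metropolis evaluates, the PyMC3 model object exposes the joint log density; model.logp and model.test_point are standard PyMC3 attributes:

# Continuing from the example above
point = model.test_point  # dict of starting values for the free variables b0 and b1
print(model.logp(point))  # log prior + log likelihood: the unnormalized log posterior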

Most of the above came from Chris Fonnesbeck's IPython notebook.

Licensed under: CC-BY-SA with attribution