The sprinkler example really just sets static (fixed) probability values. In this line:
p_G = mc.Lambda('p_G', lambda S=S, R=R: pl.where(S, pl.where(R, .99, .9), pl.where(R, .8, 0.)),
doc='Pr[G|S,R]')
To my understanding, we would need to learn one parameter for each combination of the parents' values. So if we want to learn P(Z|X,Y), then for each combination of values of X and Y we learn one parameter for Z. Say X and Y take boolean values and Z is a Bernoulli variable: for each value of (X,Y), i.e. (0,0), (0,1), (1,0), (1,1), we have one parameter, p1, p2, p3, p4 respectively. Z is then represented by four PyMC observed variables: Z1 with parameter p1, Z2 with parameter p2, Z3 with parameter p3, and Z4 with parameter p4. Thus:
P(Z=1|X=0,Y=0) is the MCMC-estimated posterior mean of p1 (a Bernoulli's parameter is the probability of the outcome 1),
P(Z=0|X=0,Y=0) = 1 - p1,
P(Z=1|X=0,Y=1) = p2, and so on....
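The scheme above can be sketched without running a sampler at all: with a Beta(1,1) prior on each pi and a Bernoulli likelihood, the posterior is conjugate, so the posterior mean (which MCMC would converge to) is available in closed form. This is only an illustration of the one-parameter-per-parent-configuration idea; the data and the "true" CPT below are made up:

```python
import numpy as np

# Hypothetical data: parents X, Y and child Z (all boolean).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=1000)
Y = rng.integers(0, 2, size=1000)
# True CPT used only to generate the synthetic data: P(Z=1 | X, Y).
true_p = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.7, (1, 1): 0.9}
Z = np.array([rng.random() < true_p[(x, y)] for x, y in zip(X, Y)])

# One Beta(1, 1) prior per parent configuration. With a Bernoulli
# likelihood the posterior is Beta(1 + #ones, 1 + #zeros), so the
# posterior mean of each pi is (1 + #ones) / (2 + #observations).
posterior_mean = {}
for x in (0, 1):
    for y in (0, 1):
        mask = (X == x) & (Y == y)     # rows where the parents take (x, y)
        ones = int(Z[mask].sum())      # count of Z = 1 in that slice
        posterior_mean[(x, y)] = (1 + ones) / (2 + int(mask.sum()))

for (x, y), p in sorted(posterior_mean.items()):
    print(f"P(Z=1 | X={x}, Y={y}) ~ {p:.2f}")
```

With enough data each estimate lands close to the corresponding entry of the true CPT, which is exactly what the four observed PyMC variables (Z1..Z4, one per slice of the data) would recover by sampling p1..p4.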
I have a related question here: How to use pymc to parameterize a probabilistic graphical model?