Simple Bayesian Network / Sprinkler example

Hi,

I’m trying to build a model where I can add subjective expert information to a usual machine learning model. To get a better understanding of how to do that with Edward, I wanted to implement the classic Sprinkler causal model from many Bayesian tutorials. However, I’m having trouble figuring out how to do it, and I hope someone here can help me.

Do I have to explicitly create a ConditionalDistribution class to create the graph model, or is there an easier way? Is creating the inference model much different from the BayesRegression example? Are there any examples of using Edward to build causal graph models?

Thanks!


Hi there! I was also looking for a way to build that model and do simple inference on it, for instance computing p(R=T | G=T).
The way I was able to make it work is this:

# imports matching the ed/edm/edi aliases used below
import tensorflow as tf
import edward as ed
import edward.models as edm
import edward.inferences as edi

# Classic sprinkler network: rain -> sprinkler, (rain, sprinkler) -> grass_wet.
rain = edm.Bernoulli(probs=0.2)
p_sprinkler = tf.where(tf.cast(rain, tf.bool), 0.01, 0.4)
sprinkler = edm.Bernoulli(probs=p_sprinkler)
p_grass_wet = tf.where(tf.cast(rain, tf.bool),
                       tf.where(tf.cast(sprinkler, tf.bool), 0.99, 0.8),
                       tf.where(tf.cast(sprinkler, tf.bool), 0.9, 0.00000001))
grass_wet = edm.Bernoulli(probs=p_grass_wet)

# Inference: approximate p(R=T | G=T) with a variational Bernoulli for rain.
q_rain = edm.Bernoulli(probs=tf.nn.sigmoid(tf.Variable(tf.random_normal([]))))
ed.get_session()
inf = edi.KLpq({rain: q_rain}, data={grass_wet: tf.constant(1, dtype=tf.int32)})
inf.run(n_samples=50)
print(q_rain.probs.eval())  # the exact posterior here is p(R=T | G=T) ~ 0.36

For the complete working code check:
https://notebooks.azure.com/janislavjankov/libraries/shared/html/edward-sprinkler.ipynb

Not sure if this is the correct way, but it works.
Another approach would be to just sample [rain, grass_wet] and compute the conditional probability by counting the results, essentially applying the conditional probability formula p(R=T | G=T) = p(R=T, G=T) / p(G=T) with the probabilities approximated through sampling. That approach doesn’t really use any of the inference algorithms Edward provides, though.
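A minimal sketch of that counting approach, reusing the model above; the sample count of 10000 is arbitrary and the Python loop is slow, but it keeps things simple:

import numpy as np

sess = ed.get_session()
# Each sess.run draws one joint sample from the model graph.
draws = np.array([sess.run([rain, grass_wet]) for _ in range(10000)])
r, g = draws[:, 0], draws[:, 1]
# p(R=T | G=T) ~ #(R=T and G=T) / #(G=T)
print(r[g == 1].mean())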


@janislavjankov, this is great, thanks!

Could you suggest a way to infer the distribution parameters given lots of training data? For example, I generate many samples of rain, sprinkler, and grass_wet and want to infer the probabilities of each node. I’m not entirely sure how to do that with Edward/TF. Do you have any suggestions?

Hi @Achtung, I’m glad you find this helpful. I’m not sure I understand your question correctly. If you want to find the conditional probabilities using a lot of sampled data, then directly applying the conditional probability formula with the approximated probabilities (i.e., the number of samples that were true over all the samples) should give you good results.
Perhaps if you provide a code sample, or the data and what you want to infer, I could help.
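For example, with a hypothetical samples array of shape (n, 3) holding [rain, sprinkler, grass_wet] rows (the variable name and column layout are just for illustration), the estimates are plain means:

import numpy as np

# samples: (n, 3) array of 0/1 with columns [rain, sprinkler, grass_wet] (assumed layout)
p_rain = samples[:, 0].mean()                                   # estimate of p(R=T)
p_sprinkler_given_rain = samples[samples[:, 0] == 1, 1].mean()  # estimate of p(S=T | R=T)
print(p_rain, p_sprinkler_given_rain)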

@janislavjankov, sure.

Given data like [{rain = 1, sprinkler = 1, grass_wet = 1}, {rain = 1, sprinkler = 0, grass_wet = 0}, …], how do we infer the probability of it raining?

In the code sample, the probability was hard-coded to 0.2. However, I want to be able to infer this parameter (and/or other parameters such as p_grass_wet), and in particular to base it on multiple observations. In the example you gave, there is a single observation that grass_wet = 1.

I attempted this earlier in another post (Simple Hierarchical Distribution), but I think I was quite confused in several respects. In particular, I wasn’t sure how to plug in a lot of data and do inference on it.

@Achtung I uploaded the notebook here: https://notebooks.azure.com/janislavjankov/libraries/shared/html/edward-sprinkler-inf.ipynb
Snippet for the inference part:

# From the linked notebook: `samples` holds the generated data (columns rain,
# sprinkler, grass_wet), N is the minibatch size, and prob_grass_wet /
# q_prob_grass_wet are the latent parameter and its variational approximation.
n_batches = samples.shape[0] // N
grass_wet_ph = tf.placeholder(tf.int32, shape=[N])
inf = edi.KLqp({prob_grass_wet: q_prob_grass_wet}, data={grass_wet: grass_wet_ph})
# scale the minibatch log-likelihood up to the full data set
inf.initialize(n_iter=100 * n_batches, n_samples=10, scale={grass_wet: n_batches})
sess.run(tf.global_variables_initializer())
for i in range(inf.n_iter):
    data_index = i * N % samples.shape[0]  # cycle through the data in minibatches
    info_dict = inf.update(feed_dict={grass_wet_ph: samples[data_index: data_index + N, 2]})
    inf.print_progress(info_dict)
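In case the notebook is unreachable, here is a minimal sketch of the pieces the snippet assumes; the uniform prior, the Beta parameterization of q_prob_grass_wet, and the shapes are my guesses, not necessarily what the notebook does:

prob_grass_wet = edm.Beta(1.0, 1.0)  # hypothetical uniform prior over p(G=T)
grass_wet = edm.Bernoulli(probs=tf.ones(N) * prob_grass_wet)  # one batch of N observations
q_prob_grass_wet = edm.Beta(tf.nn.softplus(tf.Variable(1.0)),  # trainable concentration1
                            tf.nn.softplus(tf.Variable(1.0)))  # trainable concentration0
sess = ed.get_session()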

I hope you can use this as a starting point.


@janislavjankov, this is awesome! Exactly what I was looking for. Thanks a lot!

The link is broken. Could you please share the code on GitHub or elsewhere? Thanks for your help.

Which inference algorithm is this using?
In other probabilistic programming libraries like Pyro you can set the algorithm and pass an optimizer, e.g.

optimizer = tf.train.AdamOptimizer(learning_rate=FLAGS.lr)

but in the example code I don’t see anything like that.
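For reference, Edward’s variational inference classes do accept a TensorFlow optimizer through initialize()/run(); a minimal sketch using the KLpq setup from earlier in the thread (the learning rate value is arbitrary):

optimizer = tf.train.AdamOptimizer(learning_rate=0.05)
inf = edi.KLpq({rain: q_rain}, data={grass_wet: tf.constant(1, dtype=tf.int32)})
inf.run(n_samples=50, optimizer=optimizer)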