Basics of Graphs / Flow Control


#1

Hello,
I have a couple of questions regarding flow control in TensorFlow / Edward.

  1. I define the following random variables:

    z = Poisson(tf.ones(1))
    x = Gamma(concentration=tf.ones(1), rate=z)
    y = Normal(loc=x + tf.ones(1), scale=0.005 * tf.ones(1))

After starting a session, I sample from them.

sess = ed.get_session()
[z.sample().eval(), x.sample().eval(), y.sample().eval()]

a) Every time I sample a variable, the entire graph is traversed up to that node and the value is returned, but the intermediate values are not stored. Is this correct?
b) Is there a way of sampling the entire graph once and reporting the value of every node for that particular sample?

  2. Now I redefine the model, introducing flow control. Depending on the value of a previous variable, I sample either from a Gamma distribution or from a Gaussian:

    z = Poisson(tf.ones(1))
    zp = tf.multiply(tf.constant(1, dtype=tf.float32), z)
    with tf.control_dependencies([zp]):
        comparison = tf.equal(zp, tf.zeros(1))
        with tf.control_dependencies([comparison]):
            def if_false():
                y = Gamma(concentration=tf.ones(1), rate=zp)
                return y
            def if_true():
                x = Normal(loc=zp, scale=0.005 * tf.ones(1))
                return x
        new_rv = tf.where(comparison, if_true(), if_false())
    

a) Is this the most compact way of writing this code?
b) In this case I can sample from z and new_rv, but not from the intermediate variables. Again, is there any way of accessing all the node values for a sample?

Thanks!


#2

Intermediate values are not stored unless you query them in the same session run. You can do so, for example, with

sess.run([z, x, y])
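To see why a single run yields a consistent joint sample, here is a minimal NumPy sketch of the same idea (plain Python rather than Edward; `sample_joint` is a hypothetical helper): one ancestral pass draws every node once and reports all of them, whereas three separate `eval()` calls traverse the graph three times and give three unrelated draws.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_joint():
    # One ancestral pass through z -> x -> y: each node is drawn once
    # and its value is reused by its children, so the three reported
    # values belong to the same joint sample -- the analogue of
    # fetching all of them in a single sess.run([z, x, y]).
    z = rng.poisson(lam=1.0)
    # Guard against z == 0 when forming the Gamma scale; this clamp is
    # an illustration detail, not part of the original model.
    x = rng.gamma(shape=1.0, scale=1.0 / max(z, 1e-6))
    y = rng.normal(loc=x + 1.0, scale=0.005)
    return z, x, y

z, x, y = sample_joint()  # all three values come from one traversal
```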

I don’t think you need the tf.control_dependencies. The following works:

z = Poisson(tf.ones(1))
zp = 1.0 * z
comparison = tf.equal(zp, tf.zeros(1))
def if_false():
  y = Gamma(concentration=tf.ones(1), rate=zp)
  return y
def if_true():
  x = Normal(loc=zp, scale=0.005 * tf.ones(1))
  return x
new_rv = tf.where(comparison, if_true(), if_false())

See the previous answer for the tensors you have access to. For values defined inside the body of a branch function, things are a bit more difficult: you'd have to expose them as output tensors in some way.
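One way to do that, sketched below in plain NumPy rather than TensorFlow (the `trace` dict and the name `sample_with_trace` are hypothetical), is to have each branch record its intermediate values in a structure that is returned alongside the selected value:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_with_trace():
    # Branching model in which each branch also records its
    # intermediate values, instead of keeping them local to the
    # branch function. In graph mode the analogue is returning the
    # intermediate tensors (or adding them to a collection) so they
    # can be fetched in the same session run.
    trace = {}
    z = rng.poisson(lam=1.0)
    trace["z"] = z
    if z == 0:
        value = rng.normal(loc=float(z), scale=0.005)
        trace["branch"] = "normal"
    else:
        value = rng.gamma(shape=1.0, scale=1.0 / z)
        trace["branch"] = "gamma"
    trace["new_rv"] = value
    return value, trace

value, trace = sample_with_trace()
```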

