Iterative estimators ("Bayes filters") in Edward?

The assumption is that X and y are random variables whose shapes are compatible with the values they're bound to. For linear models you don't actually need X to be a random variable. More concretely, it would be

X = tf.placeholder(tf.float32, [None, D])
y_ph = tf.placeholder(tf.float32, [None])

w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
y = Normal(loc=ed.dot(X, w) + b, scale=1.0)

# variational approximations to w and b
qw = Normal(loc=tf.Variable(tf.zeros(D)),
            scale=tf.nn.softplus(tf.Variable(tf.zeros(D))))
qb = Normal(loc=tf.Variable(tf.zeros(1)),
            scale=tf.nn.softplus(tf.Variable(tf.zeros(1))))

inference = ed.KLqp({w: qw, b: qb}, data={y: y_ph})

If you know the batch size, you can replace the Nones in the placeholders with that fixed value.

Then for each minibatch you would feed in

inference.update(feed_dict={X: X_batch, y_ph: y_batch})
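For intuition, here is a hypothetical pure-NumPy sketch of the exact recursive update that the minibatch loop above approximates. For a linear-Gaussian model the weight posterior is conjugate, so each incoming batch can update it in closed form, which is the simplest kind of "Bayes filter" over streaming data. This is not Edward API; the function name and the omission of the bias b are illustrative choices.

```python
import numpy as np

def recursive_update(mean, precision, X_batch, y_batch, noise_var=1.0):
    """One conjugate Bayesian update of N(mean, precision^-1) given a batch.

    Posterior precision accumulates X^T X / sigma^2; the mean is the
    precision-weighted combination of the prior mean and the new data.
    """
    new_precision = precision + X_batch.T @ X_batch / noise_var
    new_mean = np.linalg.solve(
        new_precision,
        precision @ mean + X_batch.T @ y_batch / noise_var,
    )
    return new_mean, new_precision

rng = np.random.default_rng(0)
D = 2
w_true = np.array([1.5, -0.5])
mean, precision = np.zeros(D), np.eye(D)  # N(0, I) prior, matching the model
for _ in range(50):
    X_batch = rng.normal(size=(20, D))
    y_batch = X_batch @ w_true + rng.normal(scale=1.0, size=20)
    mean, precision = recursive_update(mean, precision, X_batch, y_batch)
# after 50 batches, mean is a good estimate of w_true
```

The Edward version replaces this closed-form update with a gradient step on the variational parameters of qw and qb, which is what lets the same pattern work for non-conjugate models.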